HEWLETT-PACKARD ENTERPRISE (HPE) has unveiled what it claims is the 'world's largest single-memory computer' as part of its "The Machine" research project.
HPE claims the development represents the largest research and development programme in the company's history. The memory-driven computer, the company says, will provide the blueprint for computing in an era of big data, and for storing data in memory for fast retrieval.
The prototype contains 160 terabytes of memory, capable of storing about 160 million books. Based on the prototype, HPE believes that its architecture could be extended to an exabyte-scale single-memory system and, beyond that, to 4,096 yottabytes. For context, that is 250,000 times the volume of all data held in digital form today.
The technical specifications include 160TB of shared memory spread across 40 physical nodes, interconnected using a high-performance fabric protocol. The device itself runs Linux on ThunderX2, Cavium's flagship second-generation, dual-socket-capable, ARMv8-A workload-optimised system-on-a-chip.
The device was first aired in November as a proof-of-concept machine. The company says that, while it has no plans to sell it, it will incorporate the technologies developed for the project into future HPE products.
HPE CEO Meg Whitman suggested that big data, and computers with the memory and compute power to analyse large data sets in memory, had the potential to generate major technological breakthroughs.
"The secrets to the next great scientific breakthrough, industry-changing innovation, or life-altering technology hide in plain sight behind the mountains of data we create every day," said Whitman. "To realise this promise, we can't rely on the technologies of the past, we need a computer built for the 'big data era'."