HPE debut the latest prototype in The Machine research project

Hewlett Packard Enterprise have debuted the world’s biggest single-memory computer

Hailed as the biggest step so far in Hewlett Packard Labs’ research project The Machine, Hewlett Packard Enterprise are promising to accelerate enterprise computing into the cosmos with the world’s largest single-memory computer.

In a blog post, Kirk Bresniker, Chief Architect of The Machine, said:

“While computing technology has improved enormously since the Moon landing, the fundamental architecture underlying it all hasn’t actually changed much in the last 60 years. And that is quickly becoming a problem. As a computer engineer and researcher, this is the thing that keeps me up at night, the idea that our current technology won’t be able to deliver on our expectations for the future.”

The Machine represents a new architecture specially designed to tackle the challenges of big data computing. In the traditional model of computing, a large amount of storage is paired with a much smaller amount of memory. When working with large files, the necessary chunks are loaded one at a time into the computer’s memory so they can be worked on. With the advent of big data, however, the amount of information to be processed has outstripped our ability to shuttle it through memory this way.
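As a loose illustration (not HPE code), the difference can be sketched in Python: the conventional model streams a dataset through a small working buffer chunk by chunk, while a memory-driven approach would hold the entire dataset in memory and operate on it directly. File names and sizes here are hypothetical.

```python
# Illustrative sketch only -- path and chunk size are hypothetical.

CHUNK_SIZE = 64 * 1024 * 1024  # 64 MB buffer, far smaller than the dataset

def process_chunked(path):
    """Conventional model: stream the dataset through limited memory."""
    total = 0
    with open(path, "rb") as f:
        while chunk := f.read(CHUNK_SIZE):  # one chunk resident at a time
            total += sum(chunk)             # work on the current chunk only
    return total

def process_in_memory(data: bytes):
    """Memory-driven model: the whole dataset is already resident."""
    return sum(data)  # operate on everything at once, no staging I/O
```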

In response to this, HPE developed The Machine, which is built around a single gigantic bank of memory. The prototype holds 160 terabytes, enough to store over 160 million books. Entire datasets can be loaded directly into memory and analysed far more quickly and easily; for example, all of Facebook’s user data could be uploaded and worked on simultaneously. HPE are describing this revolution in data processing as Memory-Driven Computing.
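The book comparison checks out if you assume roughly one megabyte per digitised book, which is our assumption rather than a figure from HPE:

```python
# Rough capacity check -- the ~1 MB/book figure is an assumption.
TOTAL_MEMORY_BYTES = 160 * 10**12   # 160 terabytes
BYTES_PER_BOOK = 10**6              # ~1 MB per plain-text book

print(f"{TOTAL_MEMORY_BYTES // BYTES_PER_BOOK:,} books")  # 160,000,000 books
```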

Mass-Scale Encryption

With big data often holding sensitive information such as medical and financial records, security is a major concern. HPE Labs have made sure that the memory bank is fully encrypted at all times, and a hardware firewall provides additional protection.

Even though this prototype offers the largest single block of memory ever made available on one system, HPE have even greater goals in sight. The architecture developed for The Machine is designed to scale, eventually to pools holding many times more memory than exists worldwide right now. The modular setup also means that components such as processors and networking can be upgraded without affecting the rest of the system.

The system is composed of 40 individual nodes, each with its own processor and memory, joined by high-speed optical links in a fabric network that allows very fast communication between nodes. Using HPE’s custom software, The Machine appears to the end user as a single computer capable of enormous storage and calculation.
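HPE haven’t published the details of that software layer here, but the idea of presenting many nodes as one flat pool of memory can be sketched conceptually. All names, sizes, and structure below are our own invention, not HPE’s design:

```python
# Conceptual sketch of one address space spanning many nodes.
# Everything here is hypothetical, not HPE's implementation.

NODE_COUNT = 40
NODE_MEMORY = 4 * 10**12  # 4 TB per node -> 160 TB total

class Node:
    def __init__(self):
        self.memory = bytearray(1024)  # tiny stand-in for 4 TB of real memory

class Fabric:
    """Routes accesses in a global address space to the owning node."""
    def __init__(self):
        self.nodes = [Node() for _ in range(NODE_COUNT)]

    def read(self, global_addr: int) -> int:
        node_id, offset = divmod(global_addr, NODE_MEMORY)
        # On the real machine this hop travels over the optical fabric;
        # software makes it look like an ordinary memory access.
        return self.nodes[node_id].memory[offset % 1024]

fabric = Fabric()
print(fabric.read(5 * 10**12))  # address resolves to node 1, transparently
```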

HPE Labs

A computer like this cannot be built simply from off-the-shelf components and software, so HPE Labs have had to reinvent the computational model. Despite traditionally using Microsoft for their server operating systems, HPE switched to a specially optimised Linux-based OS. Instead of mainstream Intel processors, an ARM chip from Cavium was selected. Finally, so that programmers can make use of the vast memory available, new custom programming tools were developed.
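Those tools aren’t described in detail, but a rough analogy from today’s systems is memory mapping, where a program addresses data directly as if it were RAM instead of issuing explicit reads. The file name below is a placeholder:

```python
# Analogy only: POSIX-style memory mapping, not HPE's actual toolchain.
import mmap

with open("dataset.bin", "r+b") as f:        # "dataset.bin" is a placeholder
    with mmap.mmap(f.fileno(), 0) as mm:     # map the whole file into memory
        first = mm[0]                        # byte access, no read() call
        mm[0] = (first + 1) % 256            # in-place update, like ordinary RAM
```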

The Machine represents a bold new vision for HPE in the future of large-scale computation. Memory-driven computing could be a game changer in many different use cases, from supercomputing to intelligent edge devices such as connected cars, where crunching data en masse can make all the difference.

Back in December, during a live broadcast at HPE Discover, we spoke to Kirk Bresniker. While he doesn’t cover the latest announcement, he gives a great explanation of what The Machine means for the future of computational power.