Microsoft deploys world’s first underwater datacenter

Microsoft has carried out an innovative research project that put a working datacenter on the bottom of the ocean, leading the way toward more sustainable facilities with faster transmission speeds.

Dubbed Project Natick, the datacenter was sealed in a container and placed on the seafloor half a mile from the coast. The aim is to lower latency by bringing data closer to coastal populations, and the approach could open the door to wave- or tidal-powered facilities in future. As demand for the cloud grows year on year, so too does the demand on the submarine transmission networks that carry data internationally, and on the power these sites need to run and stay cool. With underwater deployment, Microsoft aims to address all of these problems at once.

Ben Cutler, the project manager at Microsoft Research, the division that focuses on special projects, said: “We take a big whack at big problems, on a short-term basis. We take a look at something from a new angle, a different perspective, with a willingness to challenge conventional wisdom.” So when a paper about putting datacenters in the water landed in front of Norm Whitaker, who heads special projects for Microsoft Research NExT, it caught his eye.

“We’re a small group, and we look at moonshot projects,” Whitaker says. The paper came out of ThinkWeek, an event that encourages employees to share ideas that could be transformative to the company. “As we started exploring the space, it started to make more and more sense. We had a mind-bending challenge, but also a chance to push boundaries.”


The initial idea for the project came from Microsoft employee Sean James, who, having served in the Navy for three years, had seen complex computing done underwater.

“What helped me bridge the gap between datacenters and underwater is that I’d seen how you can put sophisticated electronics under water, and keep it shielded from salt water. It goes through a very rigorous testing and design process. So I knew there was a way to do that,” said James.

Another potential win for the project is deployment time. Building the container took 90 days, which is a game changer for the industry when compared to the lengthy process of getting land-based facilities built, including procurement and construction. Thanks to an array of sensors, Microsoft Research was able to control the facility entirely remotely from its Redmond campus.

“The bottom line is that in one day this thing was deployed, hooked up and running. Then everyone is back here, controlling it remotely,” Whitaker says. “A wild ocean adventure turned out to be a regular day at the office.”

Despite initial reservations, Christian Belady, general manager for datacenter strategy, planning and development at Microsoft, is encouraged by the results of the test. “At first I was skeptical, with a lot of questions. What would it cost? How do we power it? How do we connect it? At the end of the day, though, I enjoy seeing people push limits,” Belady says. “The reality is that we always need to be pushing limits and trying things out. The learnings we get from this are invaluable and will in some way manifest in future designs.”

The team is currently planning the next stage of the project, aiming to deploy a container four times larger with twenty times the computing power.

(Photos: Microsoft)