Data centre infrastructure is facing a crisis.
As AI workloads push energy demands beyond 40kW per rack, the traditional cooling systems that have powered the digital economy for decades are reaching their physical limits.
A single large-scale data centre consumes as much electricity as 100,000 households, and traditional air cooling simply cannot keep pace with the thermal demands of next-generation AI hardware. Direct Liquid Cooling is quickly emerging as the answer, giving organisations around the globe and across industries the technological advantage they need to compete and grow in the coming years.
When Efficiency Becomes Survival
The thermal challenge created by AI workloads represents more than an operational headache – it hinders innovation. Systems like NVIDIA’s GB200 and GB300 architectures generate heat loads that air cooling was never designed to handle. Without adequate thermal management, these systems suffer higher failure rates and rising costs as operators try to do more with less.
Liquid cooling systems address this challenge by targeting heat at its source – directly at the chip level – rather than trying to manage it after it radiates into the surrounding environment. This approach isn’t just marginally better than air cooling; it’s fundamentally more effective. In fact, liquid-cooled data centres achieve energy savings of 30.6% under variable environmental temperatures and 42.7% under typical variable loads.
For facilities consuming electricity at the scale of small cities, these efficiency gains translate to millions in annual operational savings. More importantly, they make large-scale AI deployment economically viable for the companies that depend on it.
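To see how cooling-efficiency gains of this order reach millions per year, consider a rough back-of-envelope sketch. All inputs below – the 50 MW IT load, the PUE values and the electricity price – are hypothetical illustrations, not figures from this article:

```python
# Back-of-envelope estimate of annual savings from more efficient cooling.
# All numeric inputs are hypothetical, for illustration only.

def annual_cooling_savings(it_load_mw, air_pue, liquid_pue, price_per_kwh):
    """Estimate annual cost savings from lowering PUE via liquid cooling."""
    hours_per_year = 8760
    # Total facility energy at each PUE, in MWh/year
    air_energy = it_load_mw * air_pue * hours_per_year
    liquid_energy = it_load_mw * liquid_pue * hours_per_year
    saved_mwh = air_energy - liquid_energy
    return saved_mwh * 1000 * price_per_kwh  # MWh -> kWh -> dollars

# Hypothetical: 50 MW IT load, PUE 1.5 (air) vs 1.15 (liquid), $0.10/kWh
savings = annual_cooling_savings(50, 1.5, 1.15, 0.10)
print(f"Estimated annual savings: ${savings:,.0f}")  # ~ $15,330,000
```

Even under these simplified assumptions, a city-scale facility saves on the order of $15 million a year – consistent with the "millions in annual operational savings" noted above.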
Economics Are Inescapable
Global data centre power demand is predicted to grow 165% by 2030, requiring an estimated $6.7 trillion in investment to keep pace. Against this backdrop, the case for liquid cooling becomes inescapable.
For enterprises running AI workloads above 30-40kW rack densities, investments in liquid cooling systems pay for themselves through operational savings alone – often within months of deployment. Operators who make the shift see an immediate operational advantage while enabling higher-density compute deployments. By contrast, those who continue to rely on air-cooling infrastructure face growing inefficiencies precisely when energy costs are rising and AI compute demands are accelerating.
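A payback period measured in months follows directly from the arithmetic. As a minimal sketch – the retrofit cost and monthly savings below are hypothetical, not vendor figures:

```python
# Rough payback-period sketch for a liquid-cooling retrofit.
# Capex and savings figures are hypothetical, for illustration only.

def payback_months(capex, monthly_savings):
    """Months until cumulative operational savings cover the upfront cost."""
    return capex / monthly_savings

# Hypothetical: $2.4M retrofit offset by $300k/month in energy and
# density-related savings
print(f"Payback: {payback_months(2_400_000, 300_000):.0f} months")  # 8 months
```

The point of the sketch is the shape of the calculation, not the specific numbers: once monthly savings are a meaningful fraction of capex, payback lands well inside a year.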
Beyond Energy: The Hidden Operational Advantages
While energy efficiency drives initial liquid cooling adoption, the operational benefits extend beyond electricity bills. Traditional air-cooling systems rely heavily on water-intensive chillers and evaporative towers, creating substantial challenges in regions facing water scarcity or stringent environmental regulations.
On the other hand, liquid cooling systems use significantly less water than conventional cooling infrastructure. Some advanced systems operate as completely self-contained units requiring no external water supply. As water resources become increasingly constrained and environmental regulations tighten globally, liquid cooling becomes even more essential.
Liquid cooling systems also transform heat from an operational liability into a potential asset. Rather than expelling thermal energy into the atmosphere at great expense, liquid cooling systems can capture and redirect heat for secondary applications – from district heating systems to industrial processes. This heat recovery capability is already being implemented in Northern European data centres, where captured thermal energy helps heat nearby residential and commercial buildings.
Finally, the operational advantages continue at the facility level. Liquid cooling operates more quietly than traditional fan-based systems and occupies less physical space – critical benefits in urban environments with high real estate costs and strict noise ordinances.
Making the Transition
The convergence of escalating energy costs, mounting environmental pressures and exponential growth in AI workloads has made liquid cooling adoption inevitable. The only question is when data centre operators will make the switch.
The path forward for businesses running high-performance computing and AI infrastructure is clear: invest in liquid cooling now to capture immediate operational savings. Those that fail to act risk mounting inefficiencies as AI demands accelerate.
The data centres that will power the next generation of innovation are being built today. They’re constructed with liquid cooling at their core, not because it’s environmentally preferable, but because it’s the only infrastructure capable of handling what’s coming next.
About the Author
Vivek Swaminathan is Director of Products and Solutions, Intelligent Data Centre at Unisys. Unisys is a global technology solutions company that powers breakthroughs for the world’s leading organizations. Our solutions – cloud, AI, digital workplace, logistics and enterprise computing – help our clients challenge the status quo and unlock their full potential. To learn how we have been helping clients push what’s possible for more than 150 years, visit unisys.com.