COOL SERVER MAKER Iceotope announced today that Leeds University has become the first to run its fully immersed liquid-cooled server system. To find out how this unusual server system works, we spoke with the firm's CTO Peter Hopton who gave us more detail about the technology involved.
Unveiled at the Cebit trade show in Germany last March, Iceotope's aluminium server modules went into mass production a month or so later, taking advantage of an inert coolant liquid called Novec made by 3M.
Hopton explained that this liquid is the core technology in the servers: it is non-conductive, with a low dielectric constant, so electronics can be submerged in it.
"As far as the electronics are concerned, [the liquid] is no different to air," Hopton said.
He added that the firm also harnesses other properties of the liquid, including its high thermal expansivity: it expands and rises when heated, and contracts and falls when it cools.
"We harness that property to take all of the heat from the server; the inside of this blade is full up with liquid," Hopton explained.
"This takes all the heat off a server blade and uniformly distributes it across the cold plates, which is one side of the blade, so what we end up with is this large cold plate with the heat load dynamically distributed across it."
Another benefit of this is that the Novec liquid carries heat from the server's motherboard to the module's surface by natural convection, so there is no need for noisy fans to cool it down. Water is pumped to the top of the server, where it then runs down over the modules to a heat exchanger. Another water circuit then carries the heat away.
Hopton emphasised how the University of Leeds has taken advantage of this in a different way, using the heated water in domestic radiators to warm the labs over the winter season.
If one of the server blades failed, Hopton said that another would simply take over.
"Each blade is modular, so a blade can be pulled out and it won't affect the blade next to it," Hopton explained. "Blades all have ‘quick disconnects' on the back of them so they will disconnect any coolant flow to the blade when pulled out. If there's a fault with one blade, it won't affect another."
Because the servers don't breathe any air, there is no need for any air conditioning, making them about as noisy as a fridge, Hopton said.
"[The server] only needs some water going to it, which can be of any temperature," he added. "All we've done is limit the requirements of fans and eliminate the need for refrigeration, as the system doesn't need it."
As a result, Iceotope's servers cut cooling energy by up to 97 percent and overall power use by up to 20 percent, mainly thanks to the lack of fans.
The typical saving for one of its supercomputer customers is 56 percent compared with new state-of-the-art equipment, Hopton said.
These calculations were based on measurements taken on the servers at the University of Leeds, compared with industry research figures for power usage effectiveness (PUE) at datacentres within Iceotope's target market.
Aimed at two primary markets - university research computing and cloud computing datacentres - Iceotope said the price of its new submergible servers is completely dependent upon size and requirements.
As for the University of Leeds, it deployed the servers after two years of testing and claims the liquid cooling system uses 80 watts of power to cool clusters that draw 20kW.
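For context, those Leeds figures imply a tiny cooling overhead. The back-of-the-envelope sketch below is our own arithmetic, not Iceotope's published methodology, and the "partial PUE" counts only the cooling system's own draw, not the rest of the facility:

```python
# Rough estimate of cooling overhead from the figures quoted above:
# an 80 W liquid-cooling system serving a 20 kW cluster.

it_load_w = 20_000   # cluster IT load quoted in the article (20 kW)
cooling_w = 80       # cooling system's own power draw (80 W)

# Fraction of the IT load spent on cooling
overhead = cooling_w / it_load_w

# PUE-style ratio counting only cooling as overhead
partial_pue = (it_load_w + cooling_w) / it_load_w

print(f"Cooling overhead: {overhead:.2%}")   # → Cooling overhead: 0.40%
print(f"Partial PUE: {partial_pue:.3f}")     # → Partial PUE: 1.004
```

By comparison, industry surveys of air-cooled datacentres at the time typically reported PUE figures well above this, which is the gap Iceotope's efficiency claims rest on.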
According to Hopton, the firm has already generated a lengthy waiting list of companies looking to install its submergible servers. µ