Startup focuses on reliable, efficient cooling for computer servers

Timothy Shedd examines a computer equipped with his novel cooling system. Tubes circulate refrigerating fluid through a special heat exchanger (under the X-shaped structure) on the processor that is the biggest heat source in a computer. Credit: David Tenenbaum

In a dark, windy room on the top floor of Engineering Hall on the University of Wisconsin-Madison campus, racks of computers are processing information for a college that relies, like all technical fields, on massive computing power. The noise comes from multiple fans located inside each computer case and from the large air conditioner that drives currents through the room to remove waste heat from the processors.

Equipment and electricity for cooling are a major expense at big computer installations, and Timothy Shedd, an associate professor of mechanical engineering at UW-Madison, uses the room to demonstrate a system he has invented that can do the job more efficiently.

In Shedd's system, a pair of translucent plastic tubes enter each computer case. Upon close inspection, a stream of tiny bubbles in the fluid can be seen exiting the case.

Those bubbles—a gas phase of the liquid refrigerant that entered the computer—are removing heat from a single computer. For several reasons, the system is, roughly speaking, 10 times more efficient than the air-conditioning that dominates the server field.

Shedd has spent more than a decade studying and designing computer cooling systems, and he has started a spinoff business called Ebullient to commercialize his invention, which is covered by patents he's assigned to the Wisconsin Alumni Research Foundation.

Computing power degrades and then fails if chips get too hot, so cooling is a fundamental need in data centers—the warehouses full of racks of computers owned by Amazon, Google and lesser-known firms. Data centers are growing seven to 10 percent a year in the U.S., with the biggest growth in the Midwest, Shedd says.

Cooling costs at these "server farms" are about $2 billion a year, or about 10 percent of the capital cost, and just three companies make and install most of the big cooling systems, Shedd says. "Cooling can make up 50 percent of the annual operating cost, so the cost of cooling can quickly become larger than the capital cost of the computers themselves."

Shedd's technology has two key components: First, a plastic chamber attached to the processor absorbs heat at the point of creation. Second, a network of tubes and a pump carry the heat to the roof, where it is "dumped" to the atmosphere.

Although the system uses air-conditioning refrigerant, it omits the compressor, condenser and evaporator—three major parts of the air-conditioning systems used to cool server farms.

A few innovative computer-cooling systems use water to remove heat, Shedd says. And while water does transfer an immense amount of heat, it makes operators nervous. "If you get a few drops of water on a computer, you are done, so the companies selling that face an awful lot of resistance."

Instead, Shedd chose a refrigerant that would not damage the computer if it spills. Because the refrigerant carries less heat than water, he must ensure that the fluid boils in the chamber atop the chip and condenses back to liquid on the rooftop heat disperser. This "phase change" ramps up the transfer rate without threatening the processor.
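The advantage of boiling over simply warming a liquid can be sketched with a back-of-the-envelope comparison. The property values below are rough, illustrative assumptions (not figures from the article): water's specific heat is about 4186 J/(kg·K), and an R-245fa-class refrigerant has a latent heat of vaporization near 200 kJ/kg and a liquid density near 1340 kg/m³.

```python
# Rough comparison of sensible vs. latent heat transport at the same flow rate.
# Property values are approximate, illustrative assumptions.

CP_WATER = 4186.0     # J/(kg*K), specific heat of liquid water
H_FG_REFRIG = 200e3   # J/kg, assumed latent heat of an R-245fa-class refrigerant
RHO_REFRIG = 1340.0   # kg/m^3, assumed liquid density of the refrigerant

def sensible_watts(flow_lpm, rho, cp, delta_t):
    """Heat carried by warming a liquid stream by delta_t kelvin."""
    mass_flow = flow_lpm / 60.0 / 1000.0 * rho  # L/min -> kg/s
    return mass_flow * cp * delta_t

def latent_watts(flow_lpm, rho, h_fg):
    """Heat carried by fully boiling the liquid stream (phase change)."""
    mass_flow = flow_lpm / 60.0 / 1000.0 * rho  # L/min -> kg/s
    return mass_flow * h_fg

# 0.5 L/min of water warmed by 3 K, vs. 0.5 L/min of refrigerant boiled away:
print(sensible_watts(0.5, 1000.0, CP_WATER, 3.0))   # ~105 W
print(latent_watts(0.5, RHO_REFRIG, H_FG_REFRIG))   # ~2230 W
```

Under these assumptions, boiling moves roughly twenty times more heat than a 3-degree temperature rise at the same modest flow rate, which is why a phase-change loop can make do with thin tubes and a small pump.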

Already, his devices have been working for five months nonstop at a group of servers at the UW-Madison College of Engineering.

As data centers become even more numerous, their cooling requirements will only grow. Shedd estimates that his system can cut cooling costs by up to 90 percent, but first he's got to prove that it's safe and reliable. "In talks with the industry, they've stressed that over and over. Our system has a built-in redundancy—instead of one loop of coolant into each server, we have two, and we have duplicate pumps as well."


Citation: Startup focuses on reliable, efficient cooling for computer servers (2014, March 20) retrieved 18 October 2019 from


User comments

Mar 21, 2014
Having built similar systems, I'd say the picture in the article doesn't look like anything other than a regular liquid water system. It has no double redundancy loops—everything is in-line—and the plastic pipes are not strong enough to contain a refrigerant, which would have to be pressurized to stay liquid at room temperature.

That is a regular commercial liquid cooling system for a PC put inside a server case. It cannot utilize phase-changing properties of the liquid because the cooling blocks are in series, so any significant heat exchange by boiling would fill the last block with gas bubbles and diminish its cooling capacity. The liquid would have to flow fast enough to stay liquid through all blocks and not reach its boiling point within.

Mar 21, 2014
Their choice of refrigerant would be intriguing, because the system cannot be designed to work at a temperature below the dew point for obvious condensation reasons, and it shouldn't be designed to operate below ambient temperature because it would needlessly cool the environment and use up energy to pump heat that is not coming from the computers themselves.

The liquid they use should therefore have a boiling point between 30-50 C at or very near atmospheric pressure if they intend to put it through those kinds of plastic pipes, yet it has to be something that isn't corrosive or a solvent to plastic, and is non-conductive even when contaminated.

I don't know of any that fit the bill. Acetone, ether and methyl acetate would work, but they're all solvents and would eat the plastics away.

Apr 03, 2014
All, thanks for your interest in our work. Some clarifications:
1) This is NOT spray cooling. We have some well-referenced publications in that area, but it is not practical for compact cooling systems. We use boiling jets; they are much more efficient and compact.
2) We use R-245fa right now. It nicely meets the requirements for cooling silicon devices.
3) The prototype shown is indeed NOT redundant, but we do have fully redundant modules. What is shown is a dual processor, quad GPU system cooled with 0.5 lpm of fluid flow... 1000 W pulled out through a 1/4" tube.
4) Yes, we design the system to encourage evaporation. In this way, we pull out the 1000 W while keeping all of the processors within 3 deg C of each other.
5) This prototype has been running for 1 year now with the GPUs pounding at 100%. We have a second prototype system cooling 20 servers used for medical research. We have been running for 6 months now 24/7. This offsets about 4000 W of cooling from the A/C.

Apr 10, 2014
Yes, we design the system to encourage evaporation. In this way, we pull out the 1000 W while keeping all of the processors within 3 deg C of each other.

As I see it, when you have enough components in-line and the fluid is moving relatively slowly, the first few heat exchangers boil the liquid into gas, and the subsequent heat exchangers become less efficient with gas bubbles flowing through them, because the bubbles are insulating. With enough heat exchangers, the final one would receive nothing but gas and would perform terribly.

If the flow rate is increased to the point that all heat exchangers are full of liquid, then there's not enough temperature rise in the loop to cause evaporation and all the cooling happens by convection into the liquid.

Mind you, it's pretty trivial to remove 1000 Watts without a phase changing liquid. It's more difficult to cool the liquid itself afterwards.

Apr 10, 2014
We use R-245fa right now.

R-245fa has a boiling point of 15 C at normal pressure, so again, it would have to be pressurized to remain a liquid at room temperature. The picture doesn't look like it could handle any pressure, which still makes me think it's just water.

0.5 lpm of fluid flow... 1000 W pulled out through a 1/4" tube.

You need 5 liters per minute of liquid water through the same tube to remove 1000 W with a maximum dT of 3 degrees C. It's not that much more. That's actually pretty low-end as far as typical liquid cooling systems in personal computers are concerned. They're not limited by the component heat exchangers, but by the liquid-to-air heat exchanger to cool the water.
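That flow-rate figure can be sanity-checked with the sensible-heat energy balance Q = ṁ·c_p·ΔT (a rough sketch using standard water properties; the tube diameter doesn't enter the energy balance, only the flow rate does):

```python
# Flow of liquid water needed to absorb a given heat load with at most
# delta_t kelvin of temperature rise, from Q = m_dot * c_p * delta_t.
RHO_WATER = 1000.0   # kg/m^3, density of water
CP_WATER = 4186.0    # J/(kg*K), specific heat of water

def required_flow_lpm(watts, delta_t):
    mass_flow = watts / (CP_WATER * delta_t)        # kg/s
    return mass_flow / RHO_WATER * 1000.0 * 60.0    # convert to L/min

print(round(required_flow_lpm(1000.0, 3.0), 2))  # ~4.78, i.e. roughly 5 L/min
```

The calculation bears out the commenter's round number: about 4.8 L/min of water removes 1000 W at a 3 K rise.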

0.5 lpm of R-245fa would actually remove 2500 Watts if you let it evaporate completely.

Liquid water under low pressure to lower its boiling point would remove about 3700 Watts for the same amount. You can make it boil at 50 C at 1/10 bars.
