Liquid Submersion Cooling for Data Centers
Server Cooling in Data Centers: Liquid Submersion vs. Air Cooling
A method of cooling data center servers by submerging them in dielectric fluid, resulting in reduced energy use, peak demand, and infrastructure requirements compared with air cooling or liquid pipe-to-point cooling.
Item ID: 452
Sector:
Commercial
Energy System:
HVAC--Other HVAC Systems
Technical Advisory Group: 2013 Information Technology TAG (#8)
Average TAG Rating: 2.46 out of 5
TAG Ranking Date: 10/25/2013
TAG Rating Commentary:
- Most likely a limited audience for adoption of this technology. Could be a great complement for reuse of heat in a district or campus-type system.
- Emerging tech but is gaining traction; expensive to implement
- Should say ... submerging them in dielectric fluid.... (rather than oil). Also we use the word immersion rather than submersion.
- Not readily available
- Not an ET; these have been around for over 20 years, but there are huge market barriers against this ECM.
- I believe in the energy savings, but no IT manager will want this
Synopsis:
Green Revolution offers submersion cooling for data center servers by laying standard server racks horizontally in a container resembling a freezer case. The case is filled with clear, odorless, dielectric mineral oil, and the servers are mounted vertically in the bath. Mineral oil is selected for its high heat transfer capacity and because it is electrically non-conductive. The oil collects server heat and is pumped to a heat exchanger, where the heat is rejected. Standard servers from any OEM can be modified for submersion by removing cooling fans, encapsulating hard drives, and substituting indium foil for thermal grease on chip heat sinks. This is not an endorsement of Green Revolution, but they appear to be the clear leader in developing this technology as of November 2012.
Liquid cooling has been used in specialized applications such as the Cray supercomputer. Liquid immersion outperforms conventional air-to-air cooling: tests show liquid submersion uses 85% less cooling energy than air cooling and reduces total power costs by 35%. In new construction, an immersion system has a smaller footprint and costs about half the price of an air-cooled alternative. Iceotope now offers liquid-cooled servers in a modular design.
Peak power requirements are reduced for data centers using immersion systems; with air-cooled systems, power spikes occur in the summer, when electricity rates are highest. Secondary benefits of submerged servers include the complete elimination of dust contamination, fan maintenance costs, ambient noise from computer room air conditioners, and the overheating of servers from fan failure. Heat absorbed by the mineral oil is also easily recovered for additional uses such as space heating. The immersion system allows for high power density designs and does not require hot and cold aisles, chillers, supply and exhaust fans, insulation, or raised floors. The capacity of backup engine-generators and/or UPS systems can also be reduced. Substantial challenges include convincing server manufacturers to retool and convincing IT departments to accept pulling servers from a tub of oil for maintenance.
Baseline Example:
Baseline Description: Fans, chillers, cooling towers, pumps, piping
Baseline Energy Use: 760 kWh per square foot per year
Comments:
The typical baseline design supplies cold air to an underfloor plenum, with strategically located perforated floor panels in the 'cold' aisles and ceiling return grilles in the 'hot' aisles. From Energy Star Portfolio Manager, we find a flat-line EUI across the months of about 400 kBtu/sf/month (roughly 1,406 kWh/sf/year), with about 54% attributable to HVAC and cooling energy. Cooling energy requirements are thus about 760 kWh/sf/year.
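The baseline figure can be reproduced with a quick back-of-the-envelope calculation. The sketch below simply restates the arithmetic described above; the 3,412 Btu/kWh conversion factor is standard, and the other inputs come from the comments.

    # Baseline cooling energy for an air-cooled data center, per square foot.
    eui_kbtu_per_sf_month = 400   # flat-line EUI from Energy Star Portfolio Manager
    hvac_share = 0.54             # fraction of total energy attributable to HVAC/cooling

    total_kwh_per_sf_year = eui_kbtu_per_sf_month * 12 * 1000 / 3412   # ~1,406 kWh/sf/yr total
    cooling_kwh_per_sf_year = total_kwh_per_sf_year * hvac_share       # ~760 kWh/sf/yr cooling

    print(round(total_kwh_per_sf_year), round(cooling_kwh_per_sf_year))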
Note: Intel operated the Green Revolution cooling system in 2011 and 2012. Researchers found a power usage effectiveness (PUE) of 1.02 to 1.03, versus a typical PUE of 1.6 for most data centers (E-Source Top 20 Technologies and Trends of 2013).
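Those PUE observations line up with the savings claims in the synopsis. The sketch below is a rough, illustrative cross-check, assuming PUE is defined as total facility energy divided by IT equipment energy.

    # Rough cross-check: translate the observed PUE values into a total-energy saving.
    pue_air = 1.6         # typical air-cooled data center (E-Source, 2013)
    pue_immersion = 1.03  # upper end of the range Intel observed with Green Revolution

    total_savings = (pue_air - pue_immersion) / pue_air
    print(f"{total_savings:.0%}")   # ~36%, consistent with the ~35% total power reduction cited above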
Manufacturer's Energy Savings Claims:
"Typical" Savings: 40%
Savings Range: From 40% to 60%
Comments:
This ET does not use fans or chillers.
Best Estimate of Energy Savings:
"Typical" Savings: 40%
Low and High Energy Savings: 30% to 50%
Energy Savings Reliability: 5 - Comprehensive Analysis
Comments:
Energy Use of Emerging Technology:
456 kWh per square foot per year
What's this?
Energy Use of an Emerging Technology is based upon the following algorithm.
Baseline Energy Use - (Baseline Energy Use * Best Estimate of Energy Savings (either typical savings or the high end of the savings range))
Comments:
40% of baseline
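Applying the algorithm with the typical 40% savings estimate reproduces the figure above:

    # Emerging-technology energy use per the algorithm above.
    baseline_kwh_per_sf = 760   # Baseline Example
    typical_savings = 0.40      # best estimate of typical savings

    et_kwh_per_sf = baseline_kwh_per_sf - baseline_kwh_per_sf * typical_savings
    print(et_kwh_per_sf)        # 456.0 kWh per square foot per year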
Technical Potential:
Units: square foot
Potential number of units replaced by this technology: 4,362,704
Comments:
We have not been able to find accurate data for the square footage of data centers in the Northwest. The best, most up-to-date estimate of space in the US we could find is from DataCenterDynamics (DCD, 2014, Pg. 4). According to this report, the total "white space" in the US is 109,067,617 sf. To convert to the Northwest, we use a standard of 4% of national data, based on relative population. In this case, the Northwest probably has more than its share of data centers, so we could probably justify a higher number. However, we are not likely to be serving the mega data centers over 100,000 sf, so we should reduce the number. As a close approximation, we will stick with 4%, which gives a total floor space of non-mega data centers in the Northwest of 4,362,704 sf.
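The regional floor-area figure follows directly from those two numbers; the 4% share is the population-based assumption described above.

    # Northwest non-mega data center floor area, per the Comments above.
    us_white_space_sf = 109_067_617   # DataCenterDynamics, 2014
    nw_share = 0.04                   # regional share assumed from relative population

    nw_floor_area_sf = int(us_white_space_sf * nw_share)
    print(nw_floor_area_sf)           # 4,362,704 sf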
Regional Technical Potential:
1.33 TWh per year
151 aMW
What's this?
Regional Technical Potential of an Emerging Technology is calculated as follows:
Baseline Energy Use * Estimate of Energy Savings (either Typical savings OR the high range of savings) * Technical Potential (potential number of units replaced by the Emerging Technology)
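Carrying the baseline energy use, typical savings, and technical potential through that formula reproduces the totals above:

    # Regional technical potential, per the formula above.
    baseline_kwh_per_sf = 760           # Baseline Example
    typical_savings = 0.40              # best estimate of typical savings
    technical_potential_sf = 4_362_704  # Northwest floor area estimated above

    annual_kwh = baseline_kwh_per_sf * typical_savings * technical_potential_sf
    print(f"{annual_kwh / 1e9:.2f} TWh per year")   # ~1.33 TWh per year
    print(f"{annual_kwh / 8760 / 1000:.0f} aMW")    # ~151 average megawatts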
First Cost:
Installed first cost per: square foot
Emerging Technology Unit Cost (Equipment Only): $26.00
Emerging Technology Installation Cost (Labor, Disposal, Etc.): $0.00
Baseline Technology Unit Cost (Equipment Only): $13.00
Comments:
Traditional HVAC costs about $4,000 per ton of cooling, and one ton in a data center serves roughly 300 sf, so $4,000 / 300 sf ≈ $13/sf. The price listed above for the ET is an estimate, as web pricing is not yet available. Even if the ET costs twice as much as the baseline, there is a reasonable payback.
Cost Effectiveness:
Simple payback, new construction (years): 0.5
Simple payback, retrofit (years): 1.0
What's this?
Cost Effectiveness is calculated using baseline energy use, best estimate of typical energy savings, and first cost. It does not account for factors such as impacts on O&M costs (which could be significant if product life is greatly extended) or savings of non-electric fuels such as natural gas. Actual overall cost effectiveness could be significantly different based on these other factors.
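The listed paybacks can be approximately reproduced under one explicitly assumed input: the assessment does not state the electricity rate it used, so a retail rate of roughly $0.08/kWh is assumed below purely for illustration. New construction is charged only the incremental cost over the baseline ($26 - $13 = $13/sf), while retrofit is charged the full ET cost ($26/sf).

    # Simple payback sketch. The $0.08/kWh rate is an ASSUMPTION for illustration;
    # the assessment does not state the rate it used.
    annual_savings_kwh_per_sf = 760 * 0.40   # ~304 kWh/sf/yr saved
    electricity_rate = 0.08                  # $/kWh, assumed
    annual_savings_dollars = annual_savings_kwh_per_sf * electricity_rate

    new_construction_payback = (26.00 - 13.00) / annual_savings_dollars  # incremental cost
    retrofit_payback = 26.00 / annual_savings_dollars                    # full ET cost

    print(f"{new_construction_payback:.1f} yr, {retrofit_payback:.1f} yr")
    # -> ~0.5 yr and ~1.1 yr, close to the 0.5- and 1.0-year values listed above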