Direct Server Cabinet Cooling
Data Center HVAC System Design: Servers Contained Within a Cabinet vs. Servers Uncontained Within an Open Room
A special cabinet that encases servers so only the servers are cooled, not the entire server room.
Item ID: 68
HVAC--Other HVAC Systems
Technical Advisory Group: 2009 HVAC TAG (#2)
Technical Advisory Group: 2013 Information Technology TAG (#8)
Average TAG Rating: 3.15 out of 5
TAG Ranking Date: 10/25/2013
TAG Rating Commentary:
- Very expensive and typically uses MORE ENERGY (if done in addition to air cooling rather than in place of it, which is how it is typically used). This treats the symptom of a hot spot caused by poor data center planning. This occurs when a rack's power density exceeds the data center's cooling capacity (watts per square foot), creating a hot spot. This can be avoided by using less dense rack loading. ONLY needed if space constrained, which is rare.
- Most likely to be used in a new build out. I would estimate that this technology has limited applicability in the retrofit market because of the reluctance to disrupt an operational data center.
- Practical; issues with water in a data center (end user concerns); costly to implement.
- Lots of resistance to this in the industry. Also, not clear on economics. If the numbers can be made to work has great potential.
- Most direct server cabinet cooling systems are not ETs, but some are. We provide incentives for this in new construction projects but not retrofit projects.
For ET #158, same technology:
- Practical; issues with water in a data center (end user concerns); costly to implement
- Sounds like a good idea if institutional barriers and high expense can be overcome.
- This is an ET.
- Very difficult to implement in smaller DCs.
- These savings are always shown compared to simple CRAC units, but I don't think this is much better than 100% outside-air (OA) cooling in the Pacific NW with really good airflow management.
Convection cooling with air is currently the predominant method of heat removal in most data centers. Air handlers force large volumes of cooled air under a raised floor (the deeper the floor, the lower the friction resistance) and up through perforated tiles in front of (or under) computer racks. Fans within the server racks or “blade cages” distribute the cool air across the electronics that radiate heat, perhaps with the help of heat sinks or heat pipes.
In-rack cooling utilizes a dedicated water-cooled fan-coil that is integral to the server rack. The fan is located at the bottom of the rack so the cool air blows up through the server and out the top. Depending on the product, chilled water or refrigerant is used as the cooling medium. Though there are exceptions, the majority of products do not bring liquid into the actual server rack; the air conditioner, with its water connections, is housed in an adjacent but separate enclosure, and the equipment at the rack level is still air cooled. This system easily accommodates racks drawing 4-7 kW.
In-rack cooling is a very precise and efficient means of cooling servers in server rooms, providing cooling directly where it is needed without moving a large volume of air, thus saving fan energy. In some products, instead of constant speed fans, a system of sensors monitors temperature and ramps fan speed and water flow up or down accordingly. The idea is to reduce operating costs while improving effectiveness. Products have been available since 2007 and the installed base ranges from small computer rooms, to enterprise data centers, to high density wiring closets.
Baseline Description: Chilled air supplied underfloor through perforated floor panels, ceiling return
Baseline Energy Use: 810 kWh per year per square foot
As described above, convection cooling with air is the predominant heat-removal method: air handlers force cooled air under a raised floor and up through perforated tiles at the computer racks, where fans distribute it across the electronics. The warmed air rises to the ceiling, where it is returned to the computer room air handlers to be re-cooled.
Baseline energy use and energy savings are based on the energy use of a "typical" data center, as defined as standard by the E3T IT TAG team. The energy use of a full data center is 1,500 kWh/sf/yr; the baseline for this technology is the HVAC portion of that, which is 54%, or 810 kWh/sf/yr (WSU EEP, 2013).
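The arithmetic behind the 810 kWh/sf/yr baseline can be checked in a few lines; all values are taken directly from the paragraph above:

```python
# Baseline HVAC energy use, per the E3T IT TAG standard figures (WSU EEP, 2013).
total_use_kwh_per_sf_yr = 1500   # full "typical" data center energy use
hvac_fraction = 0.54             # HVAC share of total data center energy

baseline_kwh_per_sf_yr = total_use_kwh_per_sf_yr * hvac_fraction  # ~810 kWh/sf/yr
print(baseline_kwh_per_sf_yr)
```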
Manufacturer's Energy Savings Claims:
"Typical" Savings: 30%
Savings Range: From 10% to 80%
Savings are stated three ways: fan energy savings, chiller energy savings, or both. A manufacturer's estimate may represent either or both of these primary energy users in data center temperature management. For example, "When combined with Motivair Free Cooling chiller annual energy savings up to 93% are possible versus traditional CRAC systems." That 93% savings on HVAC energy includes a water-side economizer. Similarly, information from 42U estimates 20% to 50% savings, and as high as 80%, again including an economizer in the maximum energy savings potential. For the purpose of representing comparative savings, only the fan and chiller are counted, resulting in a manufacturer's purported savings of 30% (42U, 2013).
Best Estimate of Energy Savings:
"Typical" Savings: 15%
Low and High Energy Savings: 10% to 50%
Energy Savings Reliability: 4 - Extensive Assessment
Cooling equipment is typically controlled to maintain room temperature using averaging wall thermostats. The cold air is typically supplied under the floor, and operators must locate the perforated floor panels to direct the cold air where it is needed most. This loose temperature control overcools some racks, since the cold air is not directed only at the hot spots. In other words, for a given cfm, a lightly loaded server's discharge temperature, T2, will be colder than that of a 'hot' server, meaning the lightly loaded server is cooled more than necessary. If the goal is only to keep each server below 80 degrees, then supplying just enough cold air to maintain that discharge air temperature saves energy. From studies such as (Henry Coles, 2010-10-26), we find in-rack cooling provides about 15% energy savings.
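The discharge-temperature argument can be sketched with the standard sensible-heat relation for air, Q [BTU/h] ≈ 1.08 × cfm × (T2 − T1). The rack loads, airflow, and supply temperature below are illustrative assumptions, not figures from this assessment:

```python
# Sensible-heat relation for air: Q [BTU/h] ~= 1.08 * cfm * (T2 - T1).
# Illustrative numbers only: two racks get the same airflow and supply
# temperature, so the lighter-loaded rack discharges colder air, i.e. it
# is cooled more than necessary by a room-level system.
def discharge_temp_f(load_watts, cfm, supply_temp_f):
    btu_per_hr = load_watts * 3.412            # convert watts to BTU/h
    return supply_temp_f + btu_per_hr / (1.08 * cfm)

hot_rack_t2 = discharge_temp_f(7000, 1000, 60)    # hypothetical 7 kW rack
light_rack_t2 = discharge_temp_f(4000, 1000, 60)  # hypothetical 4 kW rack
print(hot_rack_t2, light_rack_t2)  # the lighter rack's T2 is colder
```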
The savings depend on many factors. For example, if a data center has hot spots, the central system will overcool the non-hot areas to meet the load of the hot area. Larger data centers will benefit more from this product than smaller ones. Another factor that affects savings is the diligence of the data center operators in balancing airflow to the needs of each rack. Other factors include server loading, fan and chiller equipment efficiency, and ambient temperatures.
Energy Use of Emerging Technology:
688.5 kWh per square foot per year
Energy use of an emerging technology is based upon the following algorithm:
Baseline Energy Use - (Baseline Energy Use * Best Estimate of Energy Savings), using either the typical savings or the high end of the savings range.
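Plugging in the baseline and typical-savings figures above reproduces the stated emerging-technology energy use:

```python
# ET energy use = baseline - (baseline * typical savings), per the algorithm above.
baseline_kwh_per_sf_yr = 810.0   # HVAC baseline (WSU EEP, 2013)
typical_savings = 0.15           # best estimate of typical savings

et_use_kwh_per_sf_yr = baseline_kwh_per_sf_yr * (1 - typical_savings)  # ~688.5
print(et_use_kwh_per_sf_yr)
```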
Potential number of units replaced by this technology:
We have not been able to find accurate data for the square footage of data centers in the Northwest. The best, most up-to-date estimate of US space we could find is from DataCenterDynamics (DCD, 2014, p. 4). According to this report, the total "white space" in the US is 109,067,617 sf. To convert to the Northwest, we apply a standard factor of 4% of the national figure, based on relative population. The Northwest probably has more than its share of data centers, so a higher number could be justified; however, we are not likely to serve the mega data centers over 100,000 sf. As an initial approximation, we use 4%, which gives a total floor space of non-mega data centers in the Northwest of 4,362,704 sf.
Regional Technical Potential:
0.53 TWh per year
Regional Technical Potential of an Emerging Technology is calculated as follows:
Baseline Energy Use * Estimate of Energy Savings (either Typical savings OR the high range of savings) * Technical Potential (potential number of units replaced by the Emerging Technology)
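Using the figures above, the regional technical potential works out as follows (US floor space from the DCD estimate, scaled by the 4% Northwest factor):

```python
# Regional technical potential = baseline * savings * NW floor space.
us_white_space_sf = 109_067_617   # total US "white space" (DCD, 2014)
nw_fraction = 0.04                # NW share, based on relative population

nw_floor_space_sf = us_white_space_sf * nw_fraction   # ~4,362,704 sf

baseline_kwh_per_sf_yr = 810.0
typical_savings = 0.15
potential_kwh_per_yr = baseline_kwh_per_sf_yr * typical_savings * nw_floor_space_sf
potential_twh_per_yr = potential_kwh_per_yr / 1e9     # ~0.53 TWh per year
print(potential_twh_per_yr)
```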
Installed first cost per: square foot
Emerging Technology Unit Cost (Equipment Only): $100.00
Emerging Technology Installation Cost (Labor, Disposal, Etc.): $0.00
Baseline Technology Unit Cost (Equipment Only): $70.00
The baseline has no enclosures around the servers and is estimated at $70/sf, per RS Means. The cabinets are estimated to add $30/sf, covering the cabinets, controls, additional piping, etc. A raised floor may still be preferred so that a leak cannot occur over the servers. Installation can occur in stages.
Simple payback, new construction (years): 2.7
Simple payback, retrofit (years): 9.1
Cost Effectiveness is calculated using baseline energy use, best estimate of typical energy savings, and first cost. It does not account for factors such as impacts on O&M costs (which could be significant if product life is greatly extended) or savings of non-electric fuels such as natural gas. Actual overall cost effectiveness could be significantly different based on these other factors.
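The stated paybacks are consistent with the costs above under an assumed electricity rate of roughly $0.09/kWh; the rate is an assumption here, not stated in the source. New construction pays back only the $30/sf incremental cost, while retrofit must recover the full $100/sf:

```python
# Simple payback check. The electricity rate is an ASSUMPTION (~$0.09/kWh),
# not a figure from this assessment.
rate_usd_per_kwh = 0.09
annual_savings_kwh = 810.0 * 0.15                      # 121.5 kWh/sf/yr
annual_savings_usd = annual_savings_kwh * rate_usd_per_kwh

new_construction_payback = 30.0 / annual_savings_usd   # incremental cost only
retrofit_payback = 100.0 / annual_savings_usd          # full installed cost
print(round(new_construction_payback, 1), round(retrofit_payback, 1))
```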
This ET is a special cabinet that encases servers so that only the servers are cooled, not the entire server room. Each server rack receives only as much cooling as needed to maintain its maximum temperature. A few years ago, it was not uncommon to find IT managers of private server rooms very averse to trying something new, willing to accept the inefficiencies of traditional methods in favor of reliability and familiarity. With growing recognition and support for this technology, and documented energy savings and reliability, that trend is slowly changing.
As server racks get denser and cooling demands rise, this ET makes a good alternative to adding more computer room air conditioning (CRAC) and chiller capacity.
Standard practice is to supply cool air under the floor and direct the cool air up through perforated floor panels that may be strategically located at the hot spots. As the cool air picks up heat, it rises to the ceiling and is returned to the air handler to be recooled.
Available from more than one manufacturer, this concept has been on the market for about a decade. The product is conducive to retrofit in stages, partitioning the space as the whole room is upgraded.
If the IT manager accepts routing chilled-water piping over the servers, the cost of a raised floor can be saved. Also, since the airflow is directed through the cabinets, the above-ceiling space for return-air ductwork might be somewhat smaller. We estimate the savings would be about 15-20%, and equipment is usually sized with a safety factor of that magnitude; engineers may therefore be comfortable downsizing the primary equipment, saving on the first cost of the chilled-water system.
End User Drawbacks:
This product allows tight control of temperatures; consequently, it has many sensors that must be monitored and calibrated regularly.
Operations and Maintenance Costs:
Baseline Cost: $0.00
per: square foot per year
Emerging Technology Cost: $2.00
per: square foot per year
These cabinets come with top, side, door, or slide-out fan trays that need to be maintained. All primary cooling equipment is the same for this ET and the baseline. Keeping the controls calibrated takes roughly one week, twice a year, for a typical 5,000 sf data center; this is not expected to create jobs, but rather to be absorbed into the daily routine of existing personnel. Sensors, actuators, and server fans will need replacement, which existing operators can easily handle; identifying a failed or failing component would be part of the training needed for on-site operators.
Anticipated Lifespan of Emerging Technology: 20 years
With care, the cabinets and associated controls can last for decades.
Computer room air conditioners (CRAC), such as those from Liebert Corporation, Data Aire Inc., etc. These units are spaced around the room to supply cold air under the floor, using perforated floor panels to direct the cold air at the hot spots.
Reference and Citations:
- Data Center Rack Cooling with Rear-door Heat Exchanger. Energy Efficiency & Renewable Energy.
- Managing Energy Costs in Data Centers.
- Rack Cooling Effectiveness in Data Centers and Telecom Central Offices: The Rack Cooling Index (RCI). American Society of Heating, Refrigerating and Air-Conditioning Engineers.
- A Comparison of Room-, Row-, and Rack-Based Data Center Cooling Products.
- Dell Data Center Infrastructure.
- Demonstration of Alternative Cooling for Rack-Mounted Computer Equipment. Lawrence Berkeley National Laboratory.
- Improving Data Center Efficiency with Rack or Row Cooling Devices. Energy Efficiency & Renewable Energy.
- What is the Energy Smart Data Center Project Researching?
- Energy Star for Data Centers.
- Standard Energy Usage Numbers for E3TNW. Washington State University Energy Program.
- Global Data Center Space 2013.