Cold aisle containment

From Wikipedia, the free encyclopedia
Data center with cold aisle containment

Cold aisle containment (also known as cold aisle enclosure) is a measure in data centers, within the scope of Green IT, for optimizing cooling. It is achieved by strictly separating the hot air areas from the cold air areas. This separation of the cold aisle from the hot aisle is accomplished by completely enclosing the racks lining the cold aisle with aluminum profiles and fixed ceiling and wall panels. Alternatively, a plastic curtain can be drawn over the cold aisle.

Information on the relevance of cold aisle containment can be found in the BITKOM guidelines for energy-efficient data centers and in the eleven best practices for cooling optimization in the data center published by Gartner Inc.

Problem

Cold aisle with facing rack fronts and cold air supplied through the floor panels, but without containment

The rows of racks in today's data centers are mostly arranged according to the cold and hot aisle principle, i.e. the rack fronts face each other across one aisle, while the backs of the racks form a parallel aisle.

The cooling air generated by the cooling systems, e.g. computer room air conditioning (CRAC) units, is usually directed through the raised floor beneath the room to the outlets in front of the rack fronts. This area is known as the cold aisle. The cool air is supplied at a generation temperature below 18 °C with a high volume flow. Heat sources and obstacles in the raised floor, such as cable bundles or pipelines, warm the cooling air to approx. 20 to 22 °C through heat transfer and flow friction by the time it emerges into the actual cold aisle at the outlets in front of the racks.

The hardware in the lower rack regions is sufficiently cooled at up to 22 °C. The hardware draws in the rising cooling air and discharges it as warm air at the back of the rack into the aisle known as the hot aisle.

This creates a formal thermal separation between the cold and warm areas, which is intended to prevent the discharged warm air from being drawn in again by the hardware and thereby forming "warm nests" or "hot spots" that lead to overheating of the hardware and the associated system malfunctions or even failures.

Without containment, however, hot air flows back from the hot aisle into the cold aisle and mixes with the cooling air, especially in the upper areas, producing temperatures there of 30 °C or more.
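The effect of this recirculation can be illustrated with a simple mass-weighted mixing estimate. This is only a sketch: the 38 °C exhaust temperature and the 50 % recirculation fraction are assumed example values, not figures from the sources cited here.

```python
def mixed_air_temperature(t_cold_c, t_hot_c, hot_fraction):
    """Mass-weighted mixing temperature of two air streams.

    Assumes both streams have the same specific heat capacity, so the
    result is a simple linear blend of the two temperatures.
    """
    return (1.0 - hot_fraction) * t_cold_c + hot_fraction * t_hot_c

# Example (assumed values): 22 °C supply air blended with 38 °C exhaust
# at a 50 % recirculation fraction already reaches 30 °C at the top of
# an uncontained cold aisle.
print(mixed_air_temperature(22.0, 38.0, 0.5))  # 30.0
```

With zero recirculation the top of the aisle stays at the supply temperature, which is exactly what the containment described below is meant to achieve.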

Construction

There are currently two options for enclosing the cold aisle. In a complete structural enclosure, the open rows of racks are closed off by side and ceiling panels that are firmly attached to the rack profiles or mounted on additional supports. Since access to the racks must be maintained, doors are usually inserted into the walls at the beginning and end of the rack aisle. The material used must carry the appropriate certification so that it poses no additional fire hazard in the data center. Cold aisle containment can also be implemented using fire-protection-certified plastic curtains, which are attached to the ceilings, walls, or rack profiles by a fastening mechanism.

Since the complete structural enclosure requires a higher investment, its return on investment occurs later than with the comparatively inexpensive curtain variant.

The efficiency of both variants is almost identical. Measurements have shown that, at constant cooling capacity, the difference between the temperatures in the hot and cold aisles is between 10 and 15 °C.

Since unoccupied rack units in the individual racks also offer a path for the warm air to flow into the cold area, it is advisable to seal these free spaces airtight with sealing sheets or blanking panels made of plastic. According to Gartner Inc., if this additional seal is omitted, the temperature in the cold aisle can rise by up to 5.6 °C.

Functions

Cold aisle containment prevents the discharged warm air from flowing back into the cooled area through gaps or over the tops of the rack cabinets. This recirculation can lead to thermal feedback that gradually raises the temperature in the cold aisle. The temperature rise can be compensated for by increasing the output of the cooling systems, which, however, considerably increases energy consumption.

According to BITKOM, every one-degree decrease in the ambient temperature in the range between 22 and 27 °C (the so-called comfort temperature for hardware in data centers) requires up to five percent more cooling capacity from the cooling systems. Conversely, this also means that purely mechanical measures that lower the temperature can reduce the required cooling capacity of the systems accordingly.
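The BITKOM rule of thumb above can be turned into a rough back-of-the-envelope estimate. This is an illustrative sketch only: the linear 5 %-per-degree model, the function name, and the example setpoints are assumptions, and the rule is stated as "up to" five percent, so real savings may be smaller.

```python
def relative_cooling_power(setpoint_c, baseline_c=22.0, pct_per_degree=0.05):
    """Estimated cooling power relative to the baseline setpoint (1.0 = 100 %).

    Linearly applies the cited rule of thumb: each 1 °C the cold aisle
    setpoint rises above the baseline reduces required cooling capacity
    by up to pct_per_degree (5 % by default). Only meaningful within the
    22-27 °C range discussed in the article.
    """
    delta = setpoint_c - baseline_c
    return 1.0 - pct_per_degree * delta

# Raising the setpoint from 22 °C to 25 °C could cut cooling power by ~15 %:
print(f"{relative_cooling_power(25.0):.2f}")  # 0.85
```

This is why containment pays off: by keeping recirculated hot air out of the cold aisle, it lets the setpoint be raised without creating hot spots.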

According to the recommendations of ASHRAE (American Society of Heating, Refrigerating and Air-Conditioning Engineers), the temperature in the cold aisle can be raised to a value between 20 and 25 °C with cold aisle containment.

According to a study by the Federal Ministry for the Environment, Nature Conservation and Nuclear Safety (BMU) and the Borderstep Institute, the cooling capacity of the systems can be reduced by up to 35%, depending on the size and spatial conditions of the data center.

Individual evidence

  1. BITKOM guide: Energy efficiency in data centers, 2008. https://www.bitkom.org/Bitkom/Publikationen/Leitfaden-Energiedienstleistungen-in-Rechenzentren.html
  2. Gartner: 11 Best Practices for Increasing Energy Efficiency in the Data Center, 2009 (archived copy, memento of December 17, 2008 in the Internet Archive).
  3. Gartner: Eleven best practices save one million kilowatt hours in the data center, 2008. http://www.datacenter-insider.de/themengebiete/physikalisches-umfeld/energieversorgung/articles/154376/Gartner (page no longer available; search in web archives).
  4. BMU and Borderstep report: Energy-efficient data centers – best practice examples from Europe, USA and Asia, 2008 (archived copy, memento of February 3, 2014 in the Internet Archive).