Are you cool with your data center cool?

Schneider Electric offers simple tips for keeping data centers cool and efficient in a new white paper.

With all the buzz about Facebook kumbaya-ing with Greenpeace (the company announced earlier this month that it will collaborate with the environmental group on clean and renewable energy), and more and more companies heading north to chilly climes to keep their data center operations efficiently cooled and green, you’d think everyone had figured out the best way to cool the data center. Fact is, there are still plenty of folks operating with what they’ve got and no near-term plans to make big changes. But small changes still can and do make a difference.

According to Schneider Electric, basic design and configuration flaws keep a lot of data centers from achieving their optimal cooling capacity and prevent them from delivering cool air where it is needed. Recent increases in the power density of newer IT equipment are also testing existing data center design limits. In a recent white paper, the global energy management company says typical mistakes fall into five areas: airflow within the rack itself; the layout of the racks; the distribution of load; the layout of air delivery and return vents; and the temperature and humidity settings of the CRAC units.

The first, airflow within the rack, relates to whether appropriately conditioned air is presented at the equipment air intake and whether airflow into and out of the equipment is unrestricted. According to Schneider Electric, the two key problems are that CRAC (computer room air conditioner) air gets mixed with hot exhaust air before it reaches the equipment intake, and/or the equipment airflow is blocked by obstructions. For the former, the fix is often simple: blanking panels, which provide a natural barrier that lengthens the air recirculation path and reduces the equipment’s intake of hot exhaust air.
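To see why even a little recirculation matters, consider a simple mixing picture (an illustration on my part, not a model from the white paper): if some fraction of hot exhaust air finds its way back to the front of the rack, the intake temperature is roughly a weighted average of the CRAC supply temperature and the exhaust temperature. A minimal Python sketch, with assumed temperatures and an assumed recirculation fraction:

# Illustrative sketch (assumed numbers, not from the white paper): estimate the
# equipment intake temperature when a fraction of hot exhaust air recirculates.
def intake_temp_c(supply_c, exhaust_c, recirc_fraction):
    # Weighted mix of CRAC supply air and recirculated exhaust air.
    return (1 - recirc_fraction) * supply_c + recirc_fraction * exhaust_c

# With 22°C supply air and 35°C exhaust air, 25% recirculation raises the
# intake temperature by more than 3°C:
print(intake_temp_c(22.0, 35.0, 0.25))  # -> 25.25

Blanking panels attack exactly that recirculation fraction, closing off the open U spaces through which exhaust air sneaks back around to the equipment intake.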

Interestingly, lots of data centers omit blanking panels, despite recommendations from all major IT equipment manufacturers.

Rack layout is another critical design element that affects cooling. A good layout ensures that air of the appropriate temperature and quantity is available at the rack and, much like blanking panels, separates the hot exhaust air from the equipment intake air. Schneider Electric says that by placing racks in rows and reversing the direction that alternate rows face (the hot-aisle/cold-aisle design), recirculation can be dramatically reduced. But there are folks who still put racks in rows that all face the same direction – a design flaw that causes significant recirculation and will most likely create hot spots.
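Put concretely, hot-aisle/cold-aisle just means adjacent rows face opposite directions, so intakes share cold aisles and exhausts share hot aisles. A tiny, hypothetical check (the row directions are mine, not Schneider Electric’s):

# Illustrative sketch (hypothetical data): in a hot-aisle/cold-aisle layout,
# every adjacent pair of rows faces opposite directions.
def is_hot_aisle_cold_aisle(row_facings):
    return all(a != b for a, b in zip(row_facings, row_facings[1:]))

print(is_hot_aisle_cold_aisle(["east", "west", "east", "west"]))  # True
print(is_hot_aisle_cold_aisle(["east", "east", "east", "east"]))  # False: all rows face the same way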

Load distribution is another well-known issue. Where loads are located can stress data center performance, and hot spots arise where high-density, high-performance servers are packed into one or a few racks. To counteract those hot spots, operators often lower temperature set points or add CRAC units; it’s better to spread the load out where feasible.
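As a rough way to spot where that packing is happening, you can compare each rack’s power draw to the room average – a crude proxy for hot-spot risk. The rack names, wattages, and threshold below are assumptions for illustration, not figures from Schneider Electric:

# Illustrative sketch (assumed numbers): flag racks drawing far more power than
# the room average, a rough proxy for likely hot spots.
rack_watts = {"R01": 3000, "R02": 3200, "R03": 12000, "R04": 2800}

average = sum(rack_watts.values()) / len(rack_watts)
threshold = 1.5 * average  # assumed cutoff for a "packed" rack

hot_spot_candidates = [name for name, watts in rack_watts.items() if watts > threshold]
print(hot_spot_candidates)  # -> ['R03'], a candidate for spreading load elsewhere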

Finally, the layout of air delivery and return vents is critical. Air conditioning performance is maximized when the CRAC output air temperature is highest, according to Schneider Electric, and in an ideal-world data center with zero recirculation, the CRAC output temperature would be the same 68-77°F (20-25°C) desired for the computer equipment. But this doesn’t happen in the real-world data center, so the CRAC set point typically has to be set lower than the desired equipment intake temperature in order to maintain it. Although the CRAC temperature set point is dictated by the design of the air distribution system, the vendor notes, the humidity may be set to any preferred value. Setting the humidity too high, however, detracts from the air cooling capacity of the CRAC unit, which has to power up dehumidification functions that cut into its cooling ability. Humidifiers then have to be added to replace the water removed from the air by dehumidification – and humidifiers are a significant source of heat, which in turn needs to be cooled and further detracts from the capacity of the CRAC unit.
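Using the same simple mixing assumption as the earlier sketch (again, my illustration rather than Schneider Electric’s method), you can back out how far below the desired intake temperature the supply air has to be set once recirculation is in the picture. Numbers are assumed:

# Illustrative sketch (assumed numbers): back out the CRAC supply set point
# needed to hold a desired intake temperature when some exhaust air recirculates.
# Uses the same mixing model as above: intake = (1 - r) * supply + r * exhaust.
def required_supply_c(target_intake_c, exhaust_c, recirc_fraction):
    return (target_intake_c - recirc_fraction * exhaust_c) / (1 - recirc_fraction)

# To hold intakes at 25°C with 35°C exhaust air and 20% recirculation,
# the supply air has to be set noticeably cooler than 25°C:
print(round(required_supply_c(25.0, 35.0, 0.20), 1))  # -> 22.5

The payoff of reducing recirculation, per Schneider Electric, is that the supply temperature can stay higher – which is where air conditioning performance is maximized.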

Schneider Electric offers a lot of other tips in the white paper, titled Power and Cooling Capacity Management for Data Centers, which can be viewed here.
