October 19, 2009, 4:27 PM — SOUTHBURY, Conn. -- If you had a blank check, access to IBM's latest products and its best talent, and your task was to renovate a 2,000-square-foot legacy data center, the result would be IBM's sparkling showcase in Southbury, Conn.
Enterprise data center managers may not be able to replicate everything IBM has done in what data center director Peter Guasti calls a "living lab," but there are lessons here for everyone.
The Southbury data center was built for the 1996 Olympics and was then turned over to IBM's CIO Technology and Innovation group to host internal applications. By around 2006, it was simply running out of room.
So, Guasti and his team built a new data center in the existing space in 10 months without any interruption in service. They increased server capacity fourfold, allowing the workload of three other data centers to be shifted to Southbury, while keeping power consumption flat.
The data center supports a variety of internal company functions, including IBM corporate wikis and blogs, Second Life, the Real Time Translation Service, the Media Library, and a Technology Adoption Program through which employees working on innovative projects can access data center resources.
Running hot and cold
By now, the concept of hot and cold aisles is pretty well understood: You pump cold air up from the floor into the front of the server racks and draw the hot exhaust out the back and into the ceiling vents, where it is fed back through the cooling system and returned beneath the floor. But Guasti and his team wanted to get more granular.
The first step was conducting a complete thermal analysis of the data center, including the areas above the ceiling and below the floors to assess air flows.
In all, 100 temperature sensors were installed, some at the rack level, some in the ceiling. The goal was to identify hot and cold spots, to fix whatever created those areas, and to continually monitor and control temperatures. There's also a separate set of sensors and control systems for water flow and water temperature.
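The hot- and cold-spot survey described above can be sketched as a simple outlier check: compare each sensor's reading against the room average and flag large deviations. The sensor names, readings, and 3-degree threshold below are invented for illustration; IBM's actual monitoring software is not described in the article.

```python
from statistics import mean

# Hypothetical sketch of a hot/cold-spot survey: flag any sensor whose
# reading deviates from the room average by more than a threshold.
# Sensor names, readings, and the 3.0-degree threshold are assumptions.

def find_outliers(readings, threshold=3.0):
    """Split sensors into hot spots and cold spots relative to the average."""
    avg = mean(readings.values())
    hot = {s: t for s, t in readings.items() if t - avg > threshold}
    cold = {s: t for s, t in readings.items() if avg - t > threshold}
    return hot, cold

readings = {"rack-01": 22.1, "rack-02": 21.8, "rack-17": 28.4, "floor-05": 17.9}
hot, cold = find_outliers(readings)
print("hot spots:", hot)    # rack-17 runs well above the average
print("cold spots:", cold)  # floor-05 runs well below it
```

With continuous readings from 100 sensors, the same comparison run on a schedule is what turns a one-time thermal analysis into ongoing monitoring.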
The data center features two 30-ton Emerson air conditioners. Data from the sensors feeds into a programmable logic controller (PLC) that automatically regulates the air and water temperatures. Efficient, variable-speed fans in the floor make subtle adjustments to keep temperatures at the desired levels.
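The control scheme described here, sensor readings feeding a controller that trims fan speed, can be sketched as a basic proportional loop. The setpoint, gain, and readings below are illustrative assumptions, not IBM's actual parameters, and a real PLC would drive vendor-specific I/O rather than Python.

```python
# Illustrative proportional control loop, loosely modeled on the PLC setup
# described above. SETPOINT_C and GAIN are invented values for the sketch.

SETPOINT_C = 22.0   # assumed target cold-aisle temperature (Celsius)
GAIN = 5.0          # assumed gain: percent fan speed per degree of error

def fan_speed(current_temp_c, base_speed=50.0):
    """Return a fan speed (0-100%) proportional to the temperature error."""
    error = current_temp_c - SETPOINT_C
    speed = base_speed + GAIN * error
    return max(0.0, min(100.0, speed))  # clamp to the fan's valid range

def adjust_all(sensor_readings):
    """Map each sensor's reading to a fan-speed command for its floor tile."""
    return {sensor: fan_speed(temp) for sensor, temp in sensor_readings.items()}

readings = {"rack-01": 24.5, "rack-02": 21.0, "ceiling-07": 27.3}
print(adjust_all(readings))
```

Running the loop continuously is what lets the floor fans make the "subtle adjustments" the article mentions: a rack running a couple of degrees warm gets a modest boost in airflow rather than a full-blast correction.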