Facebook Shows How to Save $230K in Your Data Center

Kludge up the A/C like you would at your house if you didn't want to fix it right.


Want to hear something amazing? In one 56,000-square-foot data center, Facebook was able to save $230,000 per year in cooling costs just by changing the way the air flowed across the hot equipment.

Do you know what that means?


That whoever laid out the servers and cooling in your data center was probably an idiot. Or at least was not being, um, systematic about the whole process.

A few years ago I did a series of stories on how to build more power-efficient data centers and discovered -- from data center and facilities managers -- that most data centers are set up for heating and cooling as if the job had been given to a monkey with one of those genetic conditions that keeps it from feeling heat or cold.

Ninety percent of data centers had more cooling capacity than they needed -- an average of 2.6 times what they needed -- and still had hot spots and under-cooled hardware, according to studies from data-center-optimization consultancy the Uptime Institute.
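To put that 2.6x figure in concrete terms, here's a minimal back-of-the-envelope sketch. The room size and wattages are hypothetical, not numbers from the Uptime Institute studies:

```python
# Illustrative arithmetic only: how "2.6 times more cooling than needed" shakes out.
# The kW figures below are made up for the example.

def overprovisioning_ratio(installed_cooling_kw: float, it_heat_load_kw: float) -> float:
    """Ratio of installed cooling capacity to the heat the IT gear actually produces."""
    return installed_cooling_kw / it_heat_load_kw

# Hypothetical room: 500 kW of IT load served by 1,300 kW of installed cooling.
ratio = overprovisioning_ratio(installed_cooling_kw=1300, it_heat_load_kw=500)
print(f"Cooling capacity is {ratio:.1f}x the heat load")  # 2.6x -- and there can still be hot spots
```

The point of the studies wasn't that the capacity was missing; it was that the air wasn't getting where the heat was.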

Of course, there were traditional, formalized ways of laying out a data center to take maximum advantage of the HVAC.

Know how they did it?

One data center manager would walk around the room with one hand in the air, like someone's "special" brother dowsing for water above the suspended ceiling, until he found a spot that was a little cooler than the others. Then he'd point down and say, "OK, let's put it here."

Very scientific.

Or they'd install the servers and chassis and call in the facilities people to hook up the cooling ducts to the server chassis -- generally putting the outflow suction valve right next to the cold-air inflow, so the HVAC system could remove the cold air as quickly as it could be pumped into the machine. Or they'd place the blade-server chassis over the one gutter through which cold air would flow, but wouldn't add any fans, so the cold air would flow right past the servers and out the other side, while the hot air stayed up high doing what it was supposed to do -- make the servers crash.

Facebook put its servers in a room with air conditioning pumped up through the floor -- the fans are key here; cold air likes to stay on the floor if it can -- but it wasn't cooling the machines enough, because the vents were scattered all over the place and the servers were clustered together, not taking full advantage of the vents.
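The arithmetic a raised-floor layout has to satisfy is simple: every rack needs enough cold air delivered at its inlet to carry its heat away. Here's a rough sketch using the standard sensible-heat rule of thumb (BTU/hr ≈ 1.08 × CFM × ΔT°F, with 1 watt ≈ 3.412 BTU/hr); the rack wattage and temperature rise are hypothetical:

```python
# Rough airflow estimate for one rack, using the standard sensible-heat rule of thumb.
# The 8 kW load and 20 F inlet-to-exhaust rise below are illustrative assumptions.

WATTS_TO_BTU_HR = 3.412
SENSIBLE_HEAT_FACTOR = 1.08  # standard-air approximation

def required_cfm(rack_watts: float, delta_t_f: float) -> float:
    """Cubic feet per minute of cold air needed to carry away a rack's heat."""
    return rack_watts * WATTS_TO_BTU_HR / (SENSIBLE_HEAT_FACTOR * delta_t_f)

print(f"{required_cfm(8000, 20):.0f} CFM")  # ~1,264 CFM per rack
```

That airflow has to reach the server inlets; vents that dump it into an empty aisle the cold air skates past -- which is roughly what Facebook fixed -- don't count.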
