June 03, 2011, 1:40 PM — When it comes to saving data center energy, few companies have the expertise of Google, which operates some of the largest data centers in the world. It is also experimenting with interesting ideas for cooling data centers more efficiently; for example, it will soon open a seawater-cooled data center in Finland.
On its blog this week, the company wrote: "Google is lucky enough to have the resources and experts to continually improve efficiency. But around 70% of the world's data centers are operated by companies that probably don't."
With that in mind, the company laid out what it calls some "simple design choices that you can apply to both small and large data centers to improve the efficiency of the facility. Saving energy will reduce the impact on the environment and also lead to significant financial savings."
Here are the top five recommended best practices from Google's data center experts:
1. Measure up: You can't manage what you don't measure, so characterize your data center's efficiency performance by measuring energy use. We use a ratio called PUE - Power Usage Effectiveness - to help us reduce energy used for non-computing functions, like cooling and power distribution. To use PUE effectively, it's important to measure often - we sample at least once per second. It's even more important to capture energy data over the entire year - seasonal weather variations have a notable effect on PUE.
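As an illustration of the PUE ratio described above, here is a minimal sketch in Python. PUE is defined as total facility energy divided by IT equipment energy; the monthly readings below are invented for the example, not real measurements:

```python
# PUE (Power Usage Effectiveness) = total facility energy / IT equipment energy.
# A PUE of 1.0 would mean every watt goes to computing; real values run higher
# because of cooling, power distribution, and other overheads.

def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Return the PUE ratio for one measurement interval."""
    if it_equipment_kwh <= 0:
        raise ValueError("IT equipment energy must be positive")
    return total_facility_kwh / it_equipment_kwh

# Hypothetical readings (kWh): (total facility, IT equipment).
# Capturing a full year smooths out seasonal swings in cooling load.
monthly_readings = [
    (130_000, 100_000),  # winter month: less cooling overhead
    (150_000, 100_000),  # summer month: more cooling overhead
]

total_kwh = sum(total for total, _ in monthly_readings)
it_kwh = sum(it for _, it in monthly_readings)
annual_pue = pue(total_kwh, it_kwh)
print(f"Annual PUE: {annual_pue:.2f}")  # 280000 / 200000 -> 1.40
```

Note that a single summer or winter sample would give a misleading figure (1.50 or 1.30 here), which is why the post stresses sampling often and aggregating over a full year.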
2. Manage air flow: Good air flow management is fundamental to efficient data center operation. Start by minimizing hot and cold air mixing with well-designed containment. Eliminate hot spots, and be sure to use blanking plates for any unpopulated slots in your racks. We've found a little analysis can pay big dividends. For example, thermal modeling using computational fluid dynamics (CFD) can help you quickly characterize and optimize air flow for your facility without many disruptive reorganizations of your computing room. Also be sure to size your cooling capacity to your expected IT equipment load, and if you are building extra capacity, be sure your cooling approach is energy proportional.
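To illustrate the "energy proportional" point above, here is a hedged sketch comparing a fixed-capacity cooling plant with one that scales with IT load. All of the figures (the 30% overhead ratio, the kW values) are invented for the example; real sizing depends on the facility:

```python
# Illustrative comparison of fixed vs. energy-proportional cooling.
# Assumption for this sketch: cooling draws roughly 0.3 kW per kW it is
# sized to handle -- not a real engineering figure.

def fixed_cooling_kw(rated_capacity_kw: float) -> float:
    """Fixed cooling draws power based on rated capacity, regardless of IT load."""
    return 0.3 * rated_capacity_kw

def proportional_cooling_kw(it_load_kw: float) -> float:
    """Energy-proportional cooling scales roughly with the heat to remove."""
    return 0.3 * it_load_kw

# A facility built with 500 kW of cooling headroom but only 200 kW of IT load:
it_load = 200.0
print(fixed_cooling_kw(500.0))          # 150.0 kW drawn even at partial load
print(proportional_cooling_kw(it_load)) # 60.0 kW, tracking actual load
```

The gap between the two numbers is exactly the kind of overhead that shows up as a worse PUE when extra capacity is built out without an energy-proportional cooling approach.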