Facebook makes its news all about the data center

Facebook has made a few data center announcements. So what’s the word?

Facebook’s making a lot of data center news these days.

I’m sure you’ve all heard the social media company will build a data center way up north, in Lulea, Sweden. Like many who opt to build data centers in cold climates, Facebook chose the site so it could use outside air to cool servers. The facility is slated to be a set of three 300,000-square-foot buildings, and it is Facebook’s first data center outside the U.S. It’s scheduled to be operational by 2012.

Also this week, Facebook talked up the Open Compute Project. First unveiled last April, the project is a partnership of Facebook, Intel, Advanced Micro Devices, Hewlett-Packard and Dell that aims to custom-develop efficient servers and data centers at the lowest possible cost, and then share those technologies as open source. You know, open source hardware.

So on to the update. Facebook and its partners announced the creation of a nonprofit foundation to run the project, as well as an advisory board to oversee things. It also unveiled the project’s guidelines, a summary of its mission, and details on how projects will be proposed, evaluated and supported. For the record, the board includes Frank Frankovsky, director of hardware design and supply chain for Facebook; Andreas Bechtolsheim, an industry luminary who is co-founder of the data center switching company Arista; Mark Roenigk from Rackspace; Jason Waxman from Intel; and Don Duet from Goldman Sachs.

Oh, and there was a sneak peek at some of the new member entrants, including ASUS, Dell, Mellanox, and Huawei; software suppliers like Red Hat, Cloudera and Future Facilities; enablers like DRT, Hyve (Synnex), Nebula, Baidu, and Silicon Mechanics; and consumers like Mozilla, Netflix, NTT Data, Tivit, and the Open Data Center Alliance. Also participating from an institutional perspective are organizations like Georgia Tech University, North Carolina State University, and CERN. More, according to the project, will be revealed soon.

In a blog on the project’s site, the partnership says “a great deal of work remains to be done,” such as building its community and staying focused on delivering tangible results.

So… what’s the feedback on this endeavor, launched a while back and now re-launching, as it were? I poked around on the Web to hear what folks were saying. Listen in:

Long-time tech journalist Barb Darrow writes in her blog here on Gigaom:

“The Open Compute Project (OCP) notion of standard, pre-specified server designs has to worry companies like HP, IBM and Oracle. They all sell lots of ‘value-added servers’ for which they charge additional costs. The project says third-party hardware companies can innovate atop its hardware stack, but for years these companies have promised standards support and interoperability while also claiming that their own components integrate better with their other components than with outside components.” (Well said, Barb!)

And in this Wired blog, Eric Smalley says:

“Facebook’s gain is twofold: by participating in the foundation they can more effectively influence their suppliers to meet their needs, and they can wrap themselves in the ‘open’ flag at a time when Google is increasingly perceived as the new Microsoft. There is some irony in this, given the closed nature of Facebook’s product.” (Again, well said.)

What I say is that standards efforts – no matter how critical and necessary – move slowly and all too often are weighed down by the very parties espousing them. That’s not to say industries shouldn’t push for standards, but it is a wearing effort. And while the OCP has the feel of an open source endeavor, I’m not sure the look is the same. But I do give credit to an initiative driven by an end user, even if it is a behemoth one. So I give this a 50-50 chance of making real progress. And being an optimist, I call that a glass half full.

What do you say?
