Virtualization, Real or Not?

There's a renewed push toward virtualization. And with good reason. IT departments can cut server headcount dramatically, reduce electrical consumption and air conditioning loads, simplify administration, deploy new services quickly, and greatly improve reliability. It's for real.

In a major 2007 study, Gartner reported that companies running more than 200 servers waste between $500,000 and $720,000 a year apiece. Nothing to sneeze at. The culprit is dedicated servers, each typically running a single application whose average daily processor utilization hovers at a mere 7 percent. Yikes. Peak-period processor loads rarely top 15 percent. Gartner concluded that datacenters carry excess computing capacity of an astonishing 20 percent to 50 percent. Double yikes.
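To see what those utilization figures mean for consolidation, here's a back-of-the-envelope sketch in Python. The 60 percent target utilization for a virtualized host is an illustrative assumption of mine, not a figure from the Gartner study.

```python
import math

# Rough consolidation math built on the article's utilization figures.
servers = 200        # dedicated servers (the study's company-size threshold)
avg_util = 0.07      # ~7% average daily processor utilization per server
peak_util = 0.15     # peak-period loads rarely top 15%
target_peak = 0.60   # assumed safe peak utilization on a consolidated host

# Size hosts by peak load so each keeps headroom even in the worst case.
apps_per_host = round(target_peak / peak_util)       # 4 apps per host
hosts_needed = math.ceil(servers / apps_per_host)    # -> 50 hosts

print(f"A dedicated box sits idle {1 - avg_util:.0%} of the time on average.")
print(f"{servers} dedicated servers could consolidate onto ~{hosts_needed} hosts.")
```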

Hewlett-Packard and IBM hold the largest market shares in server virtualization hardware, while Microsoft and EMC's VMware supply the hypervisors, the software layer used to create multiple virtual servers on a single physical server. Those virtual servers, in turn, can each run any network operating system, Windows Server or Linux, to name just two.
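For a concrete feel for what that management layer exposes, here's a minimal sketch using the open-source libvirt Python bindings. It's one hypervisor-management API among several (VMware and Microsoft ship their own tooling), and the connection URI assumes a KVM/QEMU host.

```python
import libvirt  # pip install libvirt-python

# Connect to the local hypervisor; 'qemu:///system' assumes a KVM/QEMU host.
conn = libvirt.open('qemu:///system')

# Enumerate every virtual server defined on this one physical box,
# whatever guest OS each one runs (Windows Server, Linux, and so on).
for dom in conn.listAllDomains():
    state, max_mem, mem, vcpus, cpu_time = dom.info()
    status = "running" if state == libvirt.VIR_DOMAIN_RUNNING else "stopped"
    print(f"{dom.name():<20} {status:<8} {vcpus} vCPU(s) {max_mem // 1024} MB")

conn.close()
```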

The reliability factor is often overlooked. With a dedicated server, maintenance often means bringing down the application it hosts. In a virtual server environment, you simply move the app from a virtual machine on one physical server to a virtual machine on a different physical box, as in the sketch below. The app never comes down, a huge boon to the business.
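That move is what hypervisor vendors call live migration. Here's a minimal sketch with the same libvirt bindings; the guest name 'web-app' and destination host 'serverB' are hypothetical stand-ins.

```python
import libvirt

# Source and destination hypervisors; the host names here are hypothetical.
src = libvirt.open('qemu:///system')
dst = libvirt.open('qemu+ssh://serverB/system')

dom = src.lookupByName('web-app')  # the guest to move (hypothetical name)

# VIR_MIGRATE_LIVE keeps the guest running while its memory is copied over,
# so the app never sees an outage.
new_dom = dom.migrate(dst, libvirt.VIR_MIGRATE_LIVE, None, None, 0)

print(f"'{new_dom.name()}' is now running on {dst.getHostname()}")

src.close()
dst.close()
```

In practice this assumes the two physical hosts share storage for the guest's disk images; without shared storage, the disks have to be copied across as well.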

Look for much more in the coming months as vendors pound virtualization solutions into the hearts and minds of users.
