Virtual servers still the only really successful sibling

Virtual PCs, phones, tablets and apps don't quite measure up to the original

Turns out this virtual server fad might just turn into something.

Market research company IDC predicts that, by 2014, more than 70 percent of all the workloads running on servers will run on a virtual server. Of all the servers shipped, 23 percent will be sold specifically as hosts for virtual servers -- 2.2 million physical machines that, at an average of 8.4 VMs apiece, will become 18.4 million virtual servers.

Guess that means your data center will clear out and there will be a lot of empty space where all the servers used to be.

Part of the study involved a survey of more than 400 organizations already using virtual servers, to see how they're using VMs and how that use will expand.

Based on that survey, IDC expects 2010 to be the first year in which more than half of all applications run on a virtual machine rather than a real one.

IDC, Gartner, Forrester and a host of other analyst companies have been touting server virtualization as a justification for believing in all kinds of other virtualization -- mainly the desktop variety.

Virtual servers have a uniquely high return on investment, though. Adding virtualization lets data centers reduce the number of machines, cut maintenance and overhead, and do it without substantial changes to existing applications, more new-server purchases than usual, or even much training for most of the staff. In many cases, the savings on hardware and electricity alone are enough to pay for the migration.

Desktop virtualization, on the other hand, often means buying new client hardware to replace ancient PCs, adding new security measures and training for end users, beefing up the data-center servers even more because that's where the virtual desktops run, and expanding the network to carry all that extra traffic.

It also means assigning one support person to each business unit solely to explain, to people who still mistake their monitor for "the computer," where their desktops actually live and why they have to relaunch a fake PC every time they log on.

Even getting to that point requires a lot of training for the IT staff; more storage; better data and user-profile management; performance optimization for the network; backup, recovery, archiving and compliance; and the need to buy and manage devices that could be PCs or could be dumb terminals.

Virtual desktops are more flexible than physical PCs, but they're not a money-saving proposition.

Going one step further to virtualize tablets or smartphones means making uncomfortable partnerships with hardware vendors, forcing hypervisors, profile management and security onto a form factor that wants only to be Clever, Beautiful and Free (TM Apple Computer and Steve Jobs, license available only with fully licensed copy of Steve Jobs doll).

Virtualization, and eventually cloud computing, will, as IDC predicts in another end-of-year encyclical, change the world of computing.

A lot of it will be kind of a pain in the butt and only marginally worthwhile for a lot of people, though.

Maybe we should just stick with virtual servers. They seem to be doing just fine.

Kevin Fogarty writes about enterprise IT for ITworld. Follow him on Twitter @KevinFogarty.
