Most U.S. and global corporations are forging ahead with plans to virtualize their servers, applications, desktops and smartphones, then build cloud-computing infrastructures on top of all that virtual landscape – despite experience with both technologies that has largely failed to meet even the jaded expectations of top IT executives and project managers, according to a survey released today by Symantec Corp.
Symantec's results are based on responses from more than 3,700 people in 35 countries – making it one of the comparatively few surveys done in the IT industry every year large enough to carry any statistical weight at all.
It shows, from a certain point of view, that the people making decisions about virtual technologies at major corporations are either unrealistic optimists, unbelievably forgiving of disappointment, or idiots.
The study gauged each technology by asking respondents to compare its performance after implementation with the performance goals they set before implementing it.
It asked respondents about server, storage and desktop virtualization, private storage-as-a-service, and hybrid or private clouds.
Performance of every one of them fell below expectations.
Virtual servers were the least disappointing, with actual performance falling only 4 percent below predetermined performance goals.
Private storage as a service was most disappointing – underperforming by 37 percent.
Storage virtualization underperformed by 33 percent.
Desktop virtualization technologies fell 26 percent below their performance goals.
Private and hybrid cloud computing performed 32 percent below par.
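To make the survey's arithmetic concrete, here is a minimal sketch of how a "percent below goal" gap is computed and how the reported figures rank. The percentages are the ones cited above; the dictionary layout, helper function and the 100/63 example values are hypothetical, for illustration only:

```python
# Survey-reported gaps between pre-implementation goals and realized
# performance (percentages from the article; structure is illustrative).
shortfalls = {
    "server virtualization": 4,
    "desktop virtualization": 26,
    "private/hybrid cloud": 32,
    "storage virtualization": 33,
    "private storage-as-a-service": 37,
}

def shortfall_pct(goal: float, actual: float) -> float:
    """Percent by which actual performance falls below the stated goal."""
    return round((goal - actual) / goal * 100, 1)

# A hypothetical project that aimed for 100 "units" of performance but
# delivered 63 underperforms by 37 percent -- the gap the survey reports
# for private storage-as-a-service.
assert shortfall_pct(100, 63) == 37.0

# Rank the surveyed technologies from least to most disappointing.
for tech, gap in sorted(shortfalls.items(), key=lambda kv: kv[1]):
    print(f"{tech}: {gap}% below goal")
```

Sorted this way, server virtualization lands at the top of the list and storage-as-a-service at the bottom, which is exactly the spread the next few paragraphs try to explain.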
All of which, in a rational world where hard cost/benefit calculations drive the decisions of decision-makers unswayed by hype and bushwa, would normally mean at least three of those four underperforming categories of technology would drop off the IT budgets for next year.
Technology decisions aren't made that way, according to Gartner's Hype Cycle reports, which mark points in the research-buy-implement-repair-discard cycle called the "Peak of Inflated Expectations" and the "Trough of Disillusionment."
People buy technology hoping it will do wonderful things, while realizing it may not live up to its billing without a lot of help from them – and that they may eventually have to lower their expectations to the functions a product can actually deliver rather than those users and vendors hoped it could, according to John McGee, VP of product marketing for Symantec.
It's all part of the process by which markets grow out of their immaturity and begin delivering technology that's functional as well as buzzwordy. That process continues until successful technologies become reliable enough to be used in mission-critical applications – and too boring to distract anyone from the exciting, non-functional new technology that just came along.
It's a process split between developing new functions for the technology and changing IT or business organizations enough to take advantage of strengths inherent in the technology itself, McGee said.
Virtual servers almost hit expectations because they're far more of a known quantity than the other technologies, have a much longer track record and are much more widely implemented and understood, McGee said.
Storage-as-a-service fell so far short of expectations partly because so few IT execs know much about it or have tried to make it work, and partly because it has been on the market for such a short time.
Without much experience with a new technology, it's easy to imagine that, in the very near future, it will do things far more effectively than the evidence shows it currently can.
That's both a potentially destructive flavor of denial and the kind of optimism that keeps us getting up in the morning despite the virtual certainty that tomorrow won't be dramatically better than today.
It's also a reflection of the constant lack of satisfaction that keeps most people, especially technically minded ones, trying to build or tweak or configure things to improve the way they do their jobs, drive to and from work and clean up the house afterward.
That's why technology eventually catches up with our expectations, but only yesterday's expectations. Today's expectations have already advanced.
For example, right now Symantec's study shows hybrid/private cloud products take too long to provision new resources (39 percent), don't scale far enough (34 percent) and offer too little security (29 percent) compared with the goals end users set for them.
A year or two from now, provisioning and scalability might improve enough to almost meet the expectations cited in the study.
But those were 2010 expectations. Two years from now they'll be 2012 expectations, which will include flying cars and robot butlers. Hybrid clouds won't have those things by then, so they'll still fall short of expectations and will still disappoint the people paying for them.
That doesn't mean they stink. They just don't evolve quite as quickly as our expectations for them.
After decades of building or using advanced IT systems, you'd think most of us would realize by now – rather than writing it off as some kind of Murphy's Law effect – that the system you really want is the one that won't be ready for another six to 12 months.
So paying top dollar for bleeding-edge technology does seem idiotic if what we want is functionality two years beyond the bleeding edge.
Unfortunately, there's no way to get to our future disappointment without working our way through this one first.
Which, frankly, is a little disappointing. But I'm learning to deal with it.