Cloud and virtual computing are changing the structure, assumptions, costs, even organizational plans and staffing decisions of corporate IT, but no one seems to know how to pay for them.
At least, they don't seem to know how to account for them.
Sixty-one percent of companies have processes in place to track the cost and value of both cloud-computing and virtualized services, according to a survey released last week by researchers at the Worldwide Executive Council, which concluded the jury is still out on cost justification of the cloud.
Not surprising, considering the responses came from senior IT execs at companies with more than $50 million in annual revenue. Although most can track virtual and cloud costs, 48 percent of respondents give their bosses spending reports on IT that don't break cloud or virtual computing out as separate items.
Twenty-one percent don't report any spending or ROI figures on cloud or virtual computing specifically.
Part of that, you'd think, is the relative novelty of cloud services and virtual-server infrastructures, but neither is exactly unfamiliar.
IDC and Gartner studies put the number of companies with some virtualized servers at close to 80 percent. Seventy-two percent of companies are using cloud-based services to some extent, according to a survey of CFOs released this week by financial analysis firm BDO USA.
According to WEC's survey, 10 percent of respondents said they don't track the total cost of virtual or cloud services at all, and another 25 percent said they do track it, but do so badly.
Even so, 68 percent of them think that tracking will be "important" this year, while 20 percent think it will be "very critical."
I'm hoping the 20 percent who think tracking virtual IT costs will be "very critical" aren't among the 25 percent who admit they do it badly.
It would make IT look even more out of touch and foggy about business priorities than business-unit execs often accuse it of being.