March 18, 2012, 7:15 AM — The perception of mainframe technology as outmoded or inefficient is wildly inaccurate in a number of important ways, according to IBM's chief architect for cloud computing, Frank DeGilio.
In a presentation at the Share user conference Wednesday entitled "Hex, Lies and Videoblogs," the IBM mainframe expert argued that the conventional wisdom on big iron is plagued by several myths.
• The price is wrong
"When people talk about the cost of computing, what they're generally thinking about is the cost of the hardware and the cost of the software," DeGilio says. However, this ignores any number of other major expenses involved. Particularly for large-scale infrastructures, management complexity and personnel costs are often critically important parts of a system's final price tag.
As infrastructures expand, the number of people required to run a distributed system tends to remain significantly higher than that needed for a comparable mainframe-based alternative, he asserted.
• Ancient languages and old frameworks
According to the IBM expert, the idea that mainframes deal only in outdated programming languages such as COBOL and assembler is also a myth. Modern platforms and open standards such as J2EE and Linux are widely supported, though COBOL remains important.
What's more, he added, there's nothing outdated about the way mainframes handle workload management. In fact, their ability to fine-tune resource allocation based on application need is far more granular and sophisticated than that of most distributed systems.
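The goal-based approach behind that claim can be illustrated with a toy sketch. This is not IBM's actual workload-management code, and all names here are hypothetical; it only shows the core idea of z/OS-style goal-based management: measure each workload against its service goal and shift capacity toward work that is falling behind.

```python
# Toy sketch of goal-based workload balancing (hypothetical names,
# not IBM's WLM implementation). Each workload has a response-time
# goal; capacity shifts toward whichever workload misses its goal
# by the widest margin, rather than being statically partitioned.
from dataclasses import dataclass

@dataclass
class Workload:
    name: str
    goal_ms: float      # target response time
    observed_ms: float  # measured response time
    share: float        # current fraction of CPU capacity

def rebalance(workloads, step=0.05):
    """Move a small slice of CPU from the workload most ahead of its
    goal to the one furthest behind, then renormalize the shares."""
    # Performance index: observed / goal. Above 1.0 means the
    # workload is missing its goal; below 1.0 means it has headroom.
    pi = {w.name: w.observed_ms / w.goal_ms for w in workloads}
    worst = max(workloads, key=lambda w: pi[w.name])
    best = min(workloads, key=lambda w: pi[w.name])
    if pi[worst.name] > 1.0 and best is not worst:
        delta = min(step, best.share)
        best.share -= delta
        worst.share += delta
    total = sum(w.share for w in workloads)
    for w in workloads:
        w.share /= total
    return workloads

batch = Workload("batch", goal_ms=5000, observed_ms=2000, share=0.5)
oltp = Workload("oltp", goal_ms=100, observed_ms=180, share=0.5)
rebalance([batch, oltp])
print(oltp.share)  # OLTP gains capacity because it missed its goal
```

Run repeatedly against fresh measurements, this kind of feedback loop is what lets a single machine keep many competing workloads near their goals without manual repartitioning, which is the granularity DeGilio is contrasting with typical distributed setups.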
• Break, don't bend
That same ability to balance workloads, DeGilio said, gives the lie to the idea that mainframes are inflexible. Moreover, the very concept of capacity upgrade on demand was "pioneered" by the mainframe, he noted.
• Slow and steady wins nothing