August 25, 2004, 2:24 PM — IT professionals dream of robust networking environments capable of processing weekly payroll, monthly commissions, and end-of-year accounting (A/R, A/P, and General Ledger close-outs) while still maintaining their daily ERP, CRM, and e-mail systems. Yet most servers rarely approach maximum processing power, even under extreme conditions; on a typical workday, most servers (particularly Windows servers) rarely surpass a 10% utilization rate.
Fortunately for IT professionals, virtualization is making the dream a reality.
Although most companies are not yet taking advantage of virtual server expansion and contraction capabilities, it is possible to "borrow" CPU and memory capacity from other servers that are not being heavily taxed. When it is no longer needed, that borrowed capacity can be returned to its original owners in its original state. Imagine spoofing servers into thinking they have unlimited CPU and memory capacity, so that they never run into processing or workload thresholds.
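The borrow-and-return bookkeeping described above can be sketched in a few lines. This is a toy illustration only, not any real virtualization API: the `Server`, `borrow`, and `release` names are hypothetical, and "capacity" here is just an abstract unit. The point is simply that a loan is tracked and fully reversed, leaving both machines in their original state.

```python
# Toy model of "borrowing" spare capacity between servers.
# All names and units here are illustrative, not a real virtualization API.

class Server:
    def __init__(self, name, cpu_capacity):
        self.name = name
        self.base_capacity = cpu_capacity  # capacity the server "owns"
        self.capacity = cpu_capacity       # capacity it can currently use
        self.lent = 0                      # capacity currently on loan

    def lend(self, amount):
        """Lend spare capacity to a busier server; grant only what is free."""
        spare = self.capacity - self.lent
        granted = min(amount, spare)
        self.lent += granted
        return granted

    def repay(self, amount):
        """Take back capacity previously lent out."""
        self.lent -= amount


def borrow(borrower, lender, amount):
    """Move up to `amount` units of capacity from lender to borrower."""
    granted = lender.lend(amount)
    borrower.capacity += granted
    return granted


def release(borrower, lender, amount):
    """Return borrowed capacity, restoring both servers' original state."""
    borrower.capacity -= amount
    lender.repay(amount)
```

For example, a payroll server could borrow four units from a lightly loaded mail server for the weekly run, then release them afterward, leaving both servers exactly as they started.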
Engineers at Evolving Solutions, Inc., a data disaster recovery, storage architecture, and business continuity solutions provider, predict that by the end of 2004 and into early 2005 servers that auto-monitor and auto-adjust for data-on-demand requirements will become common in larger IT shops. Servers that are able to auto-adjust to continuously changing CPU and memory needs will become as widely accepted as the current "cascading servers" methodology. More than simply a foray into virtualization, this is a complete leap into autonomic computing.
Local server virtualization
The processing power needed for multiple employees to open large files on a single server can push CPUs and memory past pre-defined thresholds, typically set at 70%-80%. When utilization exceeds those thresholds, the shortfall in processing power drastically slows data and document retrieval across your LANs and WANs. This often results in hard-dollar costs (replacing smaller servers with larger ones, or clustering existing servers) and soft-dollar costs (mainly lost employee productivity). Scale this scenario up to an online transaction processing (OLTP) environment and you can imagine how rapidly costs would mount.
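The threshold check at the heart of this kind of monitoring is simple to express. The following is a minimal sketch, not a production monitor: how utilization samples are collected is left abstract (readings are assumed to be fractions between 0 and 1), and the 80% ceiling mirrors the 70%-80% range mentioned above.

```python
# Minimal sketch of a utilization-threshold check.
# How samples are gathered is out of scope; readings are fractions 0.0-1.0.

THRESHOLD = 0.80  # 80% utilization ceiling, per the 70%-80% range above

def over_threshold(samples, threshold=THRESHOLD):
    """Return the indices of utilization samples that exceed the threshold."""
    return [i for i, u in enumerate(samples) if u > threshold]
```

An auto-adjusting server would react to a non-empty result, for instance by borrowing capacity, rather than merely logging the breach.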
Take the example of Local Books, a small fictional company that sells books by local authors from its store on Main Street. The first day it launched its online shopping site, it received 30,000 hits and hundreds of attempted transactions. Because the company had not effectively planned for this activity, its OLTP and back-end database server came under significant load.