10 tips for boosting network performance

Accelerate the pace of your business with these 10 tech tips tailored for speed.


Virtualization has to be just about the coolest thing to happen to the enterprise datacenter in a long time. It offers a multitude of manageability and monitoring benefits, scales cleanly, makes disaster recovery simpler than ever before, and dramatically decreases the number of physical servers you need chewing up power and spewing out heat.

[ For more on how best to implement virtualization, download InfoWorld's Server Virtualization Deep Dive and InfoWorld's Storage Virtualization Deep Dive. ]

If used incorrectly, however, virtualization technology can shoot you in the foot. Remember, virtualization isn't magic. It can't create CPU, memory, or disk IOPS out of thin air.

As you grow your virtualization infrastructure, it should be fairly easy to keep tabs on CPU and memory performance. Any virtualization hypervisor worth its salt will give you visibility into the headroom you have to work with. Disk performance, on the other hand, is tougher to track and more likely to get you into trouble as you push virtualization to its limits.
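If you want hard numbers rather than a gut feeling, the raw counters are usually exposed by the operating system. Here's a minimal sketch, assuming a Linux host, that samples /proc/diskstats to estimate the transactional IOPS a workload is actually generating; the device name and the one-second sampling window are placeholders you'd swap for your own:

```python
#!/usr/bin/env python3
"""Sample /proc/diskstats twice and report IOPS for one device.

A minimal Linux-only sketch; the device name ('sda') and the one-second
sampling window are assumptions -- adjust both for your environment.
"""
import time

DEVICE = "sda"    # hypothetical device backing your VM storage
INTERVAL = 1.0    # seconds between samples

def read_io_counts(device):
    """Return (reads completed, writes completed) from /proc/diskstats."""
    with open("/proc/diskstats") as f:
        for line in f:
            fields = line.split()
            if fields[2] == device:
                # fields[3] = reads completed, fields[7] = writes completed
                return int(fields[3]), int(fields[7])
    raise ValueError(f"device {device!r} not found in /proc/diskstats")

r1, w1 = read_io_counts(DEVICE)
time.sleep(INTERVAL)
r2, w2 = read_io_counts(DEVICE)

iops = ((r2 - r1) + (w2 - w1)) / INTERVAL
print(f"{DEVICE}: ~{iops:.0f} IOPS over the last {INTERVAL:.0f}s")
```

Run it on a representative host during peak hours, and you'll have a defensible per-server IOPS figure to feed into your consolidation planning.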

By way of example, let's say you have a hundred physical servers you'd like to virtualize. They're all essentially idling on three-year-old hardware, and each requires 1GHz of CPU, 1GB of memory, and 250 IOPS of transactional disk performance.

You might imagine that an eight-socket, six-core X5650 server with 128GB of RAM would be able to run this load comfortably. After all, you have more than 20% of CPU and memory headroom, right? Sure, but bear in mind that those hundred servers together demand 25,000 IOPS, which means you'll need the equivalent of about 140 15,000-rpm Fibre Channel or SAS disks attached to that server to deliver the transactional performance the load requires. It's not just about compute performance.
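To make the arithmetic explicit, here's the back-of-the-envelope calculation; the roughly 180 IOPS-per-spindle figure for a 15,000-rpm disk is a common rule of thumb, not a guarantee:

```python
# Back-of-the-envelope sizing for the consolidation example above.
# IOPS_PER_15K_DISK is a rule-of-thumb assumption, not a measured value.
NUM_SERVERS = 100
CPU_GHZ_EACH = 1.0        # per-server CPU demand
RAM_GB_EACH = 1.0         # per-server memory demand
IOPS_EACH = 250           # per-server transactional disk demand
IOPS_PER_15K_DISK = 180   # typical 15,000-rpm FC/SAS spindle

total_iops = NUM_SERVERS * IOPS_EACH           # 25,000 IOPS
disks_needed = total_iops / IOPS_PER_15K_DISK  # ~139 spindles

print(f"CPU demand:  {NUM_SERVERS * CPU_GHZ_EACH:.0f} GHz")
print(f"RAM demand:  {NUM_SERVERS * RAM_GB_EACH:.0f} GB")
print(f"Disk demand: {total_iops} IOPS -> ~{disks_needed:.0f} x 15K disks")
```

The CPU and memory lines look comfortable against the big host's capacity; the disk line is where the consolidation plan quietly falls apart.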

Performance tip No. 9: To dedupe or not to dedupe

As your data grows, it's natural to seek out tools that curb the use of expensive storage capacity. One of the best examples is data deduplication. Whether you're deduplicating in your backup and archiving tier or directly on primary storage, you can derive massive capacity benefits by weeding out duplicate data and storing only what is unique.
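To see why deduplication pays off, consider a toy version of the technique: split data into fixed-size blocks, hash each block, and store only the blocks you haven't seen before. This sketch is illustrative only; production engines add variable-length chunking, persistent hash indexes, compression, and verification:

```python
import hashlib

def dedupe(data, chunk_size=4096):
    """Toy fixed-block deduplication: split data into chunks, keep one
    copy of each unique chunk, and record the ordered list of chunk
    hashes needed to reassemble the original data."""
    store = {}      # hash -> unique chunk bytes
    recipe = []     # ordered hashes that reconstruct the data
    for i in range(0, len(data), chunk_size):
        chunk = data[i:i + chunk_size]
        digest = hashlib.sha256(chunk).hexdigest()
        store.setdefault(digest, chunk)   # keep only first copy
        recipe.append(digest)
    return store, recipe

# Highly repetitive data dedupes dramatically: 200 chunks, 2 unique.
data = b"A" * 4096 * 100 + b"B" * 4096 * 100
store, recipe = dedupe(data)
print(f"Logical size:  {len(data)} bytes in {len(recipe)} chunks")
print(f"Physical size: {sum(len(c) for c in store.values())} bytes "
      f"({len(store)} unique chunks)")
```

Real-world data is rarely this friendly, but the mechanism is the same: the more redundancy in your data, the fewer unique blocks you actually have to store.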
