5 ways to cut your storage footprint

By Robert L. Scheier, Computerworld

With the economy still shaky and the need for storage exploding, almost every storage vendor claims it can reduce the amount of data you must store. Trimming your data footprint not only cuts costs for hardware, software, power and data center space, but also eases the strain on networks and backup windows.

But how do you know which technique to use? First you have to understand how your business uses data and determine when the cost savings of data reduction are worth the resulting drop in performance.

The technique that's best for you depends not so much on the industry you're in as on the type of data you store. Deduplication, for example, often doesn't deliver significant savings for X-rays, engineering test data, video or music, but it can dramatically reduce the cost of backing up virtual machines used as servers. Here are five techniques to help reduce your stored-data volume.

1. Deduplication

Deduplication -- the process of finding and eliminating duplicate pieces of data stored in different data sets -- can reduce storage needs by up to 90%. For example, through deduplication you could ensure that you store only one copy of an attachment that was sent to hundreds of employees. Deduplication has become almost a requirement for backup, archiving and just about any form of secondary storage where speed of access is less important than reducing the data footprint.
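To make the idea concrete, here is a minimal Python sketch of file-level deduplication. The FileDedupStore class and its in-memory dictionaries are illustrative stand-ins for a real product's content index and disk pool, not any vendor's actual implementation.

import hashlib

class FileDedupStore:
    """Stores each unique file body once; duplicate files become references."""

    def __init__(self):
        self.blobs = {}    # content fingerprint -> file bytes, stored once
        self.catalog = {}  # file name -> fingerprint, many names per blob

    def put(self, name, data):
        """Save a file; return True if its content was already stored."""
        fp = hashlib.sha256(data).hexdigest()
        already_stored = fp in self.blobs
        if not already_stored:
            self.blobs[fp] = data
        self.catalog[name] = fp
        return already_stored

    def get(self, name):
        return self.blobs[self.catalog[name]]

# The same attachment mailed to three employees occupies storage once.
store = FileDedupStore()
attachment = b"quarterly-report.pdf contents..." * 100
for user in ("alice", "bob", "carol"):
    store.put("/mail/" + user + "/report.pdf", attachment)
print(len(store.catalog), "catalog entries;", len(store.blobs), "copy on disk")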

Chris Watkis, IT director at health care advertising and marketing firm Grey Healthcare Group, is seeing reduction ratios as high as 72:1 for backup data, thanks to a deduplication process that uses FalconStor Software Inc.'s Virtual Tape Library storage appliance. And cloud storage services vendor i365 is achieving 30:1 to 50:1 reductions in data on a mixed workload of Microsoft Exchange, SharePoint, SQL Server and VMware virtual machine files, says Chief Technology Officer David Allen.

Data can be deduped at the file or block level, with different products able to examine blocks of varying sizes. In most cases, the finer-grained the comparison a system can make, the greater the space savings. But fine-grained deduplication means more fingerprints to compute and look up, which can take longer and slow data access.
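The toy Python function below makes that trade-off visible. Fixed-size chunking and SHA-256 fingerprints are simplifying assumptions (real products often use variable-size blocks and their own hash schemes), but the effect of block size on the reduction ratio comes through.

import hashlib

def dedup_ratio(data, block_size):
    """Chunk data into fixed-size blocks; return the logical:physical ratio."""
    unique = set()
    total = 0
    for i in range(0, len(data), block_size):
        unique.add(hashlib.sha256(data[i:i + block_size]).hexdigest())
        total += 1
    return total / len(unique)

# Two near-identical documents: a whole-file hash would find no
# duplicates, but block-level comparison reclaims the shared regions.
doc_a = b"A" * 4096 + b"B" * 4096 + b"C" * 4096
doc_b = b"A" * 4096 + b"X" * 4096 + b"C" * 4096   # one region changed
data = doc_a + doc_b

for size in (512, 4096):
    print("%d-byte blocks -> %.1f:1" % (size, dedup_ratio(data, size)))
# 512-byte blocks -> 12.0:1  (finer grain finds more duplicates)
# 4096-byte blocks -> 1.5:1  (coarser grain, fewer matches, less hashing)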

Deduplication can be done preprocessing, or inline, as the data is being written to its target, or postprocessing, after the data has been stored on its target. Postprocessing is best if it's critical to meet backup windows with fast data movement, says Greg Schulz, senior analyst at The Server and StorageIO Group. But consider preprocessing if you have "time to burn" and need to reduce costs, he says.
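A toy sketch of the two modes follows; a plain dictionary and list stand in for a real appliance's block index and storage target.

import hashlib

def fingerprint(block):
    return hashlib.sha256(block).hexdigest()

def inline_write(block, index, target):
    """Preprocessing/inline: check for a duplicate BEFORE writing, so no
    redundant block ever lands on the target -- at the cost of a hash
    lookup in the write path, which slows ingest."""
    fp = fingerprint(block)
    if fp not in index:
        index[fp] = len(target)
        target.append(block)

def postprocess_sweep(target):
    """Postprocessing: blocks were written at full speed earlier (good for
    tight backup windows); this later sweep collapses the duplicates."""
    seen = set()
    deduped = []
    for block in target:
        fp = fingerprint(block)
        if fp not in seen:
            seen.add(fp)
            deduped.append(block)
    return deduped

blocks = [b"alpha", b"beta", b"alpha"]
index, target = {}, []
for b in blocks:
    inline_write(b, index, target)
print(len(target))                     # 2: the duplicate never hit disk
print(len(postprocess_sweep(blocks)))  # 2: reclaimed after the fact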

