In other words, don’t be “reactive” with your disk and dedupe purchases. Most projects fail by taking a temporary band-aid approach, fixing only one part of the storage problem rather than addressing the whole. Instead, consider how to get the most out of a dedupe purchase to address challenges both today and in the future. How much data will I have in five years? Could I eliminate all trucking of tapes? Can I consolidate my operations from multiple sites to a few core datacenters? By asking the right questions, users will be better positioned to think holistically about dedupe needs across primary, secondary and archive storage.
2. Consider the entire architecture to achieve the best results. Deduplication isn’t about buying the highest-performance (and highest-cost) solution on the market. Not all backups have the same requirements, so why should they all be deduped the same way? Users should look for ways to “right size” and “right price” the dedupe strategy. This can be accomplished by leveraging source-side dedupe and archiving to reduce data at the point of origin, which improves efficiency and lowers the cost of deduplication. You can then reserve high-performance hardware for tier-one workloads and use replicated disk solutions to drive down the cost of high-end disaster recovery.
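To make the source-side idea concrete, here is a minimal sketch of how block-level deduplication typically works: data is split into blocks, each block is fingerprinted with a hash, and only unique blocks are stored. This is an illustration only, not any vendor’s implementation; the fixed 4 KB block size and SHA-256 fingerprints are assumptions chosen for the example.

```python
import hashlib

def dedupe_blocks(data: bytes, block_size: int = 4096):
    """Split data into fixed-size blocks, store each unique block once,
    and return (block store, ordered recipe of hashes)."""
    store = {}   # fingerprint -> block bytes (unique blocks only)
    recipe = []  # ordered fingerprints needed to rebuild the stream
    for i in range(0, len(data), block_size):
        block = data[i:i + block_size]
        digest = hashlib.sha256(block).hexdigest()
        store.setdefault(digest, block)  # keep first copy, skip duplicates
        recipe.append(digest)
    return store, recipe

def restore(store, recipe):
    """Reassemble the original stream from the stored unique blocks."""
    return b"".join(store[h] for h in recipe)

# Highly redundant input, e.g. repeated backups of the same file:
data = b"A" * 4096 * 10 + b"B" * 4096 * 10
store, recipe = dedupe_blocks(data)
print(len(data), sum(len(b) for b in store.values()))  # 81920 vs 8192
assert restore(store, recipe) == data
```

Sending only the fingerprints (and any blocks the target has not yet seen) over the wire is what lets source-side dedupe cut both storage and bandwidth before the data ever reaches the backup target.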
3. Archiving may be the most important tool in the dedupe arsenal. Primary storage is the fastest-growing area where data redundancy problems exist (email, file systems, etc.). As end users share more information than ever before, the need grows for a common platform for long-term retention and intelligent information management. With software-based archiving, organizations can achieve an 80 percent reduction in primary application storage. With less data to protect, backups run faster, application performance improves, and more options open up for server consolidation.
For most organizations, the best solution is to move dedupe closer to the information sources by integrating it into the backup process, whether in software or hardware. The result is that organizations can stop buying storage and reuse what they have, while also recovering data faster, whether it is an individual file or an entire server. This approach also allows organizations to increase their return on virtualization by consolidating storage as well as servers. By working with vendors that are ‘open’ and provide a choice of deduplication approaches, organizations can leverage multiple deduplication technologies and deploy deduplication everywhere.
Matt Kixmoeller is Vice President, Enterprise Product Management, Information Management Group, Symantec Corp.