Getting the most out of flash storage

By Gary Orenstein, vice president of product and technical marketing, Fusion-io, Network World |  Storage, flash storage

This vendor-written tech primer has been edited by Network World to eliminate product promotion, but readers should note it will likely favor the submitter's approach.

Over the past few years, mainstream enterprises have been turning to NAND flash storage to boost speed and reduce latency, but some vendors still ship products that prevent customers from achieving flash's full potential.

Solid-state storage offerings that deploy NAND flash the same way they would a traditional disk system place data far from the CPU, often behind an outdated storage controller. No matter how fast the NAND is, this architecture adds latency, so the application sees only modest improvements in actual throughput.

Let's take a step back and look at the pain of disk storage, the pitfalls of applying conventional architectures to flash, and how to achieve the full potential of NAND flash.

IN DEPTH: Flash storage in post-PC devices advances

The pain

The speed limitations of disk drives compared to CPUs are well known. Less well known are the acrobatics administrators must go through to configure drives for performance. These include buying expensive Fibre Channel disk drives and configuring them in complex schemes such as short-stroking, which uses only a portion of each drive platter to boost performance. That means adding stacks of disks with largely unused capacity that administrators must monitor for failure, not to mention the cost of power, cooling and space to house the systems.
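A rough back-of-the-envelope calculation shows why this gets expensive. The sketch below uses assumed, illustrative figures for per-drive IOPS, drive capacity and the short-stroked fraction (they are not specifications for any particular product) to estimate how many short-stroked Fibre Channel drives a random-I/O workload might require and how much capacity is left stranded.

```python
# Back-of-the-envelope sketch of the short-stroking math described above.
# All figures below (IOPS per drive, capacity, usable fraction) are
# illustrative assumptions, not vendor specifications.

TARGET_IOPS = 50_000          # random IOPS the application needs (assumed)
IOPS_PER_FC_DRIVE = 180       # typical 15K RPM Fibre Channel drive (approximate)
DRIVE_CAPACITY_GB = 600       # raw capacity per drive (assumed)
SHORT_STROKE_FRACTION = 0.2   # only the outer 20% of each platter is used (assumed)

drives_needed = -(-TARGET_IOPS // IOPS_PER_FC_DRIVE)   # ceiling division
usable_gb = drives_needed * DRIVE_CAPACITY_GB * SHORT_STROKE_FRACTION
stranded_gb = drives_needed * DRIVE_CAPACITY_GB * (1 - SHORT_STROKE_FRACTION)

print(f"Drives required:   {drives_needed}")
print(f"Usable capacity:   {usable_gb:,.0f} GB")
print(f"Stranded capacity: {stranded_gb:,.0f} GB")
```

With these assumptions the workload needs hundreds of spindles, and most of their raw capacity sits idle purely to deliver the required performance.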

ANALYSIS: EMC: Flash could spell doom for Fibre Channel

But even with these acrobatics, disks often struggle to meet required performance levels because external disk storage systems sit so far from the CPU, as shown in Figure 1. While CPUs and memory operate in microseconds, access to external disk-based systems happens in milliseconds, a thousandfold difference. Even when a disk system can pull data quickly, moving that data to and from the CPU incurs a long latency delay, leaving CPUs waiting on data much of the time. This hurts application and database performance.
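A quick numeric sketch makes that gap concrete. The Python below uses assumed, order-of-magnitude access times (not measurements of any particular system) to show how many serialized I/O requests a single thread that waits on each request can complete per second at each latency tier.

```python
# Rough illustration of the latency gap described above. The access times
# are assumed order-of-magnitude figures, not measurements of any product.

MEMORY_ACCESS_S = 1e-6        # ~1 microsecond (assumed, per the article's framing)
FLASH_NEAR_CPU_S = 50e-6      # ~50 microseconds for flash close to the CPU (assumed)
DISK_ACCESS_S = 5e-3          # ~5 milliseconds for an external disk array (assumed)

def serial_iops(latency_s: float) -> float:
    """I/O operations per second for one thread that waits on each request."""
    return 1.0 / latency_s

for name, latency in [("External disk array", DISK_ACCESS_S),
                      ("Flash near the CPU", FLASH_NEAR_CPU_S),
                      ("Memory", MEMORY_ACCESS_S)]:
    print(f"{name:20s} {latency * 1e6:10,.0f} us/access "
          f"-> {serial_iops(latency):12,.0f} serial ops/sec")
```

Under these assumptions, a thread stalled on a millisecond-class disk access completes only a few hundred serialized requests per second, while the same thread could complete tens of thousands against low-microsecond media, which is the waiting-CPU problem described above.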

The pitfall

