Redshift will fit in with AWS's other data storage products, the most popular of which is its Simple Storage Service (S3). AWS launched Glacier earlier this year as a long-term, inexpensive storage option. AWS also already offers DynamoDB, a highly scalable, fault-tolerant managed NoSQL database backed by solid-state drives. Rounding out the lineup is Elastic MapReduce, a Hadoop-based analytics platform that lets users spin up scalable Hadoop clusters using Amazon's Elastic Compute Cloud (EC2) and S3. It's used for web indexing, data warehousing, machine learning, data mining and log file analysis, among other tasks, Amazon says.
There are a variety of third-party tools already on the market that provide big data analytics. AWS recently launched a "Big Data" section on its Marketplace, a catalog of applications designed to run on Amazon's cloud. SAP's HANA One, Sumo Logic, Metamarkets and Splunk Storm are some of the big data analysis apps already on Amazon's Marketplace. AWS also already partners with companies like Think Big Analytics, MarketShare and MapR for its Elastic MapReduce service.
Gartner's Adrian says he expects businesses of all sizes to aggressively investigate potential use cases for Redshift, but traditional concerns about what sort of data businesses are comfortable storing in the cloud versus on their own premises will persist.
In addition to Redshift, the other major announcement from the first day of AWS's user conference was another price reduction, this time for S3. This follows a drop in prices for EC2 instance types just a few weeks ago. S3 storage prices dropped by roughly 25% on average across the board. The first 1TB of data stored in Amazon's cloud dropped in price from $0.125 to $0.095 per GB, a 24% drop. Storage of up to 4,000TB dropped in price from $0.08 to $0.060 per GB, a 25% drop.
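The quoted tier discounts can be sanity-checked with a short calculation; the per-GB rates below are the ones cited above:

```python
# S3 per-GB monthly storage rates quoted in the article, before and after the cut.
OLD_FIRST_TIER, NEW_FIRST_TIER = 0.125, 0.095   # first 1TB tier
OLD_LARGE_TIER, NEW_LARGE_TIER = 0.080, 0.060   # up-to-4,000TB tier

def pct_drop(old: float, new: float) -> int:
    """Percentage reduction from old to new, rounded to the nearest whole percent."""
    return round((old - new) / old * 100)

print(pct_drop(OLD_FIRST_TIER, NEW_FIRST_TIER))  # 24
print(pct_drop(OLD_LARGE_TIER, NEW_LARGE_TIER))  # 25
```

Both results match the article's figures of a 24% and 25% cut for the respective tiers.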