NoSQL offers users scalability, flexibility, speed

User case studies at the NoSQL Now conference show NoSQL being used for a variety of reasons

IDG News Service | Software

Users of NoSQL databases and data processing frameworks such as CouchDB and Hadoop are deploying these new technologies for their speed, scalability and flexibility, judging from a number of sessions at the NoSQL Now conference being held this week in San Jose, California.

EMC is using a mixture of traditional databases and newfangled NoSQL data stores to analyze public perception of the company and its products, explained Subramanian Kartik, distinguished EMC engineer, during one talk.

The process, called sentiment analysis, involves scanning hundreds of technology blogs, finding mentions of EMC and its products, and assessing if the references are positive or negative, using words in the text.
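Assessing whether a mention is positive or negative from the words in the text is typically done with a sentiment lexicon. The article does not describe EMC's actual scoring method, so the sketch below is only illustrative, with hypothetical word lists:

```python
# Minimal lexicon-based sentiment sketch (illustrative only; EMC's actual
# scoring method is not described in the article). Word lists are hypothetical.
POSITIVE = {"reliable", "fast", "great", "scalable"}
NEGATIVE = {"slow", "outage", "expensive", "buggy"}

def sentiment_score(text: str) -> int:
    """Return positive-word count minus negative-word count for a snippet."""
    words = text.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
```

A score above zero marks the mention as positive, below zero as negative; real systems add negation handling, weighting, and phrase-level context on top of this basic counting step.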

To execute the analysis, EMC gathers the full text of all the blog and Web pages mentioning EMC and feeds them into a MapReduce implementation running on its Greenplum data analysis platform. Hadoop then weeds out the Web markup and non-essential words, which slims the data set considerably, and the resulting word lists are passed into SQL-based databases for a more thorough quantitative analysis.
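The cleanup step described above follows the standard MapReduce pattern: a map phase strips markup and stop words and emits (word, count) pairs, and a reduce phase sums them into a much smaller word list. The following is a minimal single-machine sketch of that pattern, not EMC's actual Greenplum/Hadoop code; the stop-word list is hypothetical:

```python
import re
from collections import Counter

STOP_WORDS = {"the", "a", "an", "and", "of", "to", "is"}  # hypothetical list

def map_clean(page_html: str):
    """Map step: strip markup tags and emit (word, 1) pairs for non-stop words."""
    text = re.sub(r"<[^>]+>", " ", page_html)          # drop HTML/Web markup
    for word in re.findall(r"[a-z']+", text.lower()):
        if word not in STOP_WORDS:
            yield (word, 1)

def reduce_counts(pairs):
    """Reduce step: sum counts per word, yielding the slimmed word list."""
    counts = Counter()
    for word, n in pairs:
        counts[word] += n
    return dict(counts)
```

In a real Hadoop job the map and reduce functions run in parallel across the cluster, with the framework shuffling the (word, 1) pairs between them; the logic per record is essentially the same.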

The NoSQL technologies are useful in summarizing a huge data set, while SQL can then be used for a more detailed analysis, Kartik said, adding that this hybrid approach can be applied to many other areas of analysis as well.

"There is all sorts of information out there, and at some point you will have to go through tokenizing, parsing and natural language processing. The way to get to any meaningful quantitative measures of this data is to put it in an environment you know can manipulate it well, in a SQL environment," Kartik said.

For digital media company AOL, NoSQL products provide speed and volume that would not be possible using traditional relational databases.

The company uses Hadoop and the CouchDB NoSQL database to run its ad targeting operations, said Matt Ingenthron, manager of community relations for Couchbase, during another talk.

AOL has developed a system that selects a set of targeted ads each time a user opens an AOL page. Which ads are chosen can be based on the data AOL holds on the user, along with algorithmic guesses about which ads would be of most interest to that user. The process must complete within about 40 milliseconds.

The source data is voluminous. Logs are kept of every user action on every server, and they must be parsed and reassembled to build a profile of each user. Ad brokers also define a complex set of rules governing how much they will pay for an ad impression and which ads should be shown to which users.
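At serving time, the pieces above come together as a keyed lookup followed by rule matching: fetch the user's profile by key from the store, keep only the ads whose targeting rules match, and rank the survivors by what the broker will pay. AOL's real schema, key format, and rule engine are not public, so the sketch below is a hypothetical illustration with invented data:

```python
# Hypothetical sketch of the lookup-and-select step; AOL's actual key
# format, profile schema, and bidding rules are not described publicly.
user_profiles = {                       # stands in for the key-value store
    "user:42": {"interests": {"sports", "travel"}},
}
ads = [
    {"id": "ad1", "topic": "sports", "bid": 0.8},
    {"id": "ad2", "topic": "travel", "bid": 1.2},
    {"id": "ad3", "topic": "finance", "bid": 2.0},
]

def pick_ads(user_key: str, limit: int = 2):
    """Fetch the profile by key, keep ads matching its interests,
    and return the highest-bidding matches."""
    profile = user_profiles.get(user_key, {"interests": set()})
    eligible = [a for a in ads if a["topic"] in profile["interests"]]
    return sorted(eligible, key=lambda a: a["bid"], reverse=True)[:limit]
```

The 40-millisecond budget is what pushes this lookup into a low-latency key-value store rather than a relational join: a single key fetch plus an in-memory filter is fast and predictable at this scale.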

This activity generates 4 to 5 terabytes of data a day, and AOL has amassed 600 petabytes of operational data. The system maintains more than 650 billion keys, including one for every user as well as keys for other aspects of the data, and must react to 600,000 events every second.
