Big data, cheap storage bring in-memory analytics into spotlight

By Allen Bernard, CIO

"They're saying, 'Hey, we're putting 10 gigs of memory into our product capability because that's all it can handle, but were calling it an in-memory solution,'" Nakamura says. The question, he adds, is whether they can scale to handle real-world problems and data flows. (To be fair, Terracotta has just released two competing products, BigMemory Max and Big Memory Go, the latter of which is free up to 32 GB. Both products scale into the TB range and can run on virtual machines or in distributed environments.)

In-Memory Technology Removes Latency From Analytics

"What is comes down to," says Shawn Blevins, executive vice president of sales and general manager at Opera Solutions, is that each product has "an actual layer where we can stage the data model itself, not just the data-and they exist in the same platform and the same box in flash memory."


From a business point of view, this is really what matters. In-memory technology gets complicated quickly. If you want to understand how all the bits and bytes line up, it's probably best to call down to your IT guys for another rousing round of "What's that part do again?" If you want to understand why in-memory is becoming the buzzword du jour, that's a little easier: It provides business insights that lead to better business outcomes in real time.

Essentially, in-memory analytics technology lets businesses take advantage of performance metrics gleaned from production systems and turn those into KPIs they can do something about. A company such as Terracotta can give away 32 GB of capacity because in-memory analytics doesn't require the entire fire hose of data that a traditional BI app needs in order to produce useful results.

"The deal with in-memory analytics is the analysis process is all about search," says Paul Barth, co-founder of data consulting firm NewVantage Partners. You're trying to see how many different combinations of things, such as blue car owners and ZIP code, are correlated, he adds.

For every one of those correlations, it takes time to pull the data, cluster it, identify the dependencies and see how strongly one variable is affected by the others. Every time you pivot that table to find something new or get some clarity, data moves and gets reorganized. That introduces latency, which is precisely the problem in-memory analytics is designed to defeat.
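To make the idea concrete, here is a minimal sketch in Python with pandas, using made-up column names (zip_code, car_color, purchases) rather than any vendor's actual schema. Once the dataset sits in RAM, each new cut of the data is just a re-aggregation of in-memory structures instead of another extract-and-reorganize cycle against disk.

```python
import pandas as pd

# Assume a small production extract already loaded into memory.
# Column names here are hypothetical, purely for illustration.
df = pd.DataFrame({
    "zip_code":  ["10001", "10001", "94105", "94105", "60601"],
    "car_color": ["blue", "red", "blue", "blue", "red"],
    "purchases": [3, 1, 4, 2, 5],
})

# One "combination of things": blue-car owners per ZIP code.
blue_by_zip = (
    df[df["car_color"] == "blue"]
      .groupby("zip_code")["purchases"]
      .agg(["count", "mean"])
)
print(blue_by_zip)

# Pivot again on a different pair of variables without reloading anything;
# the latency of pulling and reorganizing the data each time disappears.
color_by_zip = pd.crosstab(df["zip_code"], df["car_color"])
print(color_by_zip)
```

The point of the sketch is the pattern, not the library: both cuts run against the same in-memory copy of the data, which is what lets analysts pivot repeatedly without waiting on the storage layer.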

High-Frequency, Low-Computation Analysis, for Now

