SAS performs 18-hour analytics job in 2.5 minutes

By Jennifer Kavur, ComputerWorld Canada |  Data Center, Analytics, SAS

"On these kinds of applications where we are looking for massive performance, you don't want to be virtualized because you are going to lose 30 per cent ... we want to operate the system as sitting right down on top of the metal," said Goodnight.

The technology, which works across industries, targets markets like finance and retail. For one retail client with over 1,000 stores and 200,000 to 300,000 SKUs per store, SAS reduced markdown optimization from 28 hours to 3.5 hours. The client runs its analytics after stores close Saturday night, so the faster run leaves more time to notify managers about sales before Monday morning.

SAS's development is very advantageous from a processor perspective because enterprises can build a high-performance, robust SAS engine out of cores and components they can readily access -- without necessarily having to invest in a product from Platform Computing or some other grid-based computing engine, said Russ Conwath, senior analyst at Info-Tech Research Group Ltd.

"There are other folks out there who do in-memory compute, but this seems like a really good addition to the SAS portfolio and something that is going to potentially differentiate them from the other folks because of its scope and scale," he said.

Highlighting recent announcements from Advanced Micro Devices Inc. (AMD) and Intel Corp., Conwath said, "it is all about the core count and all about how much memory you can blow into these machines." With SAS's model, "you could scale out quite a bit," he said.

The interesting question here is the use case for this new processing architecture, said Gareth Doherty, senior analyst at Info-Tech. "This product is very likely going to be focused in the finance industry where you've got extremely data-intensive predictive models that are looking to forecast where the market is going and there is a lot of information that you have to process on the fly in order to set pricing information," he said.

A lot of organizations in these scenarios are already leveraging sophisticated statistical analysis tools like SAS, added Doherty. "I think what (SAS) is looking to do is entrench their market presence by also offering an infrastructural solution that allows them to be able to leverage their analytical capabilities, but do so in a way that also gives them a much better response time so they are no longer limited at the hardware level," he said.

In-memory processing in the context of BI isn't new -- mega-vendors like IBM Corp. and MicroStrategy Inc. already have in-memory solutions, and even Microsoft Corp.'s SQL Server is going to offer in-memory processing capabilities, said Doherty.
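The core idea behind any of these in-memory offerings is the same: load the dataset into RAM once, then run repeated analytic passes against it without re-reading storage each time. As a minimal illustrative sketch (generic Python with hypothetical store/SKU fields, not any vendor's engine):

```python
import random

# Hypothetical retail dataset, loaded into memory once.
random.seed(42)
sales = [
    {"store": s, "sku": k, "units": random.randint(0, 50)}
    for s in range(100)      # 100 stores
    for k in range(200)      # 200 SKUs per store
]

# Pass 1: total units sold per store.
units_by_store = {}
for row in sales:
    units_by_store[row["store"]] = units_by_store.get(row["store"], 0) + row["units"]

# Pass 2: flag slow-moving SKUs (markdown candidates) -- no disk re-read needed.
slow_movers = [(row["store"], row["sku"]) for row in sales if row["units"] < 5]

print(len(sales), len(units_by_store))
```

Once the data is resident, each additional pass costs only CPU time; a disk-bound version would re-scan storage for every pass, which is where multi-hour run times come from.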
