May 16, 2012, 10:23 AM — SAP seems to be betting its future on its HANA in-memory database, spotlighting the technology once again Wednesday at the Sapphire conference in Orlando, where it announced a slew of new applications, partnerships and functional enhancements for the system.
Thanks to a "shopping spree" by SAP in recent years, the company has been able to surround HANA with a wide variety of important supporting tools, co-founder Hasso Plattner said during a keynote address Wednesday, which was webcast. "We have enriched the system. We think that technically, we have a pretty complete collection of functionality."
SAP, which has also been releasing a series of specialized HANA-based applications, announced eight new ones on Wednesday, targeting areas such as sales pipeline analysis, planning and consolidation, cash forecasting, and deposit management. It wasn't immediately clear when the applications would be generally available.
In addition Wednesday, SAP sought to align HANA with the industry buzzword "big data," which refers to the huge volumes of unstructured information being generated by social media, sensors and other sources, as well as related efforts to draw valuable business insights from it.
SAP announced that the fourth service pack for HANA is now generally available and includes an integration with Hadoop, the increasingly popular open-source framework for large-scale data processing. The tie-in will provide the ability to read from and write to the Hadoop Distributed File System, as well as fast batch loading into HANA and the Sybase IQ analytic database, SAP said.
SAP has also formed a "partner council" around HANA, with the centerpiece being an agreement with Cloudera, which provides Hadoop services as well as its own distribution of Hadoop.
Riding the big data theme, SAP was also especially keen on Wednesday to prove that HANA can scale out to colossal heights.
The company demonstrated a 100-node HANA cluster, built with IBM hardware, that had 100TB of RAM and 4,000 Intel CPU cores.
The system, which cost US$4 million for the hardware, "is the world's largest in-memory database system assembled ever," SAP executive board member and technology chief Vishal Sikka said in an interview before the conference.
It is capable of running 1,000 applications for "millions" of users, or can simply take a single database query and run it at warp speed, Sikka said. "This nonsense about scalability, we'll shoot it in the head."
"We're testing the limits of how far we can go," Sikka added, during a keynote talk on Wednesday at Sapphire.
Customers will be invited to upload their data to the massive cluster and run jobs, although their activities will be limited to experiments, not production deployments at this time, Sikka said in the interview.