Convincing decision makers to use the results can be as difficult as getting them to go along with the project in the first place, because the predictions may be the exact opposite of what their business intuition tells them, says Anne Robinson, president-elect of the Institute for Operations Research and the Management Sciences (Informs), the professional society for business analytics. "As you get more involved with analytics it becomes counterintuitive. But it's those deviations from what you're doing that bring the rewards, because when the results are intuitive you find that most people are already doing them."
Several years ago Cisco Systems created "propensity to buy" models -- to figure the probability that customers will buy this quarter or next, or never. The models cover every product in every sales territory. The salespeople felt they already knew what some of the people identified by the model were going to buy, so Cisco excluded those sales when calculating the return on its effort. "The first year we did it, we generated $1 billion in sales uplift," says Theresa Kushner, Cisco's senior director of customer and influencer intelligence. "We had an experience to line up against what they thought they believed."
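A propensity-to-buy model of the kind Cisco describes is, at its simplest, a classifier that turns customer features into a purchase probability used to rank accounts. The sketch below is a minimal, hypothetical illustration using plain logistic regression on invented features and synthetic data; it is not Cisco's actual model.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_logistic(X, y, lr=0.1, steps=2000):
    """Fit logistic-regression weights by plain gradient descent."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(steps):
        p = sigmoid(X @ w + b)
        w -= lr * (X.T @ (p - y)) / len(y)
        b -= lr * np.mean(p - y)
    return w, b

rng = np.random.default_rng(7)
# Invented features: recent spend, days since last order, product-fit score.
X = rng.normal(size=(400, 3))
# Synthetic "bought this quarter" labels, loosely tied to the features.
y = (X[:, 0] - 0.5 * X[:, 1] + rng.normal(scale=0.5, size=400) > 0).astype(float)

w, b = fit_logistic(X, y)
propensity = sigmoid(X @ w + b)        # probability each customer buys
ranked = np.argsort(propensity)[::-1]  # call list for sales, best prospects first
```

In practice the payoff comes from the ranking at the end: salespeople work the accounts the model scores highest, and (as Cisco did) the uplift is measured only on sales the team would not otherwise have predicted.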
Peri learned the hard way that 80% of a predictive analytics project is cultural. "I came in naively thinking that if I had a model that does all of these great things it will just work. But you have to be aware of how people make decisions and how it will transform that process."
P&G once developed a model designed to provide an "early warning" on how each business was going to perform. "It was actually quite accurate, but the warnings were given in such a way that people didn't understand how to take action on them, and so we didn't get the proactive decisions we wanted," he says. Lesson learned: "Analytics is only valuable when you take action on the insight."
People can also feel threatened by analytics. "There's a concern initially that the model is designed to take over decision-making or doesn't respect my business knowledge," Peri says. Users need to understand that the predictive model serves as a decision support tool and how to use the output in their own decision-making processes.
Don't waste time trying to get people to believe in the model, says Cisco's Kushner. Instead, do a test and present the results. In this way you're not countering their knowledge: The science is. "This is math; this is fact; this is statistics. You have an experiment to line up against what they thought they believed."
Ultimately, however, predictive analytics is forcing a showdown between data-driven and intuition-based decision making, says Eric Siegel, president of the analytics training firm and conference organizer Prediction Impact Inc. "That's the big ideological battle. It's a religious debate."
Data: Getting to good enough
On the technology side, both building the model and preparing the data can be stumbling points. Predictive analytics is an art as well as a science, and it takes time and effort to build that first model and get the data right, says Abbott. "But once you build the first one, the next one is much less expensive to model" -- assuming you're using the same data. Analysts building a new, entirely different model that uses different data might find that project just as time-consuming as the first. Nonetheless, he says, "The more experience one gains, the faster the process becomes."
Data preparation issues can quickly derail a project, says Siegel. "The software vendors skip that point because all of the data in the demo has already been put into the correct format. They don't get into it because it's the biggest obstacle on the technical side of project execution -- and it can't be automated. It's a programming job."
When the Magic's Perez got started in 2010 he grossly miscalculated the time it would take to prepare the data. "We didn't set the right expectations. All of us were thinking that it would be easier than it was," he says. Pulling together data from Ticketmaster, concession vendors and other business partners into a data warehouse took much longer than anticipated. "We went almost the entire season without a fully functional data warehouse. The biggest thing we learned was that this really requires patience," he says.
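The unglamorous work Siegel and Perez describe is mostly joining and reshaping records from different systems into one analysis-ready table. The fragment below is a hypothetical sketch of that kind of preparation -- aggregating concession transactions and left-joining them onto a ticketing base so every account survives the merge. All table and column names are invented.

```python
import pandas as pd

# Invented stand-ins for exports from a ticketing system and a
# concessions vendor, keyed by a shared account id.
tickets = pd.DataFrame({
    "account_id": [1, 2, 3],
    "games_attended": [41, 12, 3],
})
concessions = pd.DataFrame({
    "account_id": [1, 1, 3, 4],
    "spend": [30.0, 25.0, 8.0, 12.0],
})

# Roll transactions up to one row per account, then left-join onto the
# ticketing base so accounts with no concession spend are kept (as 0).
spend = concessions.groupby("account_id", as_index=False)["spend"].sum()
warehouse = tickets.merge(spend, on="account_id", how="left").fillna({"spend": 0.0})
```

Even in this toy version the judgment calls show up: which table is the "spine," how to handle accounts that appear in only one source, and what a missing value should mean -- exactly the decisions that make data preparation, in Siegel's words, a programming job that can't be automated.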
"Everyone is embarrassed about the quality of their data," says Elder, but waiting until all of the data quality issues are cleaned up is also a mistake. Usually, he says, the data that really matters is in pretty good shape. "I urge people to go ahead and make a salad and see what you can get," he says.
Blue Health Intelligence (BHI) had no issues with the patient health care data coming from its 39 Blue Cross Blue Shield affiliates -- but with seven years of data on 110 million members, it had a lot of it. "Health care is way behind in analytics because of the complexity of our data," says Swati Abbot, CEO. "People tend to run after the data and not know why they need it." Clinical insights must come first, she says. "Then the math takes over."
BHI developed models to predict which of its highest risk members were most likely to be hospitalized, who had an avoidable risk, who was most likely to respond to intervention and which actions were most likely to work in each case.
Initially, BHI focused on diabetes patients at highest risk for hospitalization -- a patient group expected to cost the healthcare industry $500 billion by 2020. "The first round [of analytics projects] was not successful because the business stakeholders weren't involved. It became an IT exercise," she says, and because the reports didn't say why a given patient was at high risk, clinical professionals ignored them.
So Abbot made sure that they understood the underlying "risk drivers" in the model, explaining not just who was at high risk but why, and what interventions would be most likely to improve the outcome.
"Provide transparency," she advises, "and serve up the information in the right way, so it's not disruptive to workflows."
Iterate first, scale later
At Intuit every project starts small and goes through continuous cycles of improvement, says George Roumeliotis, data science team leader. "That's our process: Iterative and driven by small scale before going big." The financial software company started using predictive analytics to optimize its marketing and upsell efforts, and now focuses on optimizing the customer experience with its products.
Intuit developed predictive task algorithms that anticipate how customers will categorize financial transactions in products such as Mint and QuickBooks, and suggest categories as customers enter new transactions. It also proactively offers content and advice as customers use its products, in an effort to anticipate questions before the user has to ask.
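A much-simplified way to get the behavior described above is to suggest the category a user most often applied to the same payee in the past. The sketch below illustrates that idea only; it is an invented example, not Intuit's actual algorithm, and the class and method names are assumptions.

```python
from collections import Counter, defaultdict

class CategorySuggester:
    """Suggest a transaction category from a user's own labeling history."""

    def __init__(self):
        # payee -> Counter of categories the user has assigned to it
        self.history = defaultdict(Counter)

    def record(self, payee, category):
        """Learn from a transaction the user has already categorized."""
        self.history[payee][category] += 1

    def suggest(self, payee):
        """Return the most common past category for this payee, or None."""
        counts = self.history.get(payee)
        if not counts:
            return None
        return counts.most_common(1)[0][0]

s = CategorySuggester()
s.record("ACME COFFEE", "Meals")
s.record("ACME COFFEE", "Meals")
s.record("ACME COFFEE", "Office Supplies")
```

After those three entries, `s.suggest("ACME COFFEE")` returns "Meals", while an unseen payee returns None -- the point being that the suggestion is a decision-support default the user can still override, which keeps feeding the history.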
"Start with a clearly articulated business outcome, formulate a hypothesis about how the process will contribute to that outcome, and then create an experiment," he says. Through A/B testing, analysts can gain the confidence of business leaders by creating parallel business processes and demonstrating a measurable improvement in outcomes.
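The A/B test Roumeliotis describes usually comes down to comparing conversion rates between the existing process (control) and the model-driven one (treatment), and checking that the lift is statistically significant. This is a minimal sketch using a standard two-proportion z-test; the traffic and conversion numbers are invented for illustration.

```python
import math

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Two-sided z-test for a difference between two conversion rates."""
    p_a, p_b = success_a / n_a, success_b / n_b
    p_pool = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided p from the normal tail
    return z, p_value

# Invented numbers: control converts 5.0%, the model-driven variant 6.5%,
# with 10,000 users in each arm.
z, p = two_proportion_z(500, 10_000, 650, 10_000)
```

A small p-value here is exactly the "measurable improvement in outcomes" that wins over business leaders: the parallel process, not the analyst, makes the argument.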
Just be sure to start by choosing an existing business process that can be optimized with minimal risk to the business, he advises. Customer support, retention and user experience are great places to get started.
While predictive analytics projects can require a substantial investment up front, the payoff can be considerable, as Cisco's $1 billion in sales uplift shows. Even small-scale projects can have an enormous impact on the bottom line. "Predictive analytics is about projecting forward and transforming the company," says Peri.
The risks are high, but so are the rewards, says Informs' Robinson. "Take it to the end," she says. "Be successful. And act on what you learn."
This story, "Putting predictive analytics to work" was originally published by Computerworld.