What was sad about the sale was that our internal study clearly said selling this business unit to Siemens would be a catastrophic failure for the unit. That finding should have prompted IBM either to sell to someone else or to make the sale complete and final. Instead, IBM sold half of ROLM to Siemens and carried 50% of the resulting losses for five years. The unit lost more money over that time than IBM had initially paid to acquire it. Subsequent analysis showed that, had IBM simply shuttered the business, it would have been billions of dollars ahead.
The ROLM mistake happened because the decision was made before the research was done. Apparently the IBM executive team forgot it had even commissioned the research in the first place. If you're going to do research, it needs to come before the decision is made; done afterwards, as this decision shows, it has a high probability of making executives look like idiots.
However, executives have to be able to accept that the decision they want to make (in this case, quickly divesting a troubled unit) may be a bad one. If they can't, then you still have another problem: confirmation bias, the tendency to see only the information that agrees with your worldview.
We saw this play out in spades in the recent U.S. presidential election: the Republicans were convinced they were going to win, and win big, but they lost big.
The GOP had used analytics, but not only was it using companies inexperienced in political campaigns, it was also cherry-picking and reporting only the results that supported the belief that Mitt Romney was going to win. As a result, Republicans focused on the wrong geographies, under-resourced their efforts and lost an election against a relatively unpopular incumbent.
You Can't Handle the Truth, So You Make Bad Decisions
The famous courtroom scene from the movie A Few Good Men highlights the core problem: Often "the truth" is at best inconvenient and at worst highly embarrassing. Analytics, done right, provides an incontrovertible view of the truth.