Complexity of IT systems will be our undoing

By John Dix, Network World

Roger Sessions, CTO of ObjectWatch and an expert in software architecture, argues that the increasing complexity of our IT systems will be our undoing. In fact, he recently received a patent for a methodology that helps deal with complex IT systems. Network World Editor in Chief John Dix caught up with Sessions to get his take on the extent of the problem and possible solutions.


Outline the IT complexity problem as you see it.

The basic problem is that the larger and more expensive an IT project is, the more likely it is to fail. You can do a lot of analysis as to why that is. You can say maybe we're not using the right methodology, or communications are failing, or any number of things. But ultimately the only variable that appears to correlate closely with failure is complexity.

So my basic proposal is that as systems get bigger and more expensive they get more complex, and complex things are harder to deal with and therefore more likely to fail. If a system costs under, say, $750,000, it has a good chance of succeeding. Once it approaches $2 million it has less than a 50% chance of succeeding. And by the time it gets much larger than that, the chances of success drop to near zero.

Can you clarify what you mean by failure?

The way I define failure is this: if, at the end of a project, the business looks back and concludes it would not have taken on the project knowing what it knows now, then the project is a failure.

How about some examples?

It could be a system that was abandoned because it got too complex or people got confused and started going off on tangents. An example would be the National Program for Information Technology in the U.K. It was an attempt to build an IT infrastructure for their healthcare system. It was probably a $20 billion effort, and it was abandoned and now they're starting from scratch. 

$20 billion?

Yeah. It's a system I wrote about in my last book three years ago. At the time it hadn't yet failed. But the system was huge, highly complex, and there were no policies in place to deal with the complexity. I predicted that it could not succeed. And just in the last couple of weeks they've announced that they're discontinuing the system.

That's an extreme example. Most failures are in the $2 million to $4 million range. If you look at studies of how many systems in this range are successful, the number is less than 50%. 

So the question is not, are we failing? We know we're failing. The question is, why aren't we doing something about it? Why do we keep doing the same thing over and over again if we can see very clearly how unsuccessful it is?

Before we discuss that, are those numbers based on your own research or on other people's?


Originally published on Network World.