It's true of love, life, and doubly so of business: When faced with uncertainty, we become slaves to the familiar, torturing logic to avoid discomfort, sacrificing all to appease the devils we know. Spend enough time around computers and you'll realize that humans execute their programming far more reliably. Need to rationalize something? There's an app for that. Technology brings uncertainty to a business world that approaches problems with alliterative repetition: predictably producing pointless pre-programmed paranoia.
Changing that relationship -- convincing others to suspend their code, entertain the unknown and re-evaluate what they take for granted -- is the most difficult and valuable thing that one can do for one's organization, career and life.
Bonfire of the oblivious
As technologists, we need to be clear: No matter what the business is, today technology is the business. The industry is irrelevant. While the '90s were all about computers replacing tools, the first years of the 21st century have been about technology replacing whole industries.
There are no more phone companies; instead we have technology companies that happen to specialize in communications. Cable companies no longer exist; they've been replaced by technology companies that happen to specialize in media distribution. The purists are all gone. In one sector after another, when technology is ready to invade, the core competency becomes technology, and "the business" becomes "whatever the technology allows it to be."
Publishers and newspapers, comfortable in their prestige, have been decimated by competitors that have ridden a wave of new technologies. Book and video stores are being submerged in the flood of online/digital distribution, having ignored the storm until it was upon them. The bulk of trading on the stock market occurs by algorithm, leaving traditional brokers and investors to fight over table scraps. Banks are so universally undifferentiated that people often choose one over another on the basis of website features and phone apps.
Technology allows every musician to be his own label while the establishment music industry suffers -- not from piracy, but from its own inexplicable attachment to a business model that can only be maintained through legal and legislative manipulation.
Polaroid considered itself very technology-oriented, making huge investments in digital imaging in the '80s, but it never brought a single digital product to market. Just as the music industry doesn't understand how to make money off of free music, Polaroid couldn't imagine a business model based on free "film." Polaroid hid in its comfort zone for another decade, and died there.
It doesn't have to be that way. Canon merged new tech with its familiar SLR cameras, inventively ushering in an era in which a $700 portrait camera is better at shooting movies than a $70,000 cinema camera from a few years ago. Canon competed against itself in its own markets, but in doing so, it has remained relevant and able to exert considerable influence on future events. Consequently, the entertainment industry has exploded with new players and cheap, effective equipment. How will that ripple effect end for Hollywood? See: publishing industry, music industry, et al.
Ignorance is not a defense
Many blame "disruptive technology" -- innovations causing unexpected changes and unpredictable business outcomes. But while the theory explains what happens, it implies an unexpectedness that rarely exists for the average technologist. Put another way, if someone stumbled on cold fusion today and put it in stores for a couple hundred dollars next week -- that would be disruptive -- but that never happens. The innovations we deal with 99.9% of the time were predicted, chronicled and analyzed for years beforehand. What business analysts call "disruptive," technologists call "Tuesday" -- a highly predictable consequence of Monday.
What we're really talking about is "disruptive ignorance" -- an active denial of innovation that forces usually irrational business reactions. Business is a slave to the familiar, sold on the management logic of commoditizing, standardizing or outsourcing anything not viewed as "the core competency," choking off its own capacity to deal with innovation in an act of auto-technologic asphyxiation.
Consider a future victim. Academia is a complex industry with merits beyond education, but I will paint with a broad brush because the average consumer doesn't get to see the fine detail.
Just a few years ago, online degrees were met with much condescension in traditional academic circles. Yet enrollments show that people want them, surveys show that employers are warm to them, and once-obscure schools have become mainstream in the eyes of potential consumers. Academia has flirted with technology-based learning options for decades, but even today, we're still looking at Online Education 1.0 -- beta. At best, it's just a digital peepshow of the existing collegiate process -- replacing the classroom with video streams.
Academia invents much of the technology that revolutionizes other industries, but it has been conspicuously shy about upsetting its own apple cart. There are many reasons, but when you corner people on the question, it all comes back to comfort vs. uncertainty. Individually, everyone sees the potential, but collectively, there is great fear that higher education will be commoditized away -- that faculty won't matter in a technology-based model. The outcome is uncertain and there is a natural hesitancy to dive in. Teaching is considered a prestigious, creative process, inextricably integrated into research and engagement, all fueled by the organic nature of "academic freedom." Personally, I agree: Teaching is creative and important; I make similar arguments about IT -- but if that is to be preserved, the customer must be preserved.
Over the past 20 years, average tuition has doubled while average incomes and starting salaries have stagnated or decreased -- and that trend appears to be accelerating. Customers are much more open to alternatives, which means the whole industry is a prime target for an "iPhone moment." But ironically, after a decade of commoditizing and demoting technology, in a process aptly dubbed "the incredible shrinking CIO," higher ed has largely driven away the creative, highly motivated base of technologists it now needs to prevent its own commoditization. Zombifying IT through ITIL and other '90s-style engines of self-obsolescence has the same effect in every industry: Talent flees bureaucracy and joins a technology company, often returning to steal the business.
Inevitably, all universities will be technology companies that happen to specialize in education and research. It's not a question of money, or prestige, or history; it's a question of valuing and inspiring innovation. Take Khan Academy, for example: One guy, two years, and 2,300+ videos later, anyone, anywhere with access to YouTube can get a foundational education for free. No, that's no substitute for a focused, accredited degree at the moment, but this is one guy. Imagine what two, or heaven forbid, three, could accomplish.
Perhaps, like Polaroid confronted with digital technology, you don't think there's a business model for cheap education. It wouldn't be "disruptive" if most people did. Perhaps you don't think Google would ever consider giving away a high-quality education to gain an impressionable, highly integrated and ever-growing captive audience. I think it would be foolish not to try. The philosophy of pricing and positioning a degree as a compulsory pre-career debt hassle can easily become higher ed's greatest liability. Hello, Google U.
It comes down to this: Academia is flush with brilliant people and access to technology. There is no industry better equipped to answer consumer desires through technology, creating a better, more marketable student in the process. But academia suffers the same way other industries do: Technical mastery and distinctiveness are not considered core to the academic business model, creative technologists are not hired to lead, and geeks are not inspired to create marketable innovation.
Like many industries before it, academia knows what's coming, but sees embracing uncertainty as far more threatening. It often happens that way: The key to survival comes disguised as the doorway to oblivion.
Trolling management, FTW
The least innovative sentence of the last 50 years is "We need to be innovative." It's easy to pound our chests and shout it proudly, because everyone agrees. But when it comes down to the decisions that actually influence innovation, organizations whisper from the safety of their comfort zones: "We're scared."
That fear is expressed in many ways, but the most destructive is also the most obvious: IT has a personnel problem.
In fact, it's become so commonplace over the last decade that no one realizes it's a dysfunction. Here's how it works: Everyone understands that innovation is good -- in the abstract. When a group of non-technologists gets together to hire a CIO, they're thinking "innovator" and "leader." They know technology is critical, but they feel a bit in the dark, uncertain of what to expect and how to "make innovation happen." That's where things go tragically wrong and programming takes over. Information makes people feel more in control, so they want an innovator/leader who is also highly structured, reports in detail, and makes few independent decisions. In other words: the exact opposite of an innovator/leader. The job description of CIO keeps shifting from "leader" to "manager," from "innovator" to "bean counter" (check the requirements and descriptions of recent job postings). The CIO looks a lot less like a technologist, the IT group functions more like custodial services, and so on until even the most ambitious technologists become stagnant and cynical. This is the erosive force that creates the stereotypical IT environment we have come to expect, leaving organizations so unable to cope with change that everything becomes "disruptive." Dilbert isn't a comic strip; it's a documentary on this process.
The academic example is important because it shows that even those with a deep appreciation for the value of innovation can fall into the same trap.
Can it be changed? In short, no, not quickly. However, the optimist in me says that if just one person can take one small step out of their comfort zone for just one decision -- that can change everything.
Technologists would be wise to take note: Save your anger and blame. If you want to truly help yourself and your organization, do the best work you can and influence everything you can. When opportunities arise, organize your cohorts to make the outcome positive. If that opportunity is the hiring of a CIO, be there, tell the hiring committee what you want, comment on the job posting, and help write it if you can. Do it for every job posting. Get people to take just one leap. Be unified, be clear and be passionate that the way forward is going to be uncomfortable, but lucrative. But again, save your anger and remember why most people wind up stuck in their comfort zones to begin with: They are being lied to, all the time, every day -- but we'll talk about that in Part 4.
It is ironic, but a technologist's highest calling may very well be the de-programming of humans.
The uninspired state of business and technology: If Einstein worked in IT today, he'd be playing Portal instead of inventing the concept.
Jeff Ello gives technology a nudge for the Krannert School of Management at Purdue University. He can be contacted at firstname.lastname@example.org, LinkedIn, Twitter and just about everywhere else.
This story, "The unspoken truth about IT, Part 3: The comfort zone" was originally published by Computerworld.