In the tech world and, increasingly, in the normal one, a short attention span is a survival characteristic.
There is no shortage of mental illnesses within corporate IT and the computer industry.
Some are real, like the high incidence of autism among programmers, DBAs, sysadmins and others with jobs requiring maximum detail-orientation and minimum interpersonal contact.
Some are metaphors, used to explain system meltdowns or complex failures of service providers in language more polite than the red-faced profanity that is sometimes the most concise way to express the impact of massive project-management cluster-faults, idiopathic system failures, vendor finger-pointing matches and other common trials of life in the data pit.
Chronic anxiety, depression, confusion and obsessive worry are all common among tech workers, who are just as stressed by major changes as non-tech workers. But their working environment offers so little distance between the cutting edge and the pit of obsolescence that many are afraid to do more than stand in one place and try not to get tossed too far in either direction.
The unrelenting pace of change – not to mention the cyclical mass layoffs caused by the collapse of another economic bubble every five or six years – creates insecurity and neuroses so common and intense that the trademark business rule of former Intel CEO Andrew Grove – "Only the paranoid survive" – is considered realistic advice, not evidence of a psychotic break with reality.
Among "normal" non-geeks, Attention Deficit Disorder (ADD) has become so common that a shortage of Adderall (an amphetamine) to treat it qualifies as a national crisis.
Like autism, though, the disorder isn't always genuinely present, even among those who claim to have it or whose jobs require them to juggle so many balls at once that they behave as if they do.
And, also like autism, the downsides of ADD are serious enough that its difficulties usually outweigh its benefits.
You may already be a mutant
Studies showing that drivers who take their eyes off the road to type or read a text are more likely to crash than drivers who are drunk have nothing on teenagers. Teens are so eminently distractible that even those without ADD or ADHD often crash because they're paying attention to a million things at once without pushing "eyes on the road" to the top of the priority list.
Teenagers speed a lot because they overestimate their skills behind the wheel; teenagers with ADHD have far more serious versions of the same problem, intensified by a greater tendency to take risks and act impulsively than teens without ADHD, according to Russell A. Barkley, a researcher at the Medical University of South Carolina who has turned educating the public on ADD and ADHD into a cottage industry.
Barkley was a principal source in a New York Times story today whose conclusion was that teens with ADD and ADHD can have such a tough time learning to drive because of their cognitive handicap that it's often best they simply don't learn at all, at least for a few years.
That may not only be a cop-out – using denial to delay a risk rather than addressing it – it may also be on the wrong side of a major historical trend, one the abused, harried, short-attention-span-suffering folk in the tech world have been coping with for years longer than non-geeks.
A new study from Northwestern Medicine shows the number of American kids diagnosed with ADHD has risen 66 percent during the past 10 years.
There are plenty of people who see the ADHD epidemic as cultural rather than medical: parents want children with attention spans longer than children are typically able to muster, or want kids to overachieve in both school and extracurriculars even without the obsessive need for recognition common in "natural" overachievers.
That may invalidate some percentage of the diagnoses. It doesn't reduce the impression created by 12 billion other studies showing that many people living in heavily digitized information societies are so awash in data they can't keep it straight on their hard drives, let alone in their heads.
It doesn't change the need many feel to keep up with so much "new" information ranging from emergency news bulletins to the latest viral video that they spend more time absorbing media than thinking or talking about it.
It doesn't change the plasticity of the human brain – the ability to recognize a new type of stress or stimulus and adapt its levels of perception to deal with the new influx.
It doesn't change the physical limitations that keep humans from doubling, quadrupling or octupling the capacity of their internal I/O buses to keep up with the constantly increasing flow of data – as we do regularly to the hardware that brings the data to us.
And it doesn't change the basic information-processing architecture of the human brain, which can pay strict attention to only one thing at a time and which devotes tremendous resources to identifying and filtering out things that might distract us from the task at hand.
Given those limitations, the choice is not to change our brains to allow true multitasking; the only option is to learn how to switch from one mono-task to another more quickly and efficiently, wasting less time and mental energy with each change in the focus of attention.
The quick changes in focus common to those with ADD and ADHD tend to be involuntary, making it harder to finish any one project than to work on half a dozen that may never be completed. And the laserlike mono-focus people with either condition can generate to cope with complex tasks often locks them into long bouts of work on unproductive mini-obsessions rather than marathon bouts of productive work.
Still, the identifying characteristic of both ADD and ADHD is a thought style and work process that look a lot like fast, high-efficiency topic switching.
Fear of the future: accurate, but not justified
Medical diagnoses follow trends and fashions just as clothing choices, baby names, judicial decisions and presidential elections do.
It's undoubtedly true that family physicians are overdiagnosing and overprescribing for ADD and ADHD in children.
It's also obvious that the society those children will live in will become far more challenging during the next 50 years, especially for a wetware data-processor originally developed to find roots, hunt small mammals and identify danger in time to climb a tree rather than be eaten.
That difficulty is one reason technologies designed to manage our personal data for us automatically have proliferated at least as quickly as those designed to build rockets and nuclear power stations, or to handle the finances of a really big corporation in so complex a way that no one would notice if it made a lot of money but had no profit on which to pay taxes.
The fear of a robot apocalypse – the deadly consequence of technology driven so far by human arrogance that the creation destroys its own creator – became famous with Mary Shelley's Frankenstein, but it was around much earlier.
The wings Daedalus made with feathers and wax allowed him and his son Icarus to fly, but didn't do either of them much good.
Galileo didn't spend years under house arrest because the Catholic Church was demanding he produce more doctrine-decaying proof that the universe wasn't shaped the way the Pope insisted it was.
The original Trojan Horse didn't become one of the most persistent images in Western literature because it made for good product tie-ins with Peloponnesian fast-food innovations like the Kids' Heroic Meal. The image of the Trojan Horse is of an elegant creation, given as a gift, that betrayed those who believed in it and got them killed. (Welcome home, V'Ger.)
It seems funny that people would fear their civilizations would be corrupted by the telegraph or railroad or horseless carriage (or writing, which Socrates worried would destroy the ability of students to memorize everything that was worth knowing).
The truth is all those things did destroy at least large parts of the civilizations that created them, though they did it by building new ones, not simply tearing down the old.
Both writing and Google may degrade memory, at least compared to the abilities of people who grew up in societies based on oral histories, for whom memorization was the only way to preserve long-term knowledge or pass short-term messages.
That doesn't mean Google is making us stupid, as Nicholas Carr suggested.
However, Google – and Facebook and Twitter and ubiquitous GPS, text, email and search capabilities – are making us different.
Brains adapt to changes minds may not grasp
Changes in the way humans perceive, process and store data – easily identified by the writing each generation leaves behind complaining about them – cause physiological changes in the brain. When challenged by something it can't remember, perceive or understand, the brain reconfigures the interconnection patterns among its neurons to accommodate the increased flow, or new type, of information it has to process.
Even the brain's non-neuronal cells, traditionally thought of only as a platform on which the all-important neurons operate, are now believed to play an important role in thinking and memory. They also reconfigure themselves to adapt to new demands.
That's not to say new technology makes direct changes in human brains. What it does is prompt new ways of thinking that expand beyond existing limits and create new potential for growth to which human brains must adapt in order to take advantage of the potential they can now see.
The idea that the configuration of human brains changes enough from one generation to the next to deal with new challenges would counter the assumption that a cave-dwelling Homo sapiens should be able to function perfectly well in modern society if educated like a modern child after being thawed out and revived (barring tragedies, like Encino Man).
It's likely there have been so many small evolutionary changes in the micro-structures of the brain that a single caveman couldn't adapt within one lifetime.
That idea also suggests that, while far more kids may be diagnosed with ADD or other cognitive disorders than actually have one, some of the diagnoses are off base for reasons more significant than the doctor simply getting the facts wrong.
Diagnoses may be wrong because they define short attention spans, compulsive scanning of the information environment and constant alertness for the increasingly important Next New Thing as a disorder.
It may not be a disorder. It may be an evolutionary advantage.
It may be the hallmark of Homo sapiens technoensis: Geek Man.
Even more important, it could be a clear indication that humans have progressed to the point that they can finally see that the solution to the most persistent global problems is – Hey, look, a squirrel! Gotta go.