February 23, 2011, 2:15 PM —
When I read with disappointed interest the news that the German Foreign Office had decided to migrate back to Windows after having deployed Linux on their office desktops in 2005, my immediate reaction was one of resignation.
That's a change from my younger days, when I would see such reverse migrations as carefully staged Microsoft PR events designed to embarrass Linux on the desktop. I would hear comments such as "it was too hard to train users" on Linux, and cries of "FUD!" and "Hokum!" would sail out from my keyboard.
Training people to use Linux, I would reason, is no harder than training them to use Windows or OS X. Any statements to the contrary, I would further argue, were merely playing to the cameras. Such brashness came from my experience in learning all sorts of platforms from scratch. And that's where I went wrong.
I have had, in the past year or so, the opportunity to train students in computer use at the undergraduate level. You would think that, being university students, they would find any basic training I could give them superfluous. You would be mistaken.
It's not that students are computer illiterate; far from it. They can run applications, particularly web-based apps, with practiced ease. But when asked to accomplish simple tasks with local UI tools (such as right-clicking to bring up a context menu), I get far more surprised looks than I would expect.
This is not to challenge their education prior to coming to my class. It's a demonstration of just how siloed any computer experience can be. Most users of computers learn to do things by rote, and do not often go far afield from the tasks they've picked up along the way.
I am lucky. My students are bright and they're in my class to learn new things. But when I think about white-collar workers who have been away from college or any other sort of training for a while, and who tend to be less flexible about technology use, I am no longer as high-and-mighty about the whole training issue.
This has little to do with the usability of Linux, per se, though any effort to improve Linux usability should be made, since lowering the bar to adoption on any front is a good idea. This has to do with the average user's computer habits. They learned on Windows or Mac OS X because that's what they likely saw in school. That's what they have at home. That's what they used in their last job.
They got used to it. They picked up a few tricks along the way to make things easier. But they likely did not learn about the whys and hows of what that interface did.
The web, if anything, has made it worse. Web apps are self-contained and don't typically use a lot of menus or dialog boxes. They are single-function tools, or if multi-function, they expose just a few functions at a time. They are streamlined because they have to be downloaded and run inside a browser every single time. Web 2.0 is a boon for businesses delivering a "perfect" customer experience, but Web 2.0 apps have simplified the user experience to the point that any change to the operating system or user interface is harder to work around.
Curiously, it may be Web 2.0 that gets around this problem. As web apps get more complex, you won't need as many apps at the local OS level. You'll just need a browser and whatever platform sits underneath. That's the space Linux will ultimately live in, and training, at least at the OS level, may be a moot point.
But will it be what we could call "desktop Linux"? That remains to be seen.