January 02, 2012, 11:25 AM — Just before the start of my holiday vacation, I went ahead and updated my old Froyo Android phone to the new Galaxy Nexus running Ice Cream Sandwich, promptly becoming a statistic in a new trend: buying new hardware to update software.
This trend was identified most recently by Danny Sullivan over at Marketing Land, who makes a great argument that the fragmentation of Android is almost a moot point, given the rapid update cycle many consumers have with their mobile devices.
"That's probably been the saving grace for Android. Unlike PCs, the hardware specs for smartphones seem to change so rapidly. We're trained to think that phones should get tossed out after two years--and for those who want to pay more for the latest-and-greatest--to upgrade our hardware each year."
(Sullivan goes on to masterfully argue that Android's fragmentation should still be cleaned up, because there are many examples where app developers are getting stymied because they don't have a clear target on the platform on which to develop.)
I would encourage you to read Sullivan's take on Android--it's good stuff--but I would also like to stick with this broader issue of rapid hardware adoption and how it relates to open source and software development in general, because I think it represents a massive shift in the way things will get done in the software industry.
In the past, as manufacturers updated hardware, software developers didn't have many issues with this: they would simply target the operating system on which their application would be running, be it Windows, Linux, or Mac OS/OS X. The cutting-edge application developers, who made productivity and game programs that needed to squeeze every last drop of processor and memory power out of a PC, would pay more attention to hardware specs, but for the most part operating systems would level the development playing field.
This is no longer the case, it seems. Now the entire operating system is contingent on the specs of the device and the marketing whims of the device manufacturer, and the operating system is no longer a safe haven for app developers.
There have been two primary market responses to this. First, there's the Android open policy, which lets manufacturers build out whatever they want and lets the app developers figure it out. Then there's the Apple response, which is to offer developers a more stable platform by keeping updates to the operating system more unified across devices… but developers have to pay a larger premium.
The phenomenon is also encroaching on "traditional" PC platforms. Google's ChromeOS offers a more stable, unified platform (at least, compared to Android), but it is still updated far more rapidly than its Windows and OS X counterparts, thanks to its Linux ancestry.
In the Linux ecosystem, of course, rapid platform development has been the mainstay for quite a while. Ubuntu and Fedora are on a six-month release cycle, and openSUSE isn't far behind with its eight-month cycle. FLOSS applications, such as Firefox, the Chrome browser, and LibreOffice, have also implemented rapid-release cycles.
What is interesting to me is that this rapid-release "fragmented" approach to operating system updates is the very thing that Linux used to get smacked around for at the turn of the century. Indeed, the Linux Standard Base and other standardization approaches were supposed to mitigate the "problem" of fragmentation by offering developers a safe platform towards which they could code.
Nowadays, however, everywhere we look, it's almost an every-platform-for-itself approach. Instead of unification, it seems that operating system vendors are doing what they want to do and letting the application developers sort it out amongst themselves.
There is evidence the enterprise marketplace is not ready for this approach… Red Hat Enterprise Linux and SUSE Linux Enterprise Server are still updated at a slower pace. But as operating system vendors look beyond the enterprise, rapid-release seems to be the only approach.
This, I believe, is why we are seeing so much push for HTML5 and non-native app development. "Native" app development has become too treacherous and convoluted a journey to navigate, so developers are now seeking out another level playing field--this time, web development.
Is this all ultimately a bad thing? To be honest, I am not sure. Competition breeds innovation, they say, but this everyone-for-themselves approach seems to be counterproductive.
The Linuxification of the operating system landscape seems to have succeeded, but now we need to figure out how to live with the fruits of that success.
Read more of Brian Proffitt's Open for Discussion blog and follow the latest IT news at ITworld. Drop Brian a line or follow Brian on Twitter at @TheTechScribe. For the latest IT news, analysis and how-tos, follow ITworld on Twitter and Facebook.