One of the controversial comments Steve Jobs made (does the man really make many other kinds of comments?) during Apple’s last financial results call about Android excoriated the platform and its fragmentation issues. Google has denied that fragmentation is a serious problem. Today, Netflix actually blamed fragmentation for its delay in releasing an Android app that offers the same streaming features as its iOS apps.
So, who’s right? Is fragmentation a problem or not?
As much as I’d like to give a definitive answer to the question, it really does depend on who you ask.
I spent the weekend hanging out with some hardcore Android fans (plus a couple of casual Android users). None of them saw fragmentation as a problem at all. Among their various devices (all very recent handsets), they’d never found fragmentation to be a problem when working with their phones or finding and downloading apps. They acknowledged that they didn’t have much control over when or how updates were applied to their devices (a schedule dictated by a combination of manufacturer and carrier availability), but didn’t see that as a big problem.
While Netflix may say fragmentation is an issue, TweetDeck denied it was a major problem after Jobs included the company in his original statement about fragmentation.
On the other hand, IT personnel charged with supporting mobile devices find fragmentation a large problem, particularly when it comes to accurately predicting the version of the OS running on a device. This is, of course, because different versions of Android support varying enterprise and security features. I discovered a lot of concern about fragmentation while researching various mobile device management tools this summer (expressed both by IT pros and by staff at the companies developing tools to secure and manage devices).
While I can’t speak from a developer perspective (the last time I wrote a mobile app, it was for the original Palm OS), I think the perception of fragmentation as a problem really depends on whether you’re coming at it from a user or an IT perspective.
So long as users find the apps they want or need, they will not see it as a problem. Their devices work, and that’s all they want and need. IT, on the other hand, has to be concerned with data security, the ability to support specific features, and potentially the need to vet devices and apps for use on corporate networks. That’s a lot easier with a single monolithic approach to device design, vendor specification and support, and OS patch management.
Ironically, this isn’t much different from how users view their home PCs and how IT views their work PCs. The same holds true for access to apps: either a PC supports running the apps or it doesn’t, as a function of age, hardware, OS version, and so forth. The difference is that the average person doesn’t bring their personal PC into the office, while the average Android user may (and may need to access work-related resources).
Therefore, as I said, it’s hard to make a definitive judgment as to whether fragmentation presents a real problem today or whether it will in the future. For consumers, it may simply mean some apps and OS features aren’t available, and that’s OK. For businesses, IT, and regulatory bodies, that probably isn’t OK.
Like I said, it depends on who you ask. So, what do you think – is fragmentation a major problem for home and/or office users?