Analysts debate effect of Facebook's policy changes on users
As Facebook pushes users out of the decision-making loop, what does it mean?
Facebook previously had a rule that any of its proposed policy changes that attracted 7,000 "substantive" comments would be put to a vote. That will no longer be the case.
"In the past, your substantive feedback has led to changes to the proposals we made," Facebook said in a post. "However, we found that the voting mechanism, which is triggered by a specific number of comments, actually resulted in a system that incentivized the quantity of comments over their quality."
The move first prompted the Data Protection Commissioner in Ireland, where Facebook's European Union headquarters is situated, to contact the social network for a clarification of its position.
On Tuesday, the Electronic Privacy Information Center and the Center for Digital Democracy teamed up to ask Facebook to withdraw the changes, saying that users have a right to participate in Facebook's governance.
On the heels of Facebook's announcement last week, a rampant "Facebook copyright" hoax made its way across the social network. The hoax seemed to build on users' privacy fears.
Patrick Moorhead, an analyst with Moor Insights & Strategy, called this latest move another example of Facebook's "messing" with its users.
"Facebook got in way too deep by making a commitment they couldn't or didn't want to keep," Moorhead said. "I don't believe too many users even knew they had a voting system, but that's missing the point. It's the principle of the matter, which is about consistency of service."
Moorhead also noted that if Facebook goes through with the policy change, it will be bad news for users over the long haul, leaving them without a "democratic mechanism" to air any grievances.
However, Dan Olds, an analyst with The Gabriel Consulting Group, said the proposed change may not be as bad as it seems.
"Heavily involved Facebook users are sure to take umbrage with what seems, on the surface, to be a sharp slap in the face," Olds said. "But I have to say that I think maybe Facebook has a point here. The voting policy was written when the site was much smaller. Now that they're pushing a billion members, the initial voting numbers don't make a lot of sense."
For example, Olds noted that it takes only 7,000 users to force a vote on a Facebook privacy issue. With more than 1 billion worldwide users, that is less than one thousandth of 1% of total users.
"If the same principle was applied to the United States, it would mean that 2,100 hotheads could force a nationwide vote on whatever issue has them all hot and bothered," he added. "Now here's where it gets even more absurd. In order for the vote to count, more than 30% of all users have to participate in the balloting. With Facebook, that means 300 million or so votes would have to be tallied in order for results to be valid."
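The proportions Olds cites can be verified with quick arithmetic. This sketch assumes round figures of 1 billion Facebook users and a US population of about 315 million at the time, which are approximations rather than official counts:

```python
# Assumptions: ~1 billion Facebook users, US population ~315 million (circa 2012).
FACEBOOK_USERS = 1_000_000_000
US_POPULATION = 315_000_000

COMMENT_TRIGGER = 7_000   # substantive comments needed to force a vote
TURNOUT_QUORUM = 0.30     # share of users who must vote for the result to count

# Share of the user base needed to trigger a vote.
trigger_share = COMMENT_TRIGGER / FACEBOOK_USERS
print(f"Trigger share of user base: {trigger_share:.5%}")

# The same share applied to the US population -- close to Olds's "2,100 hotheads".
us_equivalent = trigger_share * US_POPULATION
print(f"US-population equivalent: {us_equivalent:.0f} people")

# Votes needed to meet the 30% participation quorum.
quorum_votes = TURNOUT_QUORUM * FACEBOOK_USERS
print(f"Votes needed for a binding result: {quorum_votes:,.0f}")
```

The scaled US figure works out to roughly 2,200 people, in line with Olds's round estimate, and the 30% quorum does indeed require about 300 million ballots.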
Brad Shimmin, an analyst with Current Analysis, said users will have to get used to the idea that Facebook has to please shareholders as well as them. This move, he said, is part of the maturing relationship between the company and its users.
"Now that Facebook is a publicly traded company, it must split its attention between investors and users, two parties that don't always share the same objectives," Shimmin said. "So I'm not surprised to hear that Facebook's pulling back from its former, less formal, stance on revising privacy policies. What users often want, which is complete freedom of speech and ownership of content, stands at odds with what Facebook needs in order to attract advertisers and investors."
He added that Facebook, in the past, hasn't significantly altered its privacy course based on user feedback, and isn't likely to this time either.
"I think, if anything, it's a wakeup call to the Facebook user base, alerting them to the fact that they are ultimately guests of Facebook, and therefore Facebook has the right to set the rules in its own house," he said.
Olds agreed that whether or not users have a right to vote isn't a big deal. There are other ways for users to show their displeasure that would make Facebook take notice.
"Facebook is large enough now that any move they make gets lots of scrutiny," he said. "I think the bad press associated with Facebook making a bad privacy protection move is much more powerful than the results of a vote."
Sharon Gaudin covers the Internet and Web 2.0, emerging technologies, and desktop and laptop chips for Computerworld. Follow Sharon on Twitter at @sgaudin, on Google+ or subscribe to Sharon's RSS feed. Her email address is firstname.lastname@example.org.