With the help of the FTC, Facebook appears to have finally seen the light and cleaned up its privacy act. True, that light may well be emanating from a $100 billion initial public offering steaming toward us like a runaway train, but it’s a light nonetheless.
There’s no doubt in anyone’s mind that the world’s largest social network was motivated to settle with the FTC to clear the books for the world’s largest IPO. The biggest change? The next time Facebook decides to unilaterally alter some setting that makes your private data not so private anymore, it must first obtain your consent.
Yes, Virginia, there is an Opt In clause after all.
The second biggest change is that Facebook, like Twitter and Google before it, now must undergo periodic privacy audits by the FTC for the next twenty years.
The third interesting change: When you tell Facebook to delete your account, Facebook must actually delete it within 30 days, instead of letting your photos and other posts bounce around for months afterward. (Does this rule apply to information gathered by Facebook that isn’t publicly available, and does it apply to requests from law enforcement as well? I’m still waiting on an answer from the FTC on those counts.)
[Update: And the answer is... probably not, though the Feds could look into this if circumstances warrant. Please consult your nearest privacy attorney for further clarification.]
What’s notable to me is the contrast between the official FTC statement – a marvel of clarity for any organization, let alone a Federal bureaucracy – and the muddled (and frankly, BS-filled) blog post from Mark Zuckerberg announcing the agreement.
Consider, for example, the FTC’s summary of what are essentially the lies Facebook has told us over the last two years:
- Facebook represented that third-party apps that users installed would have access only to user information that they needed to operate. In fact, the apps could access nearly all of users' personal data – data the apps didn't need.
- Facebook told users they could restrict sharing of data to limited audiences – for example, with "Friends Only." In fact, selecting "Friends Only" did not prevent their information from being shared with third-party applications their friends used.
- Facebook had a "Verified Apps" program and claimed it certified the security of participating apps. It didn't.
- Facebook promised users that it would not share their personal information with advertisers. It did.
- Facebook claimed that when users deactivated or deleted their accounts, their photos and videos would be inaccessible. But Facebook allowed access to the content, even after users had deactivated or deleted their accounts.
- Facebook claimed that it complied with the U.S.-EU Safe Harbor Framework that governs data transfer between the U.S. and the European Union. It didn't.
Doesn’t get much clearer than that.
Zuckerberg, on the other hand, issued a statement that starts out with a bald-faced lie and gets worse from there.
I founded Facebook on the idea that people want to share and connect with people in their lives, but to do this everyone needs complete control over who they share with at all times.
This idea has been the core of Facebook since day one…. We've added many new tools since then: sharing photos, creating groups, commenting on and liking your friends' posts and recently even listening to music or watching videos together. With each new tool, we've added new privacy controls to ensure that you continue to have complete control over who sees everything you share. Because of these tools and controls, most people share many more things today than they did a few years ago.
Overall, I think we have a good history of providing transparency and control over who can see your information.
If the BS piles up any deeper we’ll all have to wear hip boots.
The original FTC complaint stemmed from Facebook’s declaration in December 2009 that certain user information that was once under its users’ control would now be public – period, end of paragraph, and if you don’t like it then leave. A public outcry forced Facebook to back off from that position, slightly.
At one time, Facebook did offer the best privacy controls of any social network around. That was in 2005. They have disintegrated rapidly since. Matt McKeon’s now-famous graphic detailing “The Evolution of Privacy on Facebook” tells you everything you need to know on that score.
In fact, the entire history of Facebook can be summed up as follows: Facebook makes an attempt to manipulate or monetize users’ information, users complain, consumer groups sue, Federal officials start mumbling about looking into the matter, Facebook backs off slightly. Then Facebook starts all over again, trying to get the same information using different tactics.
In 2007 it was Beacon. These days it’s “sponsored stories” and “frictionless sharing.” Same wine, different bottle.
Over the last year or so, Facebook has gotten much better. Its privacy controls are still confusing as hell, but at least it has them. When someone (usually in the press) reveals that Facebook is sharing more information than it claims to, Facebook now reacts quickly to fix the problem instead of pretending it doesn’t exist or that people don’t care.
Those are all positive signs. What’s not a good sign: The attitude of the scrawny soon-to-be-$24 billion-richer geek at the top. So I guess we’ll soon find out whether Zuckerberg is truly committed to privacy, or just plans to blow off the FTC once the dust settles on his IPO.
Got a question about social media? TY4NS blogger Dan Tynan may have the answer (and if not, he’ll make something up). Visit his snarky, occasionally NSFW blog eSarcasm or follow him on Twitter: @tynan_on_tech. For the latest IT news, analysis and how-to’s, follow ITworld on Twitter and Facebook.