Privacy group files FTC complaint over Facebook's 'emotional contagion' study

EPIC contends that Facebook 'purposefully messed with people's minds' when it manipulated their news feeds to measure emotional responses

IDG News Service | Legal

Facebook "purposefully messed with people's minds" in a "secretive and non-consensual" study on nearly 700,000 users whose emotions were intentionally manipulated when the company altered their news feeds for research purposes, a digital privacy rights group charges in a complaint filed with the U.S. Federal Trade Commission.

The Electronic Privacy Information Center filed the complaint Thursday, asking the FTC to impose sanctions on Facebook. The study violated the terms of a 20-year consent decree that requires the social-networking company to protect its users' privacy, EPIC said. EPIC also wants Facebook to be forced to disclose the algorithms it uses to determine what appears in users' news feeds.

The complaint follows days of mounting outrage from privacy-rights advocates and Facebook users -- some of whom are quoted in the EPIC complaint -- after results of the study, published June 2 by the Proceedings of the National Academy of Sciences, became widely known. Researchers from Facebook, the University of California, San Francisco and Cornell University conducted the study from Jan. 11 to Jan. 18, 2012, on 689,003 English-speaking Facebook users. The study, however, was conducted for Facebook's internal purposes.

The research sought to show whether emotions can be influenced with no face-to-face contact by altering Facebook's algorithm to show mostly positive or negative posts. Scientists call that "emotional contagion." The study found that people whose news feeds contained more positive comments tended to make more positive comments and those who took in more negative posts were more bummed out in their own posts.

When news of the study emerged, researchers and eventually Facebook COO Sheryl Sandberg apologized, saying the research wasn't well explained by the company, though the apologies struck many as falling well short of an actual mea culpa.

PNAS editor-in-chief Inder M. Verma published an "editorial expression of concern" on Wednesday regarding the study, noting that the researchers contend it was consistent with Facebook's data use policy, to which users agree when they sign up, thereby giving "informed consent" that their data might be used in research. Because the research was conducted internally by Facebook, it did not fall under the auspices of Cornell's Human Research Protection Program, the statement says.

The statement further notes that, as a private company, Facebook is under no obligation to follow what is known among researchers as the "common rule": obtaining informed consent from study participants and allowing them to opt out if they don't want to be part of the research.
