Facebook’s emotional contagion experiment raises concerns

Privacy
By Jenny Choi – Edited by Sarah O’Loughlin

Photo By: mkhmarketing – CC BY 2.0

On June 17, 2014, Proceedings of the National Academy of Sciences released a study testing emotional contagion through an experiment on Facebook users. The study, titled “Experimental Evidence of Massive-Scale Emotional Contagion through Social Networks,” was conducted by Adam D. I. Kramer, Jamie E. Guillory, and Jeffrey T. Hancock. The experiment ran for one week (January 11–18, 2012), and roughly 155,000 participants were randomly selected based on their User ID. To test whether emotional states are contagious through verbal expressions, Facebook reduced either positive or negative posts in participants’ News Feeds and observed any changes in the participants’ own posts.

While Ars Technica defended Facebook’s experiment, most news articles criticized it for violating users’ privacy without their informed consent. More can be found in the Independent, InformationWeek, Forbes, the Washington Post, and the Atlantic. The published results of the experiment can be found here.

According to the study, when Facebook reduced positive posts in users’ News Feeds, so that users saw proportionally more negative content, those users were more likely to post negative statuses. Conversely, when users saw more positive posts in their News Feeds, they were more likely to post positive statuses. Finally, when Facebook reduced both positive and negative posts in some users’ News Feeds, those users used fewer words in their own posts.

Although the experiment lasted only one week and used a small subset of the total users, many questioned how the following phrase, out of the 9,045 words in Facebook’s Data Use Policy, could be sufficient to secure informed consent: “We use the information for… troubleshooting, data analysis, testing, research, and service improvement” (emphasis added).

The Electronic Privacy Information Center (“EPIC”) recently filed a complaint with the Federal Trade Commission against Facebook for deceiving users and violating a 2012 Consent Order. In the complaint, EPIC alleged the following: (1) Facebook represented that its News Feed rankings are based on content-neutral factors; (2) Facebook violated the FTC’s 2012 Consent Order by misrepresenting its data collection practices and making user information accessible to third parties; and (3) Facebook’s emotional contagion study used data from users’ News Feeds to manipulate users’ emotions.

In response to these criticisms, Adam Kramer, the Facebook data scientist who conducted the experiment, explained that Facebook wanted to “investigate the common worry that seeing friends post positive content le[d] to people feeling negative or left out,” as well as the risk that people would avoid visiting Facebook due to exposure to their friends’ negativity. Sheryl Sandberg, Facebook’s Chief Operating Officer, apologized for poorly communicating the study to users and said that Facebook never intended to upset them.

However, many privacy advocates and users remain upset. Robert Klitzman, a psychiatrist and ethics professor, wrote for CNN: “the problem is not only how the study was described, but how it was conducted.” He argued that because two authors of the study were affiliated with universities, they should have complied with the National Research Act, which requires the informed consent of research participants.
The public’s strong reaction to Facebook’s experiment on its users shows that companies in the technology industry can no longer assume that their terms of service are sufficient to secure informed consent.