This is real. Facebook has admitted that its researchers conducted a formal scientific experiment on over 680,000 Facebook users, without their knowledge or consent. The results were published in the Proceedings of the National Academy of Sciences. The researchers concluded that “emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness.” In other words, Facebook’s researchers started “emotional contagions” among their users and studied the results.
The researchers altered the News Feeds of hundreds of thousands of users in order to “manipulate” [their word] their emotions. They omitted certain posts from a user’s News Feed, posts that would otherwise have been shown, based on whether keyword analysis classified each post as emotionally positive or negative. They then monitored that user’s own posts, looking for keywords indicating a positive or negative emotional response in the user.
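The filtering step described above can be sketched in a few lines. This is a toy illustration only: the keyword lists, the classification rule, and the function names here are my own assumptions for clarity, not Facebook’s actual system (the real study used a far larger word-count lexicon).

```python
# Toy sketch of keyword-based post filtering, as described in the text.
# The word lists and the suppression rule are illustrative assumptions,
# not the study's actual implementation.

POSITIVE = {"happy", "great", "love", "wonderful"}
NEGATIVE = {"sad", "angry", "terrible", "hate"}

def classify(post: str) -> str:
    """Label a post positive, negative, or neutral by simple keyword match."""
    words = set(post.lower().split())
    if words & POSITIVE:
        return "positive"
    if words & NEGATIVE:
        return "negative"
    return "neutral"

def filter_feed(posts, suppress):
    """Omit posts whose emotional label matches the experimental condition."""
    return [p for p in posts if classify(p) != suppress]

feed = ["I love this!", "What a terrible day", "Meeting at noon"]
# Suppressing negative posts leaves only the positive and neutral ones.
print(filter_feed(feed, suppress="negative"))
```

The point of the sketch is how crude the mechanism is: a handful of keywords decides which of a user’s friends’ posts are silently withheld.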
They found that this test successfully manipulated the emotions of Facebook users. By reducing either the positive or the negative posts in a user’s News Feed, they made some users feel better and others feel worse. The study concluded that “emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness.” Yes, without their awareness.
Oh, and if you are a researcher interested in this topic, they will make their data available for your analysis as well: “Data processing systems, per-user aggregates, and anonymized results available upon request.”
Various news stories are asking whether this was ethical.
Facebook defends the study, saying that manipulating hundreds of thousands of users’ emotions “was consistent with Facebook’s Data Use Policy, to which all users agree prior to creating an account on Facebook, constituting informed consent for this research.” When did Facebook users agree to have their emotions manipulated? When they checked a box, in signing up for Facebook, permitting “data use” that includes: “internal operations, including troubleshooting, data analysis, testing, research and service improvement.” But it is quite a stretch to say that “data use” includes manipulating users’ emotions.
Is there any kind of ethics review board that oversees this type of research? Sort of. Facebook has its own internal review board. So the Facebook experiment was deemed ethical because a committee of Facebook employees decided that it was OK. That’s circular reasoning: It’s ethical because we decided it’s ethical.
And according to an article in The Atlantic, the editor of the study said that, in addition, a “local institutional review board had approved it — and apparently on the grounds that Facebook apparently manipulates people’s News Feeds all the time.” In other words, it’s ethical because they do it all the time. Rob one bank, and that’s wrong. Rob banks on a regular basis, and it’s fine.
The lead author defended his experiment in a Facebook post: “The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product….”  It’s as if he is saying: We care about our users, so we experimented on their emotions. Wait, that’s exactly what he is saying.
We treat you like property, because we care. If you want to use Facebook, you have to agree to let us experiment on you and manipulate your emotions. We must do this type of testing and research for the sake of “service improvement.” In the new information age, it’s all good. But the more often large internet companies get away with treating their users like property, the less outrage there will be over time.
Kramer et al., “Experimental evidence of massive-scale emotional contagion through social networks,” Proceedings of the National Academy of Sciences, vol. 111, no. 24.
CNET, “How Facebook conducts experiments on your emotions,” 29 June 2014.
The Atlantic, “Even the Editor of Facebook’s Mood Study Thought It Was Creepy,” 28 June 2014.