Facebook’s Unethical Experiment Intentionally Manipulated The Emotions Of Thousands Of Its Users



The results of a secret psychological experiment conducted on nearly 700,000 Facebook users without their consent recently came to light. The fact that the experiment was aimed at studying the emotions of the social network's members without their knowledge has caused online outrage.

In January 2012, the number of positive and negative posts displayed in the news feeds of 690,000 Facebook users was secretly manipulated. The next step was to collect and analyze comments and statuses written by the unsuspecting participants of the experiment. The study concluded that those who had less positive content in their news feeds eventually expressed negative moods themselves. In contrast, those who saw less negative content tended to use fewer negatively charged words in their own posts. The experiment was conducted on a random sample of Facebook users, and the results were published in the journal PNAS on June 17, 2014.

In other words, Facebook intentionally made well over half a million of its users sad.

The study’s authors came to the conclusion that the emotions users experience online are influenced by the posts they read, which can also affect their behavior in real life. As they put it: “Emotions expressed by friends, via online social networks, influence our own moods, constituting, to our knowledge, the first experimental evidence for massive-scale emotional contagion via social networks, and providing support for previously contested claims that emotions spread via contagion through a network.”

The flood of negative comments on Twitter and Facebook after the results of the experiment were published prompted one of the study’s authors to offer an explanation. Facebook data scientist Adam Kramer wrote on his profile:

“The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product. We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends’ negativity might lead people to avoid visiting Facebook. We didn’t clearly state our motivations in the paper.”

If you are wondering whether this unethical experiment was legal, it “seems” that it was. Sort of. As the study paper states, the research “was consistent with Facebook’s Data Use Policy, to which all users agree prior to creating an account on Facebook, constituting informed consent for this research.”

Here is the relevant section of Facebook’s data use policy: “For example, in addition to helping people see and find things that you do and share, we may use the information we receive about you … for internal operations, including troubleshooting, data analysis, testing, research and service improvement.”

Facebook’s methodology raises serious ethical questions. As one report states: “The team may have bent research standards too far, possibly overstepping criteria enshrined in federal law and human rights declarations. ‘If you are exposing people to something that causes changes in psychological status, that’s experimentation,’ says James Grimmelmann, a professor of technology and the law at the University of Maryland. ‘This is the kind of thing that would require informed consent.’”

The only mention of “informed consent” in the paper is the data use policy that all users agree to when creating an account. But this is not how most social scientists define “informed consent.”

Moreover, the fact that Facebook uses its members as guinea pigs raises many questions about how many similar behavioral experiments may have already been conducted on the social network, and what they involved.

Update: Facebook mind control experiments linked to DoD research on civil unrest


Anna LeMind is the owner and lead editor of the website, and a staff writer for The Mind Unleashed.
