Facebook’s Psychological Experiment: Happy or Sad News Feeds Affect Users’ Moods


On-staff data scientist Adam Kramer devised an experiment that ultimately concluded the following about the way positive and negative emotions travel through social networks:

“In-person interaction and nonverbal cues are not strictly necessary for emotion contagion.”



The experiment, which is now known to have taken place back in January 2012, involved the Facebook News Feeds of nearly 700,000 users. The users were chosen at random, and none of them were aware of what was happening. The goal was to analyze what types of News Feed content elicited certain responses, altering that content to see what made users happy or sad.

Emotions appeared to be contagious. Who knew? Seeing Facebook feeds full of happy news and updates made users post more positively themselves. The same held true for sad updates. In this respect, the online world mirrors person-to-person interaction: when you witness sorrow or grief (or joy), the mood is reflected in your own.

Part of what allowed Facebook to pull this off is the secret algorithm it uses to control what it shows users in their News Feeds. Already known as an “in the small print” social media network (Facebook has endured a number of privacy cases), the company now faces allegations that it went too far in subjecting users to manipulation. That small print includes a disclosure that Facebook may use information “for internal operations, including troubleshooting, data analysis, testing, research, and service improvement”. Although users must agree to the terms and conditions in order to use Facebook, privacy advocates consider these types of agreements to be “carefully worded by design”. In other words, Facebook’s specific policy phrasing is intentional, and if something is missing, it is missing for a reason.

Since the research was published in the June 17 issue of Proceedings of the National Academy of Sciences, policy analysts, attorneys, and privacy researchers have been picking through the details. It is one thing to use data to see what users search for, click on, and look at on their devices. Tweaking an individual’s News Feed to manipulate their emotional state, however, is a whole other ball game. In the throes of the Internet world, this could definitely be considered a scandal. Yet Facebook may not have violated the law, or even the company’s own policies. Even if legal documents don’t necessarily equate with morality, many feel Facebook should have acted more ethically.

“No posts were hidden, they just didn’t show up on some loads of Feed”, said Kramer.



Responding to the backlash over the experiment, Kramer added, “In hindsight, the research benefits of the paper may not have justified all of this anxiety”. While Facebook may deem this standard operating procedure in business, there is an inexcusable gap “between the ethical standards of industry practice, versus the research community’s ethical standards for studies involving human subjects”, as Ed Felten of Princeton University put it. Many have questioned how the idea got past ethics committees in the first place. In the end, the answer turns out to be yes: if we see a negative status update from a friend, there is a chance we will start to feel sad ourselves. So what? Was confirming something we already knew worth a secret experiment on 700,000 people? Just don’t go posting about how upset this made you on Facebook; you may spread the “upset” vibe.
