On Facebook, people frequently express emotions, which are later seen by their friends via Facebook's "News Feed" product (8). Because people's friends frequently produce much more content than one person can view, the News Feed filters posts, stories, and activities undertaken by friends. News Feed is the primary manner by which people see content that friends share. Which content is shown or omitted in the News Feed is determined via a ranking algorithm that Facebook continually develops and tests in the interest of showing viewers the content they will find most relevant and engaging. One such test is reported in this study: A test of whether posts with emotional content are more engaging.
The experiment manipulated the extent to which people (N = 689,003) were exposed to emotional expressions in their News Feed. This tested whether exposure to emotions led people to change their own posting behaviors, in particular whether exposure to emotional content led people to post content consistent with that exposure, thereby testing whether exposure to verbal affective expressions leads to similar verbal expressions, a form of emotional contagion. People who viewed Facebook in English were eligible for selection into the experiment. Two parallel experiments were conducted, one for each emotional valence: in one, exposure to friends' positive emotional content in the News Feed was reduced; in the other, exposure to negative emotional content was reduced. In these conditions, when a person loaded their News Feed, each post containing emotional content of the relevant valence had between a 10% and 90% chance (based on the person's User ID) of being omitted from the News Feed for that specific viewing. It is important to note that this content was always available by viewing a friend's content directly, by going to that friend's "wall" or "timeline," rather than via the News Feed. Further, the omitted content may have appeared on prior or subsequent views of the News Feed. Finally, the experiment did not affect any direct messages sent from one user to another.
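To make that filtering rule concrete, here is a minimal Python sketch of per-viewer probabilistic omission as the paper describes it: each viewer gets a stable omission rate between 10% and 90% derived from their User ID, and each emotional post is independently dropped at that rate on a given feed load. The hash-based mapping, the function names, and the post representation are assumptions made for illustration; the paper does not describe Facebook's actual implementation.

```python
import hashlib
import random

def omission_probability(user_id: int) -> float:
    """Map a User ID to a stable omission rate in [0.10, 0.90].
    The hashing scheme here is an assumption made for illustration."""
    digest = hashlib.sha256(str(user_id).encode()).hexdigest()
    fraction = int(digest[:8], 16) / 0xFFFFFFFF  # roughly uniform in [0, 1]
    return 0.10 + 0.80 * fraction

def filter_news_feed(user_id: int, posts: list[dict]) -> list[dict]:
    """Drop each emotional post with the viewer's omission probability for
    this specific viewing; non-emotional posts are never withheld here."""
    p = omission_probability(user_id)
    return [post for post in posts
            if not post.get("emotional") or random.random() >= p]

# Example: the second post is never filtered because it is not emotional.
feed = [{"text": "what a great day!", "emotional": True},
        {"text": "grocery list for Saturday", "emotional": False}]
print(filter_news_feed(689003, feed))
```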
Posts were determined to be positive or negative if they contained at least one positive or negative word, as defined by the Linguistic Inquiry and Word Count (LIWC2007) word counting system (9), which correlates with self-reported and physiological measures of well-being and has been used in prior research on emotional expression (7, 8, 10). LIWC was adapted to run on the Hadoop Map/Reduce system (11) and in the News Feed filtering system, such that no text was seen by the researchers. As such, it was consistent with Facebook's Data Use Policy, to which all users agree prior to creating an account on Facebook, constituting informed consent for this research. Both experiments had a control condition, in which a similar proportion of posts in the News Feed were omitted entirely at random (i.e., without respect to emotional content). Separate control conditions were necessary because 22.4% of posts contained negative words, whereas 46.8% of posts contained positive words. So for a person for whom 10% of posts containing positive content were omitted, an appropriate control would withhold 10% of 46.8% (i.e., 4.68%) of posts at random, compared with omitting only 2.24% of the News Feed in the negativity-reduced control.
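The classification criterion and the control-condition arithmetic can be sketched in a few lines of Python. The word lists below are tiny stand-ins (LIWC2007's dictionaries are proprietary) and the helper names are ours; only the at-least-one-word rule and the base rates (46.8% positive, 22.4% negative) come from the text above.

```python
# Stand-in dictionaries; LIWC2007's actual word lists are proprietary.
POSITIVE_WORDS = {"happy", "great", "love"}
NEGATIVE_WORDS = {"sad", "awful", "hate"}

def classify_post(text: str) -> dict:
    """A post counts as positive (or negative) if it contains at least one
    word from the corresponding dictionary, per the paper's criterion."""
    words = set(text.lower().split())
    return {"positive": bool(words & POSITIVE_WORDS),
            "negative": bool(words & NEGATIVE_WORDS)}

# Base rates reported in the paper.
POSITIVE_BASE_RATE = 0.468
NEGATIVE_BASE_RATE = 0.224

def control_omission_rate(experimental_rate: float, base_rate: float) -> float:
    """Match the overall thinning of the feed: if 10% of positive posts are
    withheld in the experimental condition, the paired control withholds
    10% of 46.8%, i.e. 4.68%, of all posts at random."""
    return experimental_rate * base_rate

print(control_omission_rate(0.10, POSITIVE_BASE_RATE))  # ~0.0468 (4.68% of the feed)
print(control_omission_rate(0.10, NEGATIVE_BASE_RATE))  # ~0.0224 (2.24% of the feed)
```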
The experiments took place for 1 wk (January 11–18, 2012). Participants were randomly selected based on their User ID, resulting in a total of ∼155,000 participants per condition who posted at least one status update during the experimental period….
The results show emotional contagion. As Fig. 1 illustrates, for people who had positive content reduced in their News Feed, a larger percentage of words in their status updates were negative and a smaller percentage were positive. When negativity was reduced, the opposite pattern occurred. These results suggest that the emotions expressed by friends, via online social networks, influence our own moods, constituting, to our knowledge, the first experimental evidence for massive-scale emotional contagion via social networks (3, 7, 8), and providing support for previously contested claims that emotions spread via contagion through a network…
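The outcome measure behind that figure is the percentage of words in a person's status updates, during the experimental week, that are positive or negative. Here is a minimal sketch of that measure under the same stand-in dictionaries as above; this is an illustration of the metric, not the study's code.

```python
# Stand-in dictionaries again; LIWC2007's actual lists are proprietary.
POSITIVE_WORDS = {"happy", "great", "love"}
NEGATIVE_WORDS = {"sad", "awful", "hate"}

def emotion_word_percentages(status_updates: list[str]) -> dict:
    """Percentage of all words across a person's status updates that fall
    in the positive or negative dictionary, the paper's outcome measure."""
    words = [w for update in status_updates for w in update.lower().split()]
    total = len(words) or 1  # guard against a person with no words posted
    return {
        "positive_pct": 100 * sum(w in POSITIVE_WORDS for w in words) / total,
        "negative_pct": 100 * sum(w in NEGATIVE_WORDS for w in words) / total,
    }

print(emotion_word_percentages(["great day today", "feeling sad"]))
# {'positive_pct': 20.0, 'negative_pct': 20.0}
```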
Unfortunately, Facebook Inc does not appear to understand that this is an appalling abuse of power: involving 689,003 English-speaking people, without their knowledge, in an online psychological experiment that sought to manipulate their emotional states (and therefore their sense of wellbeing), and then calmly and erroneously telling the world that the Data Use Policy terms and conditions imposed by the company when anyone creates a personal Facebook account constituted "informed consent" for this experiment.
On 29 June 2014 The Wall Street Journal reported these responses:
"What many of us feared is already a reality: Facebook is using us as lab rats, and not just to figure out which ads we'll respond to but actually change our emotions," wrote Animalnewyork.com in a blog post that drew attention to the study Friday morning.
"It's
completely unacceptable for the terms of service to force everybody on Facebook
to participate in experiments," said Kate Crawford, visiting professor at
MIT's Center for Civic Media and principal researcher at Microsoft Research.
Ms. Crawford said it points to a broader problem in the data science industry. Ethics are not "a major part of the education of data scientists and it clearly needs to be," she said.
Asked a Forbes.com blogger: "Is it okay for Facebook to play mind games with us for science? It's a cool finding, but manipulating unknowing users' emotional states to get there puts Facebook's big toe on that creepy line."
Slate.com called the experiment "unethical" and said "Facebook intentionally made thousands upon thousands of people sad."
Mr. Kramer defended the ethics of the project. He apologized for wording in the published study that he said might have made the experiment seem sinister. "And at the end of the day, the actual impact on people in the experiment was the minimal amount to statistically detect it," he wrote on Facebook.
Facebook also said the study was conducted anonymously, so researchers could not learn the names of the research subjects.
Mr. Kramer said that the content, both positive and negative, that was removed from some users' news feeds might have reappeared later.
The emotional changes in the research subjects were small. For instance, people who saw fewer positive posts reduced the number of their own positive posts by only a tenth of a percent.
Comments from Facebook users poured in Sunday evening on Mr. Kramer's Facebook page. The comments were wide-ranging, from people who had no problem with the content to those who thought Facebook should respond by donating money to help people who struggle with mental health issues.
"I
appreciate the statement," one user wrote. "But emotional
manipulation is emotional manipulation, no matter how small of a sample it
affected."
Perhaps Facebook users need to step back and consider what this experiment says about both the people running this giant social media platform and the staff they employ.
Do you really want "Danger Muffin," aka Adam D. I. Kramer, messing with your head just because he can?
British regulators are investigating revelations that Facebook treated hordes of its users like laboratory rats in an experiment probing into their emotions.

The Information Commissioner's Office said Wednesday that it wants to learn more about the circumstances underlying a 2-year-old study carried out by two U.S. universities and the world's largest social network.

The inquiry is being coordinated with authorities in Ireland, where Facebook has headquarters for its European operations, as well as with French regulators.

This is just the latest in a string of incidents that have raised questions about whether the privacy rights of Facebook's nearly 1.3 billion users are being trampled by the company's drive to dissect data and promote behavior that could help sell more online advertising.