Tech

Need to know: About Facebook’s emotional contagion study

Jul 2, 2014

Ethicists are concerned about the recently published details of a mood alteration experiment conducted on Facebook users. Ready to go beyond the headlines? 5 primary sources worth reading next.

Source: “Experimental evidence of massive-scale emotional contagion through social networks,” Adam Kramer et al., Proceedings of the National Academy of Sciences, June 17, 2014.
Why you should read this: For one week in January 2012, data scientists from Cornell and Facebook experimented on more than half a million people over the age of 13. Their goal? To establish whether or not emotions are contagious online. This paper documents their findings.
Excerpt: “We show, via a massive (N = 689,003) experiment on Facebook, that emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness. We provide experimental evidence that emotional contagion occurs without direct interaction between people (exposure to a friend expressing an emotion is sufficient), and in the complete absence of nonverbal cues.”
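The paper’s outcome measure was straightforward word counting: a post counted as positive or negative if it contained at least one word from the LIWC2007 emotion dictionaries. As a rough illustration only, here is a minimal sketch of that kind of scoring; the tiny word lists and function names are hypothetical stand-ins, not the authors’ code or the LIWC lexicon.

```python
# Illustrative sketch: flag posts as positive/negative by word matching,
# then compute the percentage of a user's posts in each category.
# The word sets below are toy stand-ins, NOT the LIWC2007 dictionaries.

POSITIVE_WORDS = {"happy", "love", "great", "wonderful"}
NEGATIVE_WORDS = {"sad", "angry", "terrible", "awful"}

def classify_post(text: str) -> dict:
    """A post counts as positive/negative if it contains at least one matching word."""
    words = {w.strip(".,!?").lower() for w in text.split()}
    return {
        "positive": bool(words & POSITIVE_WORDS),
        "negative": bool(words & NEGATIVE_WORDS),
    }

def emotion_rates(posts: list[str]) -> dict:
    """Percentage of posts containing at least one positive or negative word."""
    if not posts:
        return {"positive": 0.0, "negative": 0.0}
    flags = [classify_post(p) for p in posts]
    return {
        "positive": 100 * sum(f["positive"] for f in flags) / len(posts),
        "negative": 100 * sum(f["negative"] for f in flags) / len(posts),
    }

print(emotion_rates(["What a wonderful day!", "Feeling sad today.", "Meeting at noon."]))
```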

Source: “In defense of Facebook,” Tal Yarkoni, director of the Psychoinformatics Lab at UT Austin, June 28, 2014.
Why you should read this: Although many businesses prefer to keep their customer research confidential (or at least out of scientific journals), Facebook is by no means the only large company to work with data scientists, psychologists or anthropologists while A/B testing on consumers. Here, Yarkoni explains what happens every day in corporate America. (Note: he later published a follow-up post, “In defense of in defense of Facebook.”)
Excerpt: “The reality is that Facebook — and virtually every other large company with a major web presence — is constantly conducting large controlled experiments on user behavior. Data scientists and user experience researchers at Facebook, Twitter, Google, etc. routinely run dozens, hundreds or thousands of experiments a day, all of which involve random assignment of users to different conditions. Typically, these manipulations aren’t conducted in order to test basic questions about emotional contagion; they’re conducted with the explicit goal of helping to increase revenue. In other words, if the idea that Facebook would actively try to manipulate your behavior bothers you, you should probably stop reading this right now and go close your account.”
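The “random assignment of users to different conditions” Yarkoni describes is the core mechanic of any A/B test. As a hedged sketch of the general idea, not Facebook’s actual system, assignment is often made deterministic by hashing a user ID together with an experiment name, so each user lands in a stable bucket that varies across experiments. The hashing scheme and names below are assumptions for illustration.

```python
# Illustrative sketch of deterministic random assignment for an A/B test.
# This is a generic pattern, not Facebook's implementation.
import hashlib

def assign_condition(user_id: str, experiment: str, conditions: list[str]) -> str:
    """Bucket a user into one condition; stable per user, different per experiment."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % len(conditions)
    return conditions[bucket]

# Example: split users between a control feed and a hypothetical variant.
for uid in ["1001", "1002", "1003"]:
    print(uid, assign_condition(uid, "feed_ranking_v2", ["control", "variant"]))
```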

Source: “Facebook and engineering the public,” Zeynep Tufekci, assistant professor at the iSchool at the University of North Carolina, June 29, 2014.
Why you should read this: While ethicists discuss the details of Facebook experimentation, social technologist Zeynep Tufekci is concerned about the broader implications — and wonders why her colleagues within research and academia aren’t more up in arms.
Excerpt: “I’m struck by how this kind of power can be seen as no big deal. Large corporations exist to sell us things, and to impose their interests, and I don’t understand why we as the research/academic community should just think that’s totally fine, or resign to it as ‘the world we live in.’ That is the key strength of independent academia: we can speak up in spite of corporate or government interests.”

Source: “Facebook data policy 2011 vs. 2012,” Kashmir Hill, July 1, 2014.
Why you should read this: “Informed consent” can be a matter of timing when it comes to experiments like this one. Forbes reporter Kashmir Hill notes that “research” wasn’t even listed in the Facebook user agreement until four months after this particular emotion manipulation study was conducted. Read her article in Forbes first, then check out the redlined Facebook data policy changes for yourself.

Source: “The Facebook Emotional Manipulation Study: Sources,” James Grimmelmann, professor of law at the University of Maryland, June 30, 2014.
Why you should read this: If you’re ready to go even further down the research rabbit hole, The Laboratorium is your next stop. Grimmelmann provides dozens of resources for reporters and privacy nerds alike, including links to previous Facebook studies on voter turnout, the social structure of networks and even the effect of rainfall on emotional content.