Facebook's Secret Experiment: Now Investors Can Study the 'Emotional Contagion' of Consumer Backlash

The company did not seek consent before manipulating the emotions of hundreds of thousands of users.

Did you volunteer to be a guinea pig for the Facebook (NASDAQ:FB) research study designed to determine whether negative posts on your News Feed can make your spirits plummet and cause you to post negative comments to all of your friends, possibly getting them bummed out as well?
 
Me neither.
 
And yet, we may have been among 700,000 Facebook users who unwittingly participated in a scientific experiment in "emotional contagion."
 
The News Feeds of these 700,000 randomly selected Facebook users were manipulated to prioritize the display of posts that used negative, positive, or neutral words. Facebook-affiliated researchers at Cornell University and the University of California at San Francisco then studied the posts subsequently made by the test subjects to see whether their mood was affected, and for how long.
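To make the mechanics concrete, here is a toy sketch, in Python, of how a feed might be filtered by emotional content. The word lists, function names, and 50% omission rate are hypothetical assumptions for illustration only; the published study classified posts with the LIWC word-count software, and this is not Facebook's code.

import random

# Toy word lists -- hypothetical stand-ins for the LIWC dictionaries the study used.
POSITIVE_WORDS = {"happy", "love", "great", "wonderful", "excited"}
NEGATIVE_WORDS = {"sad", "angry", "terrible", "awful", "worried"}

def classify(post):
    # Label a post by whether it contains any emotion words.
    words = set(post.lower().split())
    if words & NEGATIVE_WORDS:
        return "negative"
    if words & POSITIVE_WORDS:
        return "positive"
    return "neutral"

def filtered_feed(posts, suppress, omit_prob=0.5):
    # Randomly drop a fraction of posts carrying the suppressed emotion,
    # leaving all other posts untouched.
    return [p for p in posts
            if classify(p) != suppress or random.random() > omit_prob]

feed = ["I love this great sunny day",
        "Feeling sad and worried today",
        "Meeting moved to 3pm"]
print(filtered_feed(feed, suppress="negative"))

Run repeatedly, the sad post shows up in only about half of the generated feeds. That, in essence, is the manipulation the researchers then correlated with what the affected users went on to post themselves.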
 
The study results indicate that, yes, Facebook can yank your chain remotely, any time it wants.
 
This happened for one week back in 2012, but only became publicly known when the results appeared online in an article titled "Experimental evidence of massive-scale emotional contagion through social networks" in the journal Proceedings of the National Academy of Sciences.
 
The article concludes that, even in the absence of face-to-face contact, "emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness."
 
Slate.com and The Atlantic's website broke the story over the weekend.
 
So, Facebook seems to have found a new use for the algorithm it uses to show viewers the content they may find most interesting and engaging.
 
The incident is particularly striking because Facebook did not just hand over a batch of data it had collected on its users. It does that all the time, as do many other web companies. It secretly changed the service it provides to users in order to test their responses.
 
As the story blew up over the weekend, Facebook management apparently couldn't understand what all the fuss was about. Its blog insists that the study was just part of its ongoing mission of "understanding how people respond to different types of content, whether it's positive or negative in tone, news from friends, or information from pages they follow."
 
And, anyway, the company's post said, the study was vetted through its "strong internal review process."

Great. So, maybe Facebook can start programming its News Feed like Muzak, moving from frisky beats to soothing melodies depending on how the company judges we should be feeling at this hour.
 
Some of the baby geniuses at Facebook seem to have grasped that they have truly stepped in it this time. One of the paper's authors posted an apology of sorts on his Facebook page, in which he said that the paper should have made it clear that the company only did the research because "we care about the emotional impact of Facebook and the people who use our product."
 
Yeah. They care so much that, as Slate.com notes, "Facebook intentionally made thousands upon thousands of people sad." And, it did so without requesting or receiving "informed consent" from its test subjects, a standard requirement for social scientists who want to go messing with people's heads.
 
Facebook asserts that any permission it needed is covered by the language of its standard terms of agreement, in which users agree to the use of their data for analysis, testing and research.
 
So, does this kind of publicity hurt Facebook at all long-term? It depends on how people around the world react to headlines like this one, from the British tech site The Register: "Facebook let shrinks MESS WITH YOUR HEAD, sans permission."
 
In the meantime, perhaps Facebook's ace researchers would like to analyze another outbreak of emotional contagion: the fury of Facebook users. Test samples can be found on Twitter (NYSE:TWTR) under #FacebookExperiment.

No positions in stocks mentioned.
