Saturday, June 28, 2014

Facebook Data Scientists Manipulated News Feed To Perform A Psychology Experiment On Nearly 700,000 Users

More proof that Facebook isn’t free.



Albert Gea / Reuters


Aside from being the world's largest social network, Facebook is also a sociologist's dream. With 1.28 billion active users worldwide, the social network has created the most formidable data set ever assembled for studying human behavior.


Not one to let all that data go to waste, the company employs a team of data scientists to conduct experiments on user data and behavior, as it did in a recent study first reported by New Scientist.


According to the study, Facebook manipulated the News Feeds of 689,003 users to test whether online emotions can be contagious. For a week, some users were shown posts in News Feed containing a higher number of positive words, while others were shown posts with more negative sentiment. From the study:



When positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred. These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks.



This sort of social engineering is nothing new for Facebook, which has been using user data to conduct scientific studies for years now. Back in 2012, MIT Technology Review reported that Mark Zuckerberg himself was using the social network's influence to conduct personal experiments:



Influenced in part by conversations over dinner with his med-student girlfriend (now his wife), Zuckerberg decided that he should use social influence within Facebook to increase organ donor registrations. Users were given an opportunity to click a box on their Timeline pages to signal that they were registered donors, which triggered a notification to their friends. The new feature started a cascade of social pressure, and organ donor enrollment increased by a factor of 23 across 44 states.



Given that Facebook has spent the better part of 2014 making very public gestures to assure users it will protect their privacy, experiments like this one, which treat unwitting users and their data as test subjects, threaten to damage the social network's already shaky privacy reputation. And while Facebook maintains that its experiments are designed to gain insights that will ultimately improve users' experience on the network, the study, which openly admits it emotionally manipulated users, has already outraged privacy advocates and casual users alike.


While nobody likes being emotionally manipulated, part of the outrage seems to stem from the fact that Facebook is technically in the right here. When you sign up for Facebook, you are, in fact, consenting to have your data and profile used in these kinds of experiments. And as the study notes, because Facebook's data team used automated machine analysis, rather than human readers, to surface the positive and negative posts, the experiment didn't breach Facebook's privacy policy.
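For a rough sense of what that kind of machine analysis can look like, here is a minimal sketch of word-list sentiment tagging, the general approach the published study describes. The word lists and function below are illustrative placeholders, not Facebook's actual dictionaries or tooling.

```python
# Minimal sketch of word-list sentiment tagging. The word lists are tiny
# illustrative placeholders, not the dictionaries Facebook actually used.

POSITIVE_WORDS = {"happy", "great", "love", "wonderful", "excited"}
NEGATIVE_WORDS = {"sad", "awful", "hate", "terrible", "angry"}

def classify_post(text: str) -> str:
    """Label a post as 'positive', 'negative', or 'neutral' by counting
    how many of its words appear in each sentiment word list."""
    words = text.lower().split()
    pos = sum(1 for w in words if w in POSITIVE_WORDS)
    neg = sum(1 for w in words if w in NEGATIVE_WORDS)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"

if __name__ == "__main__":
    sample_posts = [
        "So happy about this wonderful weekend!",
        "Awful day, everything went terrible.",
        "Posting a photo of my lunch.",
    ]
    for post in sample_posts:
        print(classify_post(post), "->", post)
```

Because classification like this is fully automated, no researcher ever reads the posts themselves, which is the basis of the claim that the study stayed within Facebook's data-use policy.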


Though this sort of thing may be nothing new, it's an important reminder that just because you don't have to pay to use Facebook doesn't mean admission to the social network is free.





