Facebook’s study may improve the user experience, but it also breaches research ethics
In case you missed it over the weekend, Facebook has admitted to conducting a psychological experiment on 689,000 of its users to assess how they reacted to having their news feeds manipulated.
The study, conducted in partnership with Cornell University and the University of California, San Francisco, attempted to assess whether “exposure to emotions led people to change their own posting behaviours.”
The experiment, which took place in 2012, has understandably caused distress and outrage, chiefly because of its cloak-and-dagger nature. Facebook ploughed ahead with the research without any informed consent – a pretty important requirement for any psychological study.
Yet as far as the social network is concerned, users did sign up – by ticking the terms and conditions box when they registered for their accounts. Facebook’s Data Use Policy, last revised in November 2013 – here if you’ve got a spare hour or two – does mention that data may be used for research. But could you argue that this represents ‘informed consent’?
It’s worth asking here: do you even bother to read the terms and conditions when signing up for a service today? They all say the same thing, right?
Perhaps this will cause a sea change. As the headline of one Guardian article puts it: "This ought to be the final straw." "We should not tolerate it." Too much sitting on the fence from all concerned, not enough action.
Adam Kramer, a data scientist at Facebook, took to his own page to defend the study. “We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out,” he wrote. “At the same time, we were concerned that exposure to friends’ negativity might lead people to avoid visiting Facebook.”
Yet Kramer conceded the researchers didn’t clearly state their motivations in the paper.
“The goal of all of our research at Facebook is to learn how to provide a better service,” he wrote. “Having written and designed this experiment myself, I can tell you that our goal was never to upset anyone.”
He added: “I can understand why some people have concerns about it, and my co-authors and I are very sorry for the way the paper described the research and any anxiety it caused.
“In hindsight, the research benefits of the paper may not have justified all of this anxiety.”
Not every reaction was outraged, however; some commentators were rather more blasé:
"If you're upset about Facebook experimenting on people, you should just stop using the fucking Internet right now. Really, this is news?" — Startup L. Jackson (@StartupLJackson), June 29, 2014
The fact remains, however, that an experiment took place which knowingly made people feel worse about themselves, regardless of whether it improves the overall user experience or not. Facebook should really inform those who were unwittingly involved, to salvage some ethical credibility – but it has made enough of a rod for its own back as it is.