Did Facebook’s emotion experiment break the law? ICO probes

The UK’s Information Commissioner’s Office (ICO) is investigating whether Facebook’s emotional manipulation study broke data protection laws.

The regulator told the Financial Times that it planned to probe the Facebook experiment, which manipulated the news feeds of close to 700,000 users to determine how they reacted to positive or negative posts. The probe follows widespread public outrage after a paper describing the study was published over the weekend.

A spokesperson for the ICO told the paper that it would liaise with the office of the Irish Data Protection Commissioner (Facebook has its European headquarters in Dublin), but it was “too early to tell exactly what part of the law Facebook may have infringed.”

The Information Commissioner’s Office will likely look at how much user information was used and whether those affected had given permission for their data to be used in such a manner.

Yesterday, it was also reported that Facebook added a “research” clause to its terms and conditions four months after the experiment began.

That clause states that the company could use user data for “internal operations, including troubleshooting, data analysis, testing, research and service improvement.”

However, a spokeswoman for Facebook told Forbes:

When someone signs up for Facebook, we've always asked permission to use their information to provide and enhance the services we offer. To suggest we conducted any corporate research without permission is complete fiction.

Companies that want to improve their services use the information their customers provide, whether or not their privacy policy uses the word 'research'.

Adam Kramer, a co-author of the research paper, offered up his own explanation for the study on his personal Facebook page:

The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product. We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends' negativity might lead people to avoid visiting Facebook. We didn't clearly state our motivations in the paper.

Kramer said that lessons had been learned via an internal review process and offered an apology to anyone who may have become distressed over the way in which the research was conducted:

The goal of all of our research at Facebook is to learn how to provide a better service. Having written and designed this experiment myself, I can tell you that our goal was never to upset anyone. I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused. In hindsight, the research benefits of the paper may not have justified all of this anxiety.

As we reported yesterday, Facebook said there was “no unnecessary collection of people’s data” and none of the gathered data could be associated with a specific account on the social network.

Facebook spokesman Matt Steinfeld said:

We want to do better in the future and are improving our process based on this feedback. The study was done with appropriate protections for people's information and we are happy to answer any questions regulators may have.

Image of thumbs up hands in cuffs courtesy of Shutterstock.