Did Facebook's emotion experiment break the law? ICO probes

The UK's Information Commissioner's Office (ICO) is investigating whether Facebook's emotional manipulation study broke data protection laws.

The regulator told The Financial Times that it plans to probe the experiment, in which Facebook manipulated the news feeds of nearly 700,000 users to see how they reacted to predominantly positive or negative content. The investigation follows widespread public outrage after a paper describing the study was published over the weekend.

A spokesperson for the ICO told the paper that it would liaise with the office of the Irish Data Protection Commissioner (Facebook has its European headquarters in Dublin), but it was "too early to tell exactly what part of the law Facebook may have infringed."

The ICO is likely to examine how much user information was involved and whether those affected had given permission for their data to be used in this way.

Yesterday, it was also reported that Facebook added a "research" clause to its terms and conditions four months after the experiment began.

That clause states that the company could use user data for "internal operations, including troubleshooting, data analysis, testing, research and service improvement."

However, a spokeswoman for Facebook told Forbes:

When someone signs up for Facebook, we've always asked permission to use their information to provide and enhance the services we offer. To suggest we conducted any corporate research without permission is complete fiction.

Companies that want to improve their services use the information their customers provide, whether or not their privacy policy uses the word 'research' or not.

Adam Kramer, a co-author of the research paper, offered up his own explanation for the study on his personal Facebook page:

The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product. We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends' negativity might lead people to avoid visiting Facebook. We didn't clearly state our motivations in the paper.

Kramer said lessons had been learned through an internal review process and offered an apology to anyone distressed by the way the research was conducted:

The goal of all of our research at Facebook is to learn how to provide a better service. Having written and designed this experiment myself, I can tell you that our goal was never to upset anyone. I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused. In hindsight, the research benefits of the paper may not have justified all of this anxiety.

As we reported yesterday, Facebook said there was "no unnecessary collection of people's data" and none of the gathered data could be associated with a specific account on the social network.

Facebook spokesman Matt Steinfeld said:

We want to do better in the future and are improving our process based on this feedback. The study was done with appropriate protections for people's information and we are happy to answer any questions regulators may have.

Image of thumbs up hands in cuffs courtesy of Shutterstock.

14 Responses to Did Facebook's emotion experiment break the law? ICO probes

  1. Shiny · 20 days ago

    "It's a test designed to provoke an emotional response"

    Blade Runner may be great, but it's a worrying trend when companies start turning dystopian science fiction into science reality.

  2. DW · 19 days ago

    What if someone was already at a tipping point, and seeing more negative posts due to this experiment pushed them over the edge? Mood/emotion is not something a social network gets to play with; that's reserved for movies, music, and other artwork.

  3. TonyG · 19 days ago

    Facebook's terms and conditions could be simplified from the tome they currently are to "we can do whatever we like with anything you post on Facebook; if you don't like it, tough - don't sign up"

  4. Bill · 19 days ago

    I'm at a loss to understand how this could possibly lead to providing a better service, unless their intention was to ban certain types of posts, like negative ones. It appears to be more of a test of how far they can manipulate before it causes anger.
    Let's see how much we can get away with!

    • Zig Pope · 19 days ago

      It is their PR spin. They have a tornado on their hands for this unethical move.

    • n0vice · 19 days ago

      Maybe they want to see what kind of content they can post to keep users on Facebook longer. If they can see what kind of messages 'addict' users, perhaps they can use that to keep them hooked.

  5. Freida Gray · 19 days ago

    When I signed up for Facebook it was with the understanding that _all_ of my friends' posts, negative as well as positive, would be seen. I didn't expect Facebook to decide which posts were seen. Since _nobody_ in the "experiment" knew their posts were being manipulated before the experiment, and since the "research clause" wasn't added before the experiment, yes, Facebook broke their own Terms of Service that existed at that time. They could have monitored which types of post got the most likes/comments and maybe gotten better results that way.

    • Alan · 19 days ago

      They've been doing this for a while with their "reach" program, keeping posts from you and others and randomly selecting who sees what pages post. It has been a big issue for quite some time, especially when it comes to trying to advertise through a page. Hence the "more people can see this if you pay us" thing.

  6. Vito DiLuminoso · 19 days ago

    Considering the unending stream of revelations about Facebook's abusive exploitation of users, it's difficult to understand the current outrage, which is just more of the same thing Zuckerberg & Company have always done.

    I don't condone their manipulative "research"...but then, I don't condone their relentless torrent of opt-out "feature" changes either, which is precisely why I terminated my account several years ago.

    Facebook users have the power to end the abuse. All they have to do is stop using Facebook. That's unlikely to happen, and Facebook will continue to milk them. They have no one to blame but themselves.

  7. Richard W. Born · 19 days ago

    This "experiment" reminds me of that "bridge scandal" with lane closures for Port Lee. As soon as they have to explain in detail what it was for, it will turn out to be nothing but embarrassing.

  8. Jack

    Even if found not responsible, they seem to have violated what people thought they had agreed to. If some of the good things were not being received by me or others, then it's censorship (or manipulation), even if for research or "to make a better Facebook".

    I don't think they violated the law, and the premise that they can do what they want to make a better business model will always be questioned. I'm afraid that if you don't want this, then don't use Facebook. I don't pay any attention to them and my data is already skewed!

  9. Reader · 19 days ago

    Did Facebook's emotion experiment break the law? Possibly. Might depend upon which country you're talking about.

    From Firedoglake:

    Given that the research was published and received federal funding, it may have broken laws concerning informed consent. The government in the UK is reportedly looking into whether the experiment violated UK data-protection laws.

    and

    The highly controversial and possibly illegal Facebook experiment on mood manipulation is reportedly connected to the Department of Defense's Minerva Initiative. The Minerva Initiative tries to model tipping points for social unrest and is funded directly by the Pentagon as well as indirectly through the National Science Foundation. The project is supposed to help improve relations with the Department of Defense as well as help military planning.

    One of the researchers involved with the Facebook experiment was Jeffrey Hancock of Cornell University, who is funded through the Minerva Initiative to study the contagion of ideas ...

    Source: Facebook Mood Manipulation Experiment Connected To Department Of Defense, by DS Wright, Thursday, July 3, 2014.

    • Paul Ducklin · 18 days ago

      Pesky DoD, getting involved in network research! DARPA should keep away from computer stuff...it might come up with the idea for something really suspicious...like that sinister ARPANET idea it played around with in the 1960s :-)

About the author

Lee Munson is the founder of Security FAQs, a social media manager with BH Consulting and a blogger with a huge passion for information security.