Facebook says there are a few things about its experiment on users’ emotional states that it “should have done differently.”
Like maybe receive informed consent from people before you modulate their newsfeeds so as to show them sadder/madder/gladder content in your efforts to determine if emotional states are contagious?
Well, no, not exactly.
In a blog post on Thursday, Facebook Chief Technology Officer Mike Schroepfer said that the crew was “unprepared” for the ruckus stirred up at the end of June about its emotional contagion research, that Facebook has taken the comments and criticism to heart, and that aspects of the research could have/should have been tweaked:
For example, we should have considered other non-experimental ways to do this research. The research would also have benefited from more extensive review by a wider and more senior group of people. Last, in releasing the study, we failed to communicate clearly why and how we did it.
For example, instead of controlling users’ expression of emotion by keeping positive or negative items out of newsfeeds, as Facebook did in the 2012 experiment that upset users and ethicists, it could have used a methodology presented in another recent research article to which Schroepfer linked.
In that article, the researchers describe a hands-off approach to studying emotional synchrony as it related to gloomy weather:
Instead of changing the user's emotion directly with an experimental treatment, we let rainfall do the work for us by measuring how much the rain-induced change in a user's expression predicts changes in the user's friends' expression.
Following the brouhaha, Facebook said it plans to change research methodology in these ways:
- Guidelines: Researchers are to be given clearer guidelines, and future studies dealing with “deeply personal” topics, such as looking at particular groups or people of a certain age, will go through an “enhanced review process” before research can begin.
- Review: Facebook’s created yet another cross-functional panel of people to weigh in on research, including senior researchers, engineers, lawyers, and the company’s privacy and policy teams.
- Training: Facebook has added research education to its six-week training bootcamp for new engineers, as well as adding it to the annual privacy and security training all Facebook staffers go through.
- Research website: Facebook’s academic research is now available in one location, here.
Was Facebook’s “we’ll do things differently” post an apology?
I hesitate to call it such, given that it didn’t address the aspect of the research that ethicists found most problematic: i.e., that Facebook’s experiment on nearly 700,000 users’ newsfeeds was done without asking those people if they wanted to be part of the study.
But it did acknowledge one thing that was sorely lacking in the 2012 study: transparency.
...we failed to communicate clearly why and how we did it.
However, not telling people why and how you did the research (after it happens) is not the same thing as asking for their informed consent.
What do you think: does Facebook’s apology work for you?
Composite image of lab equipment and sorry note courtesy of Shutterstock.
I am under the impression they are referring to ex post facto communication regarding their ‘study’… this could be an apology, yes, intended for their PR department. Definitely not intended for the user.
It’s okay. I sort of stopped using facebook. I have no use for a service that pulls this crap with its users.
“No use for” but you only “sort of stopped” 🙂
I no longer update a status, I don’t post or share photos. I don’t comment on or like posts by others, nor do I read posts by others. The only reason my account hasn’t been deleted is there are still some websites, tech sites too, that only allow commenting via the facebook plug-in – and I still have my troll accounts for those. So yeah, I sort of stopped using it. There is nothing meaningful or truthful in any account I have control of. I’m one of those problem facebook users with a dozen accounts, Adblock Plus, and who won’t use the mobile app.
“sort of stopped” might mean I don’t use it unless I am forced to if I wish to communicate with some organisation.
Some BBC programs only give facebook or twitter tags as means of communicating with them.
Some companies only allow you to apply for a vacancy via linkedin – I suspect some insist on you doing so via facebook. That way you can get your computer (in theory) to sift out those who enjoy drink, holidays and anything that might be illicit. They probably also think they can do some form of ranking about cultural fit from a facebook page!
I am retired so not only do I have “no use for” but I have never used the wretched network – but I resent its all pervasiveness.
Why do they need to do the experiments anyway? All they are doing is making excuses as to how they did it wrong. They will find another way to do their “research”.
I am not a lab rat. I use FaceBook and I do not consent to mind control experiments. A vaguely worded user agreement is not informed consent. Without consent it is not research; it is a mind-control deliverable for someone’s propaganda campaign.
This is the problem with mega-corporations like Facebook and Google: they will do what they want and ask forgiveness later.
They live by the motto “it’s easier to ask forgiveness than it is to get permission” – Grace Hopper
Informed consent and use of an IRB are not optional for this sort of research; they are federally mandated. This half-assed apology is not nearly enough. There should be federal charges filed and compensation for people who were manipulated without their consent.
It seems there’s a really simple way to solve FB’s root problem (the “we need human lab rats” “problem”): Offer people an option to join in blind studies.
In other words, people sign up, and this allows FaceBook to secretly test them the way the researchers did. They don’t know WHEN it happens, but they agreed to it.
Naturally, users will need some kind of incentive to join such a program. Perhaps “for 1 month of testing, we’ll give you 6 months ad-free FB usage” or something like that. Or free usage of some paid content for a time.
It seems that Facebook has forgotten, or never heard, the term “mob mentality”. Of course emotional states are contagious. After all, that is the entire basis of “mob mentality”.
Why do they violate people’s privacy to prove something that we as a species have known for thousands of years? Take writing, for example: when you read a sad biography, your emotions are being influenced in exactly the same way.