Can Facebook updates predict suicide?


We already know that Facebook's data-mining capabilities are, for better and for worse, extremely powerful.

On the dark side of Facebook data mining, we've seen how the network has used uploaded contacts to create shadow profiles of users and non-users alike.

On the positive side of its data-mining abilities, we've also seen how Facebook can automatically analyze relationships and chat content to catch would-be child molesters.

Now, Facebook is joining with researchers to put its data mining to use for good in a new way: to figure out whether it's possible to discern suicidal thoughts by sifting through the social media streams and risk factors of volunteers.

The project will collect mobile data, including user location and anonymized text-messaging content, as well as social networking profiles such as those on Facebook, Twitter and LinkedIn.

Status update

The information will be stored in the Geisel School of Medicine at Dartmouth's onsite database. Sharing of the information will be strictly forbidden by the study's medical protocol and will be safeguarded by HIPAA standards of medical privacy.

Paul Thompson, a co-investigator on the study and an instructor at the Geisel School of Medicine at Dartmouth, says that the team has created a secure data-storage environment behind the medical school's IT firewall to ensure participant privacy.

The volunteers who will upload the data are military veterans - a group with a "disturbingly" high suicide rate, as the researchers at The Durkheim Project noted when they announced the project on Tuesday.

In February, the US Department of Veterans Affairs released a study [summary PDF] that found that between 18 and 22 veterans die by suicide each day.

The Durkheim Project is named after French sociologist Émile Durkheim, who wrote "Le Suicide" - a seminal work in the field of sociology.

Durkheim divided suicides into distinct types that correlated with factors such as whether the victims were male (more likely), married (less likely) or childless (more likely).

The Durkheim Project team is basing its work on a prior investigation, led by one of its members, which showed that their text-mining methods could predict suicidality - i.e., suicidal ideation and suicide-related behaviors, including completed suicide - with an accuracy of 65% or better.

The project will rely on applications on Facebook, iPhones and Android devices that will feed content from the online activity of its volunteer veterans into an integrated medical database.

That data will be analyzed by artificial intelligence systems, relying on predictive-analytics apps to provide real-time monitoring of text content and behavioral patterns statistically correlated with tendencies toward harmful behaviors, including suicide.
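The project hasn't published its models, but the broad shape of such real-time text monitoring can be sketched. The following is a minimal, entirely hypothetical Python illustration - the indicator words, weights, bias and threshold are all made up for the example - in which matched keywords are combined through a logistic function into a 0-1 score, and posts crossing the threshold are set aside for later review. A real system would learn its features and weights from clinician-labeled data, not a hand-written keyword list.

```python
import math

# Hypothetical risk-indicator weights; a real system would learn these
# from labeled clinical text, not hard-code them.
WEIGHTS = {
    "hopeless": 1.4,
    "burden": 1.1,
    "goodbye": 0.9,
    "alone": 0.6,
}
BIAS = -2.0  # keeps the baseline score low for neutral text

def risk_score(text):
    """Return a 0-1 score: logistic function over matched keyword weights."""
    tokens = text.lower().split()
    z = BIAS + sum(WEIGHTS.get(tok, 0.0) for tok in tokens)
    return 1.0 / (1.0 + math.exp(-z))

def flag(posts, threshold=0.5):
    """Collect (post, score) pairs whose score crosses the review threshold."""
    return [(p, risk_score(p)) for p in posts if risk_score(p) >= threshold]
```

In this sketch a neutral update like "had a great day" scores near the baseline, while a post containing several weighted indicators clears the threshold and is flagged - which mirrors the article's point that the statistical engine only surfaces correlations for later algorithm refinement, rather than triggering any intervention.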

This early stage won't include intervention. The researchers aren't empowered to intervene if suicide or self-harm is flagged.

In fact, during this initial version - the beta - collected data will only be seen by volunteers and the automated statistical engine, which is focused solely on refining future algorithms for risk-factor analysis, the team says.

As far as privacy goes, those who use the mobile app can opt in or out of any or all social networks that the study will be monitoring.

Chris Poulin, the Durkheim Project's Principal Investigator, said that that's the only way the project can gain its participants' trust:

"Ensuring data security and confidentiality is essential for building and maintaining trust with our study participants."

At this early stage, the researchers are hoping that the text mining project will help clinicians to better understand mental health risk factors for suicide by detecting subtle changes in a population's aggregate psychological risk indicators that might otherwise be undetectable.

Poulin told Ars Technica's Casey Johnston that the researchers haven't yet developed a way to mesh the data they collect with actual instances of suicide:

"We still need to get extensions to our authorized medical protocol to do this. … How do you ask for this 'date of death' information in such a way as to not be insensitive?"

The team intends to develop consent forms that will enable people in participants' networks to report a death.

The project's database will also be cross-referencing social media data with suicide risk factors, including concussions, post-traumatic stress, number of military deployments served, and family stresses, any of which may contribute to what the team calls the disturbingly high suicide rates among active-duty military personnel and veterans.

Facebook's Joel Kaplan, US VP of Public Policy and himself a veteran, said the company is hoping this will help to create tools to better keep its users safe:

"At Facebook, we have a unique opportunity to provide the right resources to our users in distress, when and where they need them most. We are proud to be partnering with the Department of Veterans Affairs research on the Durkheim Project, so we can bring a better understanding to this important issue and equip those that use our service with even better tools to keep them safe. Through a concerted and coordinated effort on the part of private industry, government, and concerned family and friends, we believe we can make a real difference in preventing suicide and saving lives."

The Durkheim Project is welcoming inquiries; more information is available on the project's website.

Let's hope that those on The Durkheim Project bolt this data down as securely as its volunteers deserve.

The volunteers have served their country already. Now, they're stepping forward to do so in a unique effort that could help potential suicide victims.

Thank you all, both researchers and volunteers, both veterans and Facebook data-mining engineers.

Image of people at computers courtesy of Shutterstock.



10 Responses to Can Facebook updates predict suicide?

  1. skakk4 · 830 days ago

    I don't understand why Naked Security has to make Facebook look bad. It's almost personal. Hate. But why? Because someone had a bad experience with Facebook years ago. I know once I was glad because I found Naked Security, and I read all the articles about Facebook and followed the drift. I was happy. But then, after reading every day about Facebook, scams, security, privacy, malware and so on, my mood changed from happy to someone who disliked being online. Always looking on the dark side of the internet. And from reading everything about Facebook I also ended up with something called Mask Me and DoNotTrackMe. I got rid of my profile picture and used a picture instead. Max privacy. I suddenly found myself in the position where I didn't know my own password. I looked at my Facebook profile picture and said no. This is not me. Naked Security wasn't a friend who would help you get wiser. I spent a lot of time on my computer to get everything back to normal and regain the optimism and happiness to be glad online again. I don't know why I bother writing this, because nobody is going to read it, and if they do, what is the point? Maybe it's just me who wants to tell my experience, if not to others then to myself. Maybe this is meant to comment on something else and then.............STOP. I rest my case and wish everybody a nice day online.

    • Nigel · 830 days ago

      Let me make sure I've got this straight now. You're saying NakedSecurity is the problem and Facebook is the solution? You're saying Facebook is "a friend who would help you get wiser"?


      Well, I guess it's in sort of the same sense that thieves and other knaves will "help" you learn (the hard way) that you should draw your curtains and lock your door at night.

      Nevertheless, anyone who's truly your friend would advise you to check that Kool-Aid you've been guzzling. Then ask yourself, "Who is part of the problem", and go take a good long look in the mirror.

  2. This suicidal situation is not just American but has been found in British troops.
    Considering the stress troops go through in a time of war, it is not surprising that someone becomes suicidal. When released into civvy street, any noise can trigger a memory of what has passed, leading to actions that appear abnormal. This is the price of war, and when politicians realise this, then maybe wars can become a thing of the past....

  3. Randy · 830 days ago

    Awwww, isn't that nice. Some "researchers" have found a way to legitimize the collection of citizens' most private information from various electronic media and run the data through AI/filtering software in order to predict the individuals' probable behavior.
    It's not enough that the government knows what its citizens are doing now and have done in the past; now they want to know what we will probably do before we do it. Suicide prevention is a very innocuous way to make this kind of data collection palatable to the public, but the software they use to analyze the data can also be used to predict things other than suicide.

    • Gary · 830 days ago

      'Minority Report' ring a bell with these sentiments? I was thinking the same thing as I read this article. How much of a leap is it from analyzing suicide probabilities to analyzing anything else about everyone who is active on the web?

      I have been told I use too many movie quotes, but it was put very succinctly in 'Jurassic Park': "...but your scientists were so preoccupied with whether or not they could that they didn't stop to think if they should." We have to start thinking more about whether we SHOULD do some of the things we do rather than whether we CAN do them.

  4. Jack Wilborn · 830 days ago

    I find this a little out of bounds. First, you want some person (who?) to evaluate data and make a judgment call, maybe for road rage! This may be fine for our (mistreated lowly) military, but I sure don't want some machine making a determination of whether I'm going to off myself! I can understand the study, but I can't see spending the money for something that, even if 100% accurate, would not be allowed to be acted upon - a complete waste.

    I have problems with people who don't want baseball bats to be sold deciding whether I can have one or not (this applies to other items too). This is really the limit of what we need to be doing. I'm sure someone will say "if it saves one person..." Well, this may very well do more damage than good.

  5. Lisa Vaas · 830 days ago

    This is not solely academic. Researchers aren't allowed to act on the data analysis at this beta stage of the study, true. They are hoping to eventually use their findings to present clinicians with tools to use for intervention, in whatever form that may take.

  6. Steven · 829 days ago

    What if I have nieces and nephews, or relatives and their kids, or friends' kids - can I then invite them to be included?

    Twice Facebook has refused to let me enter unless I explain why I pick my friends.

    If I picked several possible friends and I forget who they are, they expect me to remove the ones who did not accept - before I can enter the room.

    Another thing Facebook does is suggest friends of any age for me - whom I should or should not invite - when I am not interested at the moment.

  7. david · 829 days ago

    This seems to be written by a person who isn't in the security field, but an average journalist. There are so many wrong claims written in this article.

  8. skeptic · 823 days ago

    Isn't there a greater underlying problem if Facebook becomes the sole outlet or outlet of choice for someone on the brink of suicide? Having been at that brink myself before the age of social media, I can say in hindsight that what saved me was the tangible network of people who were physically present to offer moral and emotional support. By the time Facebook (or other social media) has become the forum for parting words, it is already too late... the social system has already failed them.


About the author

I've been writing about technology, careers, science and health since 1995. I rose to the lofty heights of Executive Editor for eWEEK, popped out with the 2008 crash, joined the freelancer economy, and am still writing for my beloved peeps at places like Sophos's Naked Security, CIO Mag, ComputerWorld, PC Mag, IT Expert Voice, Software Quality Connection, Time, and the US and British editions of HP's Input/Output. I respond to cash and spicy sites, so don't be shy.