We already know that Facebook’s data-mining capabilities are, for better and for worse, extremely powerful.
On the dark side of Facebook data mining, we’ve seen how the network has used uploaded contacts to create shadow profiles of users and non-users alike.
On the positive side of its data-mining abilities, we’ve also seen how Facebook can automatically analyze relationships and chat content to catch would-be child molesters.
Now, Facebook is joining with researchers to put its data mining to use for good in a new way: to figure out whether it’s possible to discern suicidal thoughts by sifting through the social media streams and risk factors of volunteers.
The project will collect mobile data – including user location and anonymized text-messaging content – as well as social networking profiles such as those on Facebook, Twitter and LinkedIn.
The information will be stored in the Geisel School of Medicine at Dartmouth’s onsite database. Sharing of the information will be strictly forbidden by the study’s medical protocol and will be safeguarded by HIPAA standards of medical privacy.
Paul Thompson, a co-investigator on the study and an instructor at the Geisel School of Medicine at Dartmouth, says that the team has created a secure data-storage environment behind the medical school’s IT firewall to ensure participant privacy.
The volunteers who will upload the data are military veterans – a group with a “disturbingly” high suicide rate, as the researchers at The Durkheim Project noted when they announced the project on Tuesday.
In February, the US Department of Veterans Affairs released a study [summary PDF] that found a daily rate of between 18 and 22 veteran suicides.
The Durkheim Project is named after French sociologist Émile Durkheim, who wrote “Le Suicide” – a seminal work in the field of sociology.
Durkheim divided suicides into distinct types that correlated with factors such as whether the victims were male (more likely), married (less likely) or childless (more likely).
The Durkheim Project team is basing its work on a prior investigation, led by one of its team members, which showed that its text-mining methods predicted suicidality – i.e., suicidal ideation and suicide-related behaviors, including completed suicide – with statistically significant prediction accuracy of 65% or more.
The project will rely on applications on Facebook, iPhones and Android devices that will feed content from the online activity of its volunteer veterans into an integrated medical database.
That data will be analyzed by artificial-intelligence systems, with predictive-analytics applications providing real-time monitoring for text content and behavioral patterns statistically correlated with tendencies toward harmful behaviors, including suicide.
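The project hasn’t published its models, so purely as an illustration of the general shape of this kind of text monitoring, here is a toy sketch. Every phrase, weight and threshold below is a made-up placeholder, not anything the Durkheim Project actually uses:

```python
# Illustrative sketch only: a toy keyword-weighted text scorer in the
# general spirit of risk-factor text mining. The lexicon, weights and
# threshold are invented placeholders for demonstration purposes.

import re

# Hypothetical lexicon: phrases imagined to correlate with elevated
# risk in some training corpus, each with an invented weight.
RISK_LEXICON = {
    "hopeless": 2.0,
    "burden": 1.5,
    "no way out": 3.0,
    "alone": 1.0,
}

def risk_score(text: str) -> float:
    """Sum lexicon weights over every occurrence of each risk phrase."""
    lowered = text.lower()
    score = 0.0
    for phrase, weight in RISK_LEXICON.items():
        score += weight * len(re.findall(re.escape(phrase), lowered))
    return score

def flag(text: str, threshold: float = 3.0) -> bool:
    """Flag a post for review if its cumulative score crosses a threshold."""
    return risk_score(text) >= threshold

print(flag("Feeling fine, busy week ahead"))      # False
print(flag("I feel hopeless and like a burden"))  # True
```

A real system would use trained statistical models over many features rather than a hand-built word list, but the pipeline is the same in outline: score incoming text, compare against a risk threshold, and surface flagged items for analysis.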
This early stage won’t include intervention. The researchers aren’t empowered to intervene if suicide or self-harm is flagged.
In fact, during this initial version – the beta – collected data will only be seen by volunteers and the automated statistical engine, which is focused only on refining future algorithms for risk-factor analysis, the team says.
As far as privacy goes, those who use the mobile app can opt in or out of any or all social networks that the study will be monitoring.
Chris Poulin, the Durkheim Project’s Principal Investigator, said that that’s the only way the project can gain its participants’ trust:
"Ensuring data security and confidentiality is essential for building and maintaining trust with our study participants."
At this early stage, the researchers are hoping that the text mining project will help clinicians to better understand mental health risk factors for suicide by detecting subtle changes in a population’s aggregate psychological risk indicators that might otherwise be undetectable.
Poulin told Ars Technica’s Casey Johnston that the researchers haven’t yet developed a way to mesh the data they collect with actual instances of suicide:
"We still need to get extensions to our authorized medical protocol to do this. … How do you ask for this 'date of death' information in such a way as to not be insensitive?"
The team intends to develop consent forms that will enable people in participants’ networks to report a death.
The project’s database will also cross-reference social media data with suicide risk factors – including concussions, post-traumatic stress, number of military deployments served, and family stresses – any of which may contribute to what the team calls the disturbingly high suicide rates among active-duty military personnel and veterans.
Facebook’s Joel Kaplan, US VP of Public Policy and himself a veteran, said the company is hoping this will help to create tools to better keep its users safe:
"At Facebook, we have a unique opportunity to provide the right resources to our users in distress, when and where they need them most. We are proud to be partnering with the Department of Veterans Affairs research on the Durkheim Project, so we can bring a better understanding to this important issue and equip those that use our service with even better tools to keep them safe. Through a concerted and coordinated effort on the part of private industry, government, and concerned family and friends, we believe we can make a real difference in preventing suicide and saving lives."
The Durkheim Project is welcoming inquiries. Find out more at www.durkheimproject.org.
Let’s hope that those on The Durkheim Project do bolt this data down as securely as its volunteers deserve.
The volunteers have served their country already. Now, they’re stepping forward to do so in a unique effort that could help potential suicide victims.
Thank you all, both researchers and volunteers, both veterans and Facebook data-mining engineers.