Why scientists think your social media posts can help prevent suicide
Take a moment to look at your emoji keyboard. Scroll through the angry face, ghost, stiletto, doughnut, flashlight and cigarette until you reach the hearts.
There it is: love. Amid the mundane and humorous, those vibrant, colorful little shapes can easily become a rapid-fire display of affection to a friend, parent or partner. But notice, too, the broken and blue hearts, and their restrained reminders of sadness, loneliness or grief.
It turns out that these little characters are more important than we could imagine. Linguists, psychologists and computer scientists are discovering that what we collectively share on social media, and when, can signal information about our mental health. Some of these researchers believe machine learning, algorithms and mathematical analysis can give health care providers tools to help solve one of our most intractable public health epidemics: suicide.
In an underfunded field where saving every at-risk life can feel like an impossible goal, this technology promises to give public health experts and clinicians a novel tool to predict suicide risk. Given the rising rate of suicide in the United States, that kind of prevention can’t come quickly enough.
The clues in your tweets
Predicting suicide risk is exactly what Glen Coppersmith, a data scientist and psychologist, has set out to do. That mission, he says, is an urgent one.
Data released in April from the Centers for Disease Control and Prevention revealed an alarming rise in suicides among both genders and in every age group between 10 and 74. Between 1999 and 2014, the suicide rate in the U.S. increased 24%, to 13 per 100,000 people.
As founder and CEO of the startup mental health analytics company Qntfy, Coppersmith is using machine learning to design algorithms that identify trends in human communication. That concept is well-established science, and it’s one you rely on every time you misspell a word and know autocorrect will catch (or worsen) the mistake.
Coppersmith belongs to a small group of researchers who believe this technology can be put in the service of detecting a person’s risk of mental illness and suicidal behavior. In his most recent study, published this month at an annual meeting of the North American Chapter of the Association for Computational Linguistics, he and his co-authors estimated the emotional content of tweets from hundreds of users who had talked openly about a suicide attempt, and tweets from a control group that did not display suicidal thoughts or feelings.
“Each word you use shows a tiny fraction of a bit of a clue in terms of who you are.”
Using a model designed to predict suicide risk in the former group, the researchers found that their algorithm not only worked but picked up on surprising patterns, including evidence that those emoji hearts can say more than we’d casually expect.
While nearly everyone in their sample included emoji in their tweets, Coppersmith and his co-authors noticed that some in the group that talked about attempting suicide employed a narrow range of emoji representing sadness more frequently, compared to typical users of the same age and gender. They often preferred the blue and broken hearts, but using those characters alone, says Coppersmith, doesn’t indicate risk of suicide.
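The kind of comparison the researchers describe can be sketched in a few lines. This is a toy illustration only: the tweets and the emoji sets below are invented for the example and are not from the study, which used far more sophisticated models.

```python
from collections import Counter

# Hypothetical emoji sets -- a narrow range representing sadness versus
# everything else. These are illustrative, not the study's categories.
SAD_EMOJI = {"💔", "💙", "😢"}
OTHER_EMOJI = {"💛", "😂", "🎉"}

def sad_emoji_rate(tweets):
    """Fraction of a user's emoji that fall in the 'sad' set."""
    counts = Counter(ch for tweet in tweets for ch in tweet
                     if ch in SAD_EMOJI or ch in OTHER_EMOJI)
    total = sum(counts.values())
    sad = sum(counts[e] for e in SAD_EMOJI)
    return sad / total if total else 0.0

# Invented sample tweets for two hypothetical users.
at_risk_sample = ["so tired of this 💔", "💙 again today", "😢💔"]
control_sample = ["great show 🎉", "💛 you all", "😂😂"]

print(sad_emoji_rate(at_risk_sample))   # → 1.0 (all sad emoji)
print(sad_emoji_rate(control_sample))   # → 0.0 (none)
```

A real pipeline would compare these rates against baselines matched for age and gender, as the paper describes, rather than treating any single rate as meaningful on its own.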
The researchers also noticed an increase in messages conveying sadness prior to a suicide attempt, and then a rise in both sad and angry tweets around an attempt.
While Coppersmith’s study doesn’t include real tweets in order to protect users’ privacy, it does offer examples to demonstrate both anger (“I’m only good for being a verbal punching bag”) and sadness (“I’m totally pathetic even the scars from my attempts are pathetic”).
“It’s not like … if you say this phrase you’re clearly in trouble,” Coppersmith says of the algorithm. “It says, ‘Let me look at all of the language.’ Each word you use shows a tiny fraction of a bit of a clue in terms of who you are, what you’re thinking, whether or not you’re in some sort of emotional crisis.”
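The idea of summing many tiny per-word clues into one overall signal is the core of simple text classifiers. Here is a minimal sketch of that aggregation; the word weights are invented for illustration and bear no relation to the study’s actual model.

```python
import math

# Invented log-odds weights: positive values lean toward risk, negative
# away from it. A trained model would learn thousands of these.
WORD_LOG_ODDS = {
    "pathetic": 0.9,
    "alone": 0.6,
    "tired": 0.3,
    "happy": -0.7,
    "excited": -0.8,
}

def risk_score(posts):
    """Sum tiny per-word clues across ALL of a user's language."""
    score = 0.0
    for post in posts:
        for word in post.lower().split():
            score += WORD_LOG_ODDS.get(word, 0.0)  # unknown words are neutral
    # Squash the summed evidence into a 0-1, probability-like value.
    return 1 / (1 + math.exp(-score))

print(risk_score(["so tired and alone"]))       # leans above 0.5
print(risk_score(["happy and excited today"]))  # leans below 0.5
```

No single word decides the output; the score shifts only as small contributions accumulate across everything a person writes, which is exactly the point Coppersmith is making.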
Coppersmith believes gleaning such information from social media, as well as other data generated by mobile devices, could be vital to a psychologist looking for subtle hints about a patient’s risk of suicide.
Janet Schnell, who lost her brother to suicide in 1995, has worked in the prevention field for two decades and believes this new approach may succeed in pinpointing risk where other efforts have failed.
“I was so grateful as a loss survivor that there was a new window that could be opened.”
“Part of the grieving process is that loss survivors go back, reexamine, and look and look to see what they missed,” says Schnell, who is the loss division chair at the American Association of Suicidology.
Though her brother passed away before the advent of social media, Schnell knows how powerful it would be if researchers could use that data to help explain what often feels like a tragic mystery.
“I was so grateful as a loss survivor that there was a new window that could be opened,” she says.
Despite recent advances in this research, it will still take partnership between data scientists and health care professionals, funding, and at minimum a few years to produce research reliable enough for psychologists to use in their practice.
For Coppersmith’s research to reach that point, he needs many more insights like the ones he’s already produced, and deeper analysis of those conclusions.
That’s why he’s trying to collect data from at least tens of thousands of people, and he’s hoping you’ll consider donating yours.
Donating your data
In April, Qntfy launched a study called OurDataHelps, which asks people to volunteer social media data from Twitter, Facebook, Instagram, Reddit and Tumblr. Coppersmith is also asking for Fitbit, Jawbone and Runkeeper data, which should provide insight about people’s physical patterns and habits.
Volunteers answer a few questions about their mental health and suicide history; the study invites both people who have and have not experienced mental illness to participate. They give Qntfy permission to anonymize and track their posts through open authentication, a common tool that allows third parties to view your account without requiring a password. (If you’ve ever synced an app like Candy Crush with your Facebook account, you’ve used open authentication.)
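In principle, open authentication works like this: the user approves the app once, the platform hands back a scoped, revocable access token, and the app presents that token instead of a password on every request. A minimal sketch, with all names invented for illustration (this is not Qntfy’s or any platform’s real API):

```python
def build_request_headers(access_token):
    """OAuth-style access: the app never sees the password, only a token.

    Revoking the token cuts off the app's access without the user ever
    having to change their password.
    """
    return {"Authorization": f"Bearer {access_token}"}

# Hypothetical token the platform issued after the user clicked "allow",
# scoped to read-only access to their posts.
token = "example-scoped-read-only-token"
print(build_request_headers(token))
```

That separation, a revocable token rather than shared credentials, is what lets volunteers grant read access to their posts without handing over control of their accounts.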
While the goal is to produce data and research that lead to new treatments for mental health conditions, Coppersmith wants to ultimately save lives by understanding how a person behaves in the days, weeks and months before becoming suicidal, and intervene before they have a chance to attempt suicide. Social media and mobile data provide a new opportunity to make that a reality.
April Foreman, suicide prevention coordinator at the Southeast Louisiana Veterans Health Care System in Baton Rouge, Louisiana, has been involved with OurDataHelps from its inception as an adviser. Describing the project to suicide survivors, she says, prompts a heartbreaking response.
“The look on their faces … it’s that moment [when a person recognizes] that out of great tragedy you can salvage something,” she says. “There’s some precious thing left that makes a difference; a small good that can be pulled out of this horrible tragedy.”
That tragedy is often traced back to risk factors like a family history of suicide, alcohol or drug addiction, and feelings of hopelessness. But patients with similar life stories routinely have different outcomes, and researchers have long struggled to understand the life-saving difference.
Studies conducted in the past few decades have generated important findings, but experts in the field say that prevention information is frequently vague, outdated or based on ineffective research methods.
Scientists studying suicide may follow the same variables in people’s lives over a long period of time. They’ve even analyzed personal journals, tallying how many times certain words appear in order to understand people’s minds in the fog of depression and suicidal thinking. What they haven’t been able to do, however, is perform that analysis on a large scale and quickly gather information about people’s everyday experiences.
The beauty of the algorithms behind OurDataHelps, Foreman says, is that they can do what she can’t: count.
Foreman regularly looks at the medical records of 80 to 120 high-risk veterans, most of whom won’t die by suicide. With training and expertise, she knows how to identify patients with serious warning signs, but she also understands the limitations of the human brain.
It can’t, for instance, track and analyze speech patterns in real time, detect a statistically significant pattern and link it to patients more likely to attempt suicide.
Building the right systems
Kristy Hollingshead, who has co-authored studies with Coppersmith that detect and analyze post-traumatic stress, depression and ADHD on social media, says online communication contains a “very strong, consistent, dependable signal” about a person’s mental health.
Machine learning technology allows Hollingshead, a research scientist at the Florida Institute for Human & Machine Cognition, to capture and analyze thousands of public posts from at least several hundred users. In aggregate, those posts can easily become hundreds of thousands of lines of text, which she couldn’t possibly scroll through and code manually.
Despite the efficiency of outsourcing these tasks to a computer, the systems Hollingshead and her colleagues are building aren’t as accurate as they’d like them to be. They’re making educated guesses about their data, estimating, for example, user gender and age. Using samples from small groups of users can also introduce bias.
“Language is such a good signal of what’s going on in your brain.”
Hollingshead believes that OurDataHelps can create a gold standard by offering scientists a comprehensive and tested dataset to inform their models and algorithms.
With that kind of accuracy, Hollingshead envisions a world in which mental well-being becomes part of the quantified self: someone who has suicidal feelings, or has attempted suicide in the past, could use software or an app to run an algorithm on their social media feeds, flagging imminent risk they might not even sense themselves.
“Language is such a good signal of what’s going on in your brain,” she says. “Even if you tried to change it so you seemed happy, the algorithms can still pick up on the underlying currents of what’s going on.”
The path to suicide is not straight
The Department of Defense recently released three studies that show just how promising statistical analysis can be when applied unconventionally to the problem of predicting suicide.
The research, conducted by Craig Bryan, a prominent suicide-prevention researcher, executive director of the National Center for Veterans Studies at The University of Utah and co-investigator on the OurDataHelps initiative, used a mathematical concept known as dynamic systems modeling.
Instead of looking at the classic single data point (when a person died by suicide), the researchers were able to incorporate real-time mood and language prior to their subjects’ deaths by scouring publicly available social media posts.
The work began in 2013 when Bryan and his co-authors set out to understand whether a service member’s social media data held clues about their suicide risk. In particular, the researchers analyzed Facebook posts from a sample of 700 service members who died by suicide and 700 service members who died from other causes during a two-year period.
They not only confirmed their original hypothesis, but retrospectively predicted a person’s suicide and discovered patterns that challenge our assumptions about how suicide unfolds.
Bryan’s research reveals that the pathway to suicide is not linear, contrary to the popular belief that such thoughts and behavior become an unyielding downward spiral. Instead, his study found important fluctuations in each service member’s path to suicide.
“What we did in the beginning was look at it like a straight line.”
Those who ended their lives were more likely to post about a stressful life event and then share negative emotions and beliefs; the sequence of those posts was reversed for those in the second group.
Until three months before their deaths by suicide, the subjects tended to post about physical symptoms, emotions and thoughts in close proximity. At the three-month mark, that trend disappeared.
Then, in the month before dying, they often posted about “maladaptive behavior,” like social withdrawal and alcohol use, while updates describing their worldview and self-perceptions became more stable.
“What we did in the beginning was look at it like a straight line,” Bryan says. “But thinking in ups and downs, like a roller coaster, with much more complex interactions … that significantly improves our ability to detect suicidal individuals, and it provides a possible way to estimate: when are they going to die?”
Skepticism and hope
Sheila Hamilton’s husband died by suicide 10 years ago, and she hopes research like Bryan’s succeeds. And yet she remains skeptical that data will deliver us from this epidemic.
Hamilton, an advocate and author of All the Things We Never Knew: Chasing the Chaos of Mental Illness, wants the public to hear survivors’ stories about both losing someone to suicide and surviving an attempt. Data, she says, isn’t a history of childhood trauma or a story about how the health care system fails its most vulnerable patients. Data can’t conquer the stigma that keeps people from reaching out for help and treatment.
You are never alone. We are here to help 24/7. Call 1-800-273-TALK (8255) if you’re thinking about suicide.
Lifeline (@800273TALK) June 5, 2016
Some also worry that the kind of technology Coppersmith and his peers are pursuing constitutes a significant invasion of privacy. In 2014, the UK suicide prevention nonprofit Samaritans launched a Twitter app called Radar that used an algorithm to search for high-risk keywords and then emailed the user with resources for help. Within 10 days of its debut, outrage over privacy concerns led Samaritans to suspend the app.
“More tools will be available to help you figure out what is driving this pain, to find ways out of it.”
The best solution for now, Coppersmith believes, is to only analyze data from individuals who opt in to a service. The most vulnerable might not choose to participate, but an opt-in safeguard would protect a person’s civil rights and prevent cases of abuse, like embarrassment or harassment at the hands of a friend who purportedly cares about your mental health.
Despite these challenges, Coppersmith remains hopeful. He believes that digital data can be empowering when used to detect the suffering embedded in our everyday communication; that information has the potential to set people on a path to healing with the right support and treatment.
Coppersmith envisions a future in which people in the throes of despair don’t feel as helpless.
“More tools will be available to help you figure out what is driving this pain, to find ways out of it,” he says, “and steer you out of it in the future.”
If you want to talk to someone or are experiencing suicidal thoughts, text the Crisis Text Line at 741-741 or call the National Suicide Prevention Lifeline at 1-800-273-8255. Here is a list of international resources.