Since its founding in 2013, the free mental health support service Crisis Text Line has focused on using data and technology to better aid those who reach out for help.
Unlike helplines that offer assistance based on the order in which users dialed, texted, or messaged, Crisis Text Line has an algorithm that determines who is in most urgent need of counseling. The nonprofit is particularly interested in learning which emoji and words texters use when their suicide risk is high, so as to quickly connect them with a counselor. Crisis Text Line just released new insights about those patterns.
Based on its analysis of 129 million messages processed between 2013 and the end of 2019, the nonprofit found that conversations containing the pill emoji, 💊, were 4.4 times more likely to end in a life-threatening situation than conversations containing the word suicide.
Other words that indicate imminent danger include 800mg, acetaminophen, excedrin, and antifreeze; these are two to three times more likely than the word suicide to involve an active rescue of the texter. The loudly crying face emoji, or 😭, is similarly high-risk. In general, the words that trigger the greatest alarm suggest the texter has a method or plan to attempt suicide or may be in the process of taking their own life.
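A figure like "4.4 times more likely" is a relative risk: the rate of life-threatening outcomes among conversations containing one term, divided by the rate among conversations containing the baseline word. A minimal sketch of that arithmetic, using made-up counts (none of these numbers are Crisis Text Line's actual data):

```python
# Illustrative relative-risk calculation with hypothetical counts.
# "Risk" here = P(conversation ends in active rescue | term appears in it).

def relative_risk(term_rescues, term_total, base_rescues, base_total):
    """Ratio of a term's rescue rate to a baseline term's rescue rate."""
    term_rate = term_rescues / term_total
    base_rate = base_rescues / base_total
    return term_rate / base_rate

# Hypothetical: 88 of 2,000 pill-emoji conversations ended in active rescue,
# vs. 200 of 20,000 conversations containing the word "suicide".
print(round(relative_risk(88, 2000, 200, 20000), 1))  # → 4.4
```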
Crisis Text Line has a list of 100 terms that signal higher risk than the word suicide itself. Unexpected entries include vampire, which texters use to describe looking normal on the outside while feeling sick inside, or to say they’ve been called an “emotional vampire”; blvd, the abbreviation for boulevard, which sometimes appears when texters name a location where they’re at immediate risk of harm; and 11:11, a number with no clear pattern.
This isn’t Crisis Text Line’s first attempt to understand how people communicate via text when they’re suicidal. In 2017, the nonprofit used artificial intelligence to analyze 22 million messages and found that the word ibuprofen was 16 times more likely than the word suicide to predict that the person texting would need emergency services. Now that Crisis Text Line has far more messages to analyze than it did in 2017, ibuprofen remains high-risk but is no longer as predictive as the pill emoji.
Bob Filbin, chief data scientist at Crisis Text Line, is hopeful that artificial intelligence is sharpening the nonprofit’s ability to detect suicide risk faster and more accurately. When the algorithm flags a message as high-risk, the conversation is coded orange so counselors immediately know to ask whether the user has a plan or method in place, among other questions.
When Crisis Text Line set out to learn more about the content associated with suicide risk, it began in 2015 with a list of 50 words identified by academics as high-risk, checking whether texters had used the same words in past conversations. It then deployed an algorithm to find the words and emoji that appeared disproportionately in conversations that ended in an active rescue of the texter. That’s when ibuprofen emerged as a top candidate. (All of Crisis Text Line’s data is anonymized.)
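The ranking step described above can be sketched as counting how often each term appears in conversations that ended in an active rescue versus those that didn't, then sorting by the ratio. This is a toy illustration with hypothetical messages and add-one smoothing, not the nonprofit's actual pipeline:

```python
from collections import Counter

# Hypothetical mini-corpus: (message, ended_in_active_rescue) pairs.
conversations = [
    ("i took ibuprofen tonight", True),
    ("feeling sad about school", False),
    ("i have ibuprofen and a plan", True),
    ("just need someone to talk", False),
]

rescue_counts, other_counts = Counter(), Counter()
for text, rescued in conversations:
    counts = rescue_counts if rescued else other_counts
    counts.update(set(text.split()))  # count each term once per conversation

def rescue_ratio(term):
    """How much more often a term appears in rescue conversations (smoothed)."""
    return (rescue_counts[term] + 1) / (other_counts[term] + 1)

# Terms most associated with conversations that ended in active rescue.
ranked = sorted(rescue_counts, key=rescue_ratio, reverse=True)
print(ranked[0])  # in this toy corpus, "ibuprofen" ranks first
```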
The updated algorithm used to analyze the 129 million messages considered not only the most frequently used words, phrases, and emoji, but also looked at the context of the conversation. So if the phrase “kill myself” appears, for example, the algorithm is designed to consider whether the rest of the sentence further increases risk (I want to kill myself) or negates the risk (I don’t want to kill myself).
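The negation example above can be illustrated with a simple rule-based sketch. Crisis Text Line's actual system is a trained model, not hand-written rules like these; the function name and the three-word lookback window are assumptions made for illustration:

```python
# Toy sketch of negation-aware flagging (illustrative only; the real
# system uses a trained model over full conversation context).
NEGATIONS = ("don't", "do not", "never", "wouldn't", "won't")

def flag_high_risk(message, phrase="kill myself"):
    """Flag a message only if the phrase appears without a nearby negation."""
    msg = message.lower()
    if phrase not in msg:
        return False
    # Check the few words immediately before the phrase for a negation.
    prefix = msg.split(phrase)[0].split()[-3:]
    return not any(neg in " ".join(prefix) for neg in NEGATIONS)

print(flag_high_risk("I want to kill myself"))        # True  -> coded orange
print(flag_high_risk("I don't want to kill myself"))  # False
```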
Compared to the general population, Crisis Text Line users skew young, low-income, and high-risk, so the findings may not generalize to everyone, but the AI-generated insights are promising.
Lindsey C. McKernan, an assistant professor in the department of psychiatry and behavioral sciences at Vanderbilt University Medical Center who has written about using artificial intelligence in suicide prevention, said in an email that Crisis Text Line’s findings could be helpful.
“New research on texting’s role in suicide prediction has the potential to provide us another window or ‘sign’ to attune to as a family member, friend, or clinician interacting with someone under these circumstances,” McKernan wrote. (Common warning signs include giving away possessions or having sleep and mood changes.)
Americans are suffering and we know why. With over 129 million text messages, we created Everybody Hurts: The State of Mental Health to bust myths and set facts straight about mental health in the US. See what’s causing crisis in your state: https://t.co/ifTs6Hj0Mj
— Crisis Text Line (@CrisisTextLine) February 10, 2020
But it’s important to remember that algorithms can sometimes be wrong, inaccurately classifying someone as high-risk or flagging someone as low-risk when they’re not, McKernan said.
“Examining texting patterns could give us one more piece of information to inform suicide risk predictions and a prevention strategy for de-escalation, particularly for younger individuals at risk of suicide,” she wrote.
Filbin said texting allows people, especially younger users, a way to be vulnerable during difficult moments.
“Part of the reason why we see these words is because text is a digital medium where people end up being particularly honest, and honest faster,” said Filbin. “Texting can be uniquely powerful for young people to talk about their crises.”
In turn, artificial intelligence helps Crisis Text Line better understand when those emergencies require immediate care and attention.
If you want to talk to someone or are experiencing suicidal thoughts, Crisis Text Line provides free, confidential support 24/7. Text CRISIS to 741741 to be connected to a crisis counselor. You can call the National Suicide Prevention Lifeline at 1-800-273-8255. Here is a list of international resources.