AI doesn’t replace clinical judgment.
“AI can capture data that we miss due to the limits of our humanity,” psychologist Dr. Miller says. “There’s suicide prevention processes founded on big data and AI, and there are processes founded in clinical intuition and acumen.”
AI is only as good as the data it's trained on. If that data lacks diversity, it may miss warning signs in underrepresented groups. And risk factors identified in veterans may not apply to civilians.
Stopping suicidal thoughts
Google is putting AI to work against suicide, too. Its MUM (Multitask Unified Model) technology seeks to understand the intent behind what we google.
MUM helps power Google Search. It can often tell the difference between a search by someone researching suicide for a paper and a search by someone looking for how or where to carry out a suicide.
When Google Search detects that someone in the United States might be in crisis and at risk of suicide, the first search results that person gets are the number for the National Suicide Prevention Lifeline and other resources for people in crisis.
Google Assistant works the same way on Google Home devices. When a user's voice query signals a suicide-related crisis, the device serves up resources that offer help.
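To make that pipeline concrete, here is a minimal, hypothetical Python sketch. It is not Google's implementation: MUM is a large learned model, and the phrase lists, function names, and resource string below are invented stand-ins. The sketch only shows the shape of the behavior the article describes, classifying a query's intent and putting crisis resources ahead of ordinary results.

```python
# Illustrative sketch only. A real system uses a trained model (like MUM),
# not keyword rules; these cue lists and thresholds are invented placeholders.

CRISIS_RESOURCE = "National Suicide Prevention Lifeline"  # shown first in the US

# Hypothetical phrase lists standing in for a learned intent classifier.
INFORMATIONAL_CUES = ("statistics", "research", "history of", "paper on")
CRISIS_CUES = ("how to", "where to", "ways to")

def classify_intent(query: str) -> str:
    """Toy intent classifier: returns 'crisis', 'informational', or 'other'."""
    q = query.lower()
    if "suicide" not in q:
        return "other"
    if any(cue in q for cue in INFORMATIONAL_CUES):
        return "informational"
    if any(cue in q for cue in CRISIS_CUES):
        return "crisis"
    return "informational"

def rank_results(query: str, organic_results: list[str]) -> list[str]:
    """Put crisis resources ahead of organic results when risk is detected."""
    if classify_intent(query) == "crisis":
        return [CRISIS_RESOURCE, *organic_results]
    return organic_results

if __name__ == "__main__":
    results = ["result 1", "result 2"]
    print(rank_results("suicide statistics by state", results))     # unchanged
    print(rank_results("how to carry out a suicide", results))      # hotline first
```

In a production system, the intent decision comes from a model trained on many languages and phrasings, with human review of edge cases, which is exactly the nuance the article says MUM is being built to capture.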
Google is training MUM to understand the nuances of crisis language in 75 languages so that Google Search can offer people in crisis hotlines or other resources in many countries.
“We want to find partners that are accessible to users in terms of hours of operation. We have a strong preference for finding partners that promise confidentiality and privacy to the extent that those are permitted [in that country],” says Anne Merritt, MD, a product manager at Google Search.
Other companies are developing apps that use AI to spot suicide risk in other ways, including voice technology that may detect subtle changes in the voice of someone who's depressed and may be thinking of suicide. These tools are still in development but show promise. Keep in mind that such apps don't require government approval, so if you try one, be sure to let your health care provider know.
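As a rough illustration of what such voice analysis might look at, here is a short Python sketch using the open-source librosa audio library. The features, cutoffs, and function names are hypothetical and not clinically validated; real apps rely on trained models and much richer feature sets.

```python
# Illustrative sketch only: the thresholds and risk screen below are invented.
# It shows the kind of acoustic features voice-biomarker research examines,
# such as pitch variability (flat affect) and speech energy.

import numpy as np
import librosa

def voice_features(audio_path: str) -> dict:
    """Extract simple prosodic features often studied in depressed speech."""
    y, sr = librosa.load(audio_path, sr=None)        # load the recording
    f0 = librosa.yin(y, fmin=65, fmax=300, sr=sr)    # per-frame pitch estimate (Hz)
    rms = librosa.feature.rms(y=y)[0]                # per-frame loudness
    return {
        "pitch_variability": float(np.std(f0)),      # monotone speech -> low value
        "mean_energy": float(np.mean(rms)),          # low-energy speech -> low value
    }

def flag_for_followup(features: dict) -> bool:
    """Hypothetical screen: flat, low-energy speech triggers a check-in.

    These cutoffs are placeholders for illustration, not medical criteria.
    """
    return features["pitch_variability"] < 10.0 and features["mean_energy"] < 0.02
```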
Changing the channel
Seeing a hotline number on your phone or computer screen can help, Dan Miller says. “If I happened to be online, searching maybe for a bridge to jump off of ... and suddenly that pops up on the screen, it’s like it changes the channel.”
It may not work for everyone, he says, but that search result could interrupt someone’s suicidal train of thought.
That’s crucial, psychologist Dr. Miller says, because most suicide attempts escalate from first thought to potentially fatal action within an hour. That’s how fast it happened for Dan Miller in 2014.
“When you’re able to put time and space between the suicidal thought and the access to the method to act on that thought, you save lives,” Dr. Bernert says.
Making a different choice
An interruption in Mr. Miller’s thinking is what saved his life.
Holding the gun to his head, Mr. Miller glanced at the passenger seat, where a brochure from the Wounded Warrior Project lay; he had just learned about the organization. The brochure showed a photo of a man in a wheelchair, a veteran like him, who had no legs. Mr. Miller thought the man looked worse off than he was but hadn’t given up.
Mr. Miller put down his gun and decided to get help.
Recovering from a near attempt, he says, is a journey. It doesn’t happen overnight. Now, 8 years later, Mr. Miller is planning a brief break from the speaking circuit. He plans to spend 2 weeks in an outpatient counseling program for posttraumatic stress disorder and traumatic brain injury.
“Telling my story to strangers – part of it is healing me in a way, but I’m learning that repeating the story over and over again is also keeping me from letting it go. And I’m still healing.”
A version of this article first appeared on WebMD.com.