How the VA uses algorithms to predict suicide

The Department of Veterans Affairs is using artificial intelligence to figure out which veterans are in critical need of mental health treatment as part of a massive effort to stem suicide in its ranks, a top priority of President Donald Trump and his VA leadership.

A computer program scours millions of records for medications, treatment, traumatic events, overall health and other information, and based on prior experience, it plucks out the names of veterans most likely to die by suicide in the next year. Clinicians then reach out to them directly, sometimes before the patient has expressed suicidal thoughts to anyone.

The VA believes that its algorithms have reduced suicides by vulnerable veterans. Since the VA adopted the technology in 2017, about 250 fewer veterans have died by suicide than would have been expected based on the previous rate, according to VA estimates. It's not clear how big a role the algorithms played in the apparent decline. A 2018 VA report shows veteran suicides decreased from 6,281 in 2015 to 6,079 in 2016.

Still, about 20 former and current service members die by suicide each day, about six of whom have been in VA health care, according to the department. The veteran suicide rate is about 22 percent higher than that of the general population.

As researchers wrestle with how to interpret and act on the risk predictions, they’re finding it isn’t easy to parse the factors leading to suicide, and worry they might be flagging the wrong patients, leading to special handling the veterans might not need or want. There’s also concern that being flagged as at risk of suicide could lead to discrimination against patients if leaked to employers or insurers.

The systems digging into patient records to examine the different elements that contribute to suicide risk sometimes turn up surprising leads. For example, there is evidence that patients are at higher risk of suicide in the three- to six-month window after starting an opioid prescription, and again in the three- to six-month window after they’re weaned off the pain pills.

VA officials say it’s not yet clear what to do with that information, but it could have profound significance as the country struggles with the opioid crisis. The VA began aggressively cutting back its patients’ opioid prescriptions about a decade ago, earlier than doctors in the rest of U.S. medicine.

The thought patterns leading to suicide aren’t well-understood, and suicidality can be a temporary psychological state that patients come in and out of, mental health experts say. That makes it hard for both humans and computers to predict.

Clinicians often flag patients at risk of suicide based on answers to questions like, “In the past few weeks, have you wished you were dead?” But some patients who talk openly about wanting to die never follow through, and others who answer the question negatively make an impulsive decision to take their lives, said Lisa Horowitz, a suicide prevention expert specializing in screening at the National Institute of Mental Health. “Every point on that continuum has been linked to completed suicide,” she said.

While recognizing the downside of an inaccurate algorithm, she said accurate predictions could help clinicians get around the fact that some patients hide suicidal thoughts from doctors, or may not even be aware of them.

At the VA, the predictive algorithms are an imperfect solution to the department’s inability to reach out to each of its 9 million patients each year. Sorting patients by risk helps clinicians focus their attention on the ones who may need it most, said Jodie Trafton, director of the VA’s Program Evaluation and Resource Center within the Office of Mental Health and Suicide Prevention.

The suicide prediction algorithm, called REACH VET, culls only the records of patients who are seen at a VA facility. The system has flagged about 31,000 patients for follow-up calls with clinicians since 2017; during these interventions, a clinician might prompt a patient to seek therapy or make a suicide prevention plan.

The VA’s project is of deep interest to suicide prevention specialists in community health care systems. More than 47,000 Americans died by suicide in 2017, according to the CDC’s most recent data; the rate increased about 2 percent a year between 2006 and 2017.

Veteran suicide has drawn particular interest from the Trump administration. In March, the president signed an executive order establishing a task force co-chaired by VA Secretary Robert Wilkie and including leaders from the White House, HHS, and the departments of Homeland Security, Defense and Energy. Wilkie and former VA Secretary David Shulkin have both called veteran suicide their top clinical priority. Rep. Seth Moulton, a Marine veteran running for president, has proposed annual mental health check-ins for veterans, including those returning from combat.

REACH VET pulls a list of the top 0.1 percent of patients at each facility whom it deems most likely to die by suicide in the next 12 months. According to the VA’s historical statistical models, those people are 40 times more likely than the typical VA patient to die by suicide; the top 0.01 percent are 140 times more likely.
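The VA hasn’t published the flagging step as code, but in rough terms it amounts to ranking each facility’s patients by a model’s predicted 12-month risk and pulling the top 0.1 percent for outreach. Here is a minimal sketch of that idea in Python; the names (`Patient`, `predicted_risk`, `flag_top_risk`) and toy data are hypothetical illustrations, not the VA’s actual system:

```python
# Illustrative sketch only: rank patients within each facility by a predicted
# 12-month risk score and flag the top 0.1 percent for clinician outreach.
from dataclasses import dataclass

@dataclass
class Patient:
    patient_id: str
    facility: str
    predicted_risk: float  # hypothetical score from an upstream statistical model

def flag_top_risk(patients, fraction=0.001):
    """Return the highest-risk `fraction` of patients within each facility."""
    by_facility = {}
    for p in patients:
        by_facility.setdefault(p.facility, []).append(p)

    flagged = []
    for facility_patients in by_facility.values():
        facility_patients.sort(key=lambda p: p.predicted_risk, reverse=True)
        n_flag = max(1, round(len(facility_patients) * fraction))  # at least one per site
        flagged.extend(facility_patients[:n_flag])
    return flagged

# Toy example with three patients; real facilities would have thousands.
patients = [
    Patient("A1", "Facility X", 0.031),
    Patient("A2", "Facility X", 0.004),
    Patient("B1", "Facility Y", 0.012),
]
for p in flag_top_risk(patients):
    print(f"Clinician follow-up: {p.patient_id} at {p.facility}")
```

Because the ranking is done per facility rather than against a single national cutoff, the absolute risk score needed to trigger a flag would vary from site to site; the VA describes the flagged group in relative terms, at roughly 40 times the typical patient’s risk.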

Clinicians are then instructed to call high-risk patients and offer to help them create a mental health care plan, spending about an hour with them. These patients are more likely to show up for appointments, have fewer mental health bed stays and have lower all-cause mortality than patients with similar characteristics did before REACH VET was implemented, Trafton says. Those patients also had five more days of outpatient mental health visits in the six months after they received the intervention.

The VA is still refining its predictive models. The first version of REACH VET assessed risk based on predictors like mental health diagnoses in the past year; the team is now investigating more sophisticated data points, like changes in dosage for medications, including opioids. It’s also using powerful Energy Department supercomputers to mine clinical notes that might mention personal issues, Trafton said. “Maybe they’re having relationship problems, or financial problems,” she said.
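As a loose illustration of the kind of time-varying data point Trafton describes, a dosage-change feature could be computed from a patient’s prescription history along these lines. The field names, units and logic here are hypothetical, not the VA’s actual pipeline:

```python
# Hypothetical feature: change in daily opioid dose (morphine milligram
# equivalents) between the latest fill and one at least ~90 days earlier.
from datetime import date

def dose_change_feature(prescriptions, window_days=90, as_of=None):
    """Latest daily dose minus the dose from at least `window_days` earlier.

    `prescriptions` is a list of (fill_date, daily_dose_mme) tuples.
    Returns 0.0 when there is not enough history to compare against.
    """
    as_of = as_of or date.today()
    past = sorted((d, dose) for d, dose in prescriptions if d <= as_of)
    if len(past) < 2:
        return 0.0
    latest_dose = past[-1][1]
    earlier = [dose for d, dose in past if (as_of - d).days >= window_days]
    baseline = earlier[-1] if earlier else past[0][1]
    return latest_dose - baseline  # negative values suggest a taper

# Toy example: a patient tapered from 60 to 20 MME per day.
history = [(date(2019, 1, 5), 60.0), (date(2019, 3, 1), 40.0), (date(2019, 5, 20), 20.0)]
print(dose_change_feature(history, as_of=date(2019, 6, 1)))  # -20.0
```

Where the first version of the model relied mostly on static snapshots, such as diagnoses recorded in the prior year, a feature like this captures a trajectory, which is the direction the team says it is now exploring.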

Another algorithm, the VA’s Stratification Tool for Opioid Risk Mitigation, or STORM, analyzes patients prescribed opioids for their risk of overdose or suicide. Suicide risk appears to be elevated both at the start and at the end of opioid use, Trafton said.

“We don’t know whether that has to do with the decision to deprescribe, or whatever [else] caused the patient to stop” taking the medication, she said. “We’re very aware that if we try to get people off [opioids], we need to make sure we do that in a very supportive, patient-centric environment.”

It’s hard to disentangle opioid use from suicidality, “because you’d get a lot of cases where they overdosed, but intentionality was very hazy,” she said.

Officials at the VA view the algorithms as a good way to focus limited resources.

“It’s already taxing our clinical resources to reach the 0.1 percent of veterans” flagged by REACH VET, Trafton said. Researchers are debating how and whether to tweak the model, because “we’re still calling a whole bunch of people who aren’t actually going to wind up killing themselves.”

The system doesn’t automatically trigger hospitalization or any other involuntary treatment, she said, and even those who aren’t suicidal often benefit from a follow-up phone call to deal with various troubles. The VA recently began mandatory suicide screenings for all patients entering VA facilities, she said, so it’s unlikely that an acutely suicidal patient would slip past both the screening and the algorithm.

Flagging the wrong patients

Artificial intelligence and automated screening tests should be used with caution, Horowitz said. An algorithm might not understand that a patient who says “I wish I were dead” isn’t necessarily suicidal. An overreaction could lead to involuntary hospitalization, and “there’s so much stigma around mental health issues to begin with, and suicide in particular.”

The VA doesn’t obtain specific consent from patients before running their records through its prediction system, but clinicians are instructed to tell patients that they’ve been flagged. The prediction shows up in medical records, but it’s clearly labeled as a prediction as opposed to a diagnosis, Trafton said. That information is protected within the health record, so outside groups such as employers and insurers shouldn’t be able to access it, under HIPAA law and VA security protocols, she said.

What worries her more are the risk prediction models that private sector health plans and insurance companies are spinning up to assess patients’ prescriptions and mental health usage in order to move them to different insurance classes. Her team has been alerting Congress and other legislative bodies about ways risk assessments might be abused in the private sector, she said.

“We don’t want people to be dinged for accepting preventive interventions,” like learning coping skills, she added.

Even if an algorithm can reliably sort out the patients who will follow through from the ones who won’t, it’s not clear what the best medical response is for the high-risk patients, says Ron Kessler, a Harvard professor who was part of the team that developed the first REACH VET machine learning model.

Automated predictions should only be one factor in a clinician’s analysis, he said, and risk assessments aren’t useful if the clinician doesn’t have a clear guide for what to do with that information.

Hospitalization, for instance, can be traumatic for some patients. “If you’re doing something to them that can restrict their activity [and] that can affect their career ... maybe you could do more harm than good,” he said.

Kessler and a group of other researchers are using AI to scan hundreds of thousands of patient records to figure out exactly what kind of treatment suits each individual patient — whether it’s hospitalization, outpatient treatment, or intensive post-hospital case management. “That’s where the future of suicide prediction modeling is,” he said.

If you or someone you know is in crisis, please call the National Suicide Prevention Lifeline at 1-800-273-TALK (8255) or contact the Crisis Text Line by texting HELLO to 741-741.