A chatbot helped more people access mental-health services

Published February 4, 2024 | Source: MIT Technology Review
The Limbic chatbot, which screens people seeking help for mental-health problems, led to a significant increase in referrals among minority communities in England.

An AI chatbot helped increase the number of patients referred for mental-health services through England’s National Health Service (NHS), particularly among underrepresented groups who are less likely to seek help, new research has found.

Demand for mental-health services in England is on the rise, particularly since the covid-19 pandemic. Mental-health services received 4.6 million patient referrals in 2022—the highest number on record—and the number of people in contact with such services is growing steadily. But neither the funding nor the number of mental-health professionals is adequate to meet this rising demand, according to the British Medical Association.  

The chatbot’s creators, from the AI company Limbic, set out to investigate whether AI could lower the barrier to care by helping patients access help more quickly and efficiently.

A new study, published today in Nature Medicine, evaluated the effect that the chatbot, called Limbic Access, had on referrals to the NHS Talking Therapies for Anxiety and Depression program, a series of evidence-based psychological therapies for adults experiencing anxiety disorders, depression, or both.  

It examined data from 129,400 people visiting websites to refer themselves to 28 NHS Talking Therapies services across England, half of which used the chatbot on their websites and half of which used other data-collection methods, such as web forms. Referrals to services using the Limbic chatbot rose by 15% over the study's three-month period, compared with a 6% rise for the services that weren't using it.

Referrals among minority groups, including ethnic and sexual minorities, grew significantly when the chatbot was available—rising 179% among people who identified as nonbinary, 39% for Asian patients, and 40% for Black patients. 

Crucially, the report’s authors said that the higher numbers of patients being referred for help from the services did not increase waiting times or cause a reduction in the number of clinical assessments being performed. That’s because the detailed information the chatbot collected reduced the amount of time human clinicians needed to spend assessing patients, while improving the quality of the assessments and freeing up other resources.

It’s worth bearing in mind that an interactive chatbot and a static web form are very different methods of gathering information, points out John Torous, director of the digital psychiatry division at Beth Israel Deaconess Medical Center in Massachusetts, who was not involved in the study.

“In some ways, this is showing us where the field may be going—that it’ll be easier to reach people to screen them, regardless of the technology,” he says. “But it does beg the question of what type of services are we going to be offering people, and how do we allocate those services?”

Overall, patients who’d used the chatbot and provided positive feedback to Limbic mentioned its ease and convenience. They also said that the referral made them feel more hopeful about getting better or helped them know they were not alone. Nonbinary respondents mentioned the non-human nature of the chatbot more frequently than patients who identified as male or female, which may suggest that interacting with the bot helped avoid feelings of judgment, stigma, or anxiety that can be triggered by speaking to a person.

“Seeing proportionally greater improvements from individuals in minority communities across gender, sexual, and ethnic minorities, who are typically hard-to-reach individuals, was a really exciting finding,” says Ross Harper, Limbic’s founder and CEO, who coauthored the research. “It shows that in the right hands, AI can be a powerful tool for equity and inclusion.”

Visitors to the chatbot-enabled websites were met with a pop-up explaining that Limbic is a robotic assistant designed to help them access psychological support. As part of an initial evidence-based screening process, the chatbot asks a series of questions, including whether the patient has any long-term medical conditions or previous diagnoses from mental-health professionals. It follows these with further questions designed to measure symptoms of common mental-health issues such as anxiety and depression, tailoring its questioning to the symptoms most relevant to the patient's problems.
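A minimal sketch of what such a tailored screening flow could look like, assuming a simple keyword-based routing rule. The questions, symptom areas, and routing logic below are illustrative only, not Limbic's actual implementation:

```python
# Hypothetical sketch of a tailored screening flow (not Limbic's actual code).
# Core idea: broad initial questions first, then follow-ups chosen by the
# symptom area the earlier answers point to.

INITIAL_QUESTIONS = [
    "Do you have any long-term medical conditions?",
    "Have you previously received a diagnosis from a mental-health professional?",
]

# Follow-up question banks keyed by symptom area (illustrative labels only).
FOLLOW_UPS = {
    "anxiety": [
        "Over the past two weeks, how often have you felt nervous or on edge?",
        "How often have you been unable to stop or control worrying?",
    ],
    "depression": [
        "Over the past two weeks, how often have you felt down or hopeless?",
        "How often have you had little interest or pleasure in doing things?",
    ],
}

def pick_symptom_area(answers: list[str]) -> str:
    """Toy routing rule: choose the follow-up bank whose keywords
    appear most often in the free-text answers."""
    keywords = {"anxiety": ("worry", "panic", "nervous"),
                "depression": ("sad", "hopeless", "tired")}
    scores = {area: sum(any(k in a.lower() for k in ks) for a in answers)
              for area, ks in keywords.items()}
    return max(scores, key=scores.get)

def run_screening(get_answer) -> dict:
    """Ask the initial questions, route to the most relevant follow-ups,
    and return everything collected for the referral."""
    answers = [get_answer(q) for q in INITIAL_QUESTIONS]
    area = pick_symptom_area(answers)
    follow_up_answers = [get_answer(q) for q in FOLLOW_UPS[area]]
    return {"initial": answers, "area": area, "follow_ups": follow_up_answers}

if __name__ == "__main__":
    # Canned answers stand in for a real chat session.
    scripted = iter(["No", "I often can't stop worrying",
                     "Nearly every day", "More than half the days"])
    print(run_screening(lambda q: next(scripted)))
```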

The chatbot uses the data it collects to create a detailed referral, which it shares with the electronic record system the service uses. A human care professional can then access that referral and contact the patient within a couple of days to make an assessment and start treatment.
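As a rough illustration of that handoff, here is a hedged sketch of a structured referral record serialized for an electronic record system. The field names and the `submit_referral` helper are assumptions for the sake of the example, not Limbic's or the NHS's actual schema or API:

```python
# Hypothetical referral record (field names are invented, not a real schema).
# The point is that the chatbot's output is a structured document a human
# clinician can read before making first contact.

import json
from dataclasses import dataclass, asdict, field

@dataclass
class Referral:
    service_id: str                 # which Talking Therapies service
    presenting_problem: str         # symptom area the screening pointed to
    screening_answers: dict         # structured answers collected by the bot
    risk_flags: list[str] = field(default_factory=list)

def submit_referral(referral: Referral) -> str:
    """Serialize the referral for the service's record system.
    A real integration would send this to the record system's API."""
    return json.dumps(asdict(referral), indent=2)

print(submit_referral(Referral(
    service_id="talking-therapies-demo",
    presenting_problem="anxiety",
    screening_answers={"long_term_conditions": "none", "prior_diagnosis": "no"},
)))
```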

Limbic’s chatbot is a combination of different kinds of AI models. The first uses natural-language processing to analyze a patient’s typed responses and provide appropriate, empathetic answers. Probabilistic models take the data the patient has entered and use it to tailor the chatbot’s responses in line with the patient’s most likely mental-health problem. These models are capable of classifying eight common mental-health issues with 93% accuracy, the report’s authors said.
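To make the classification step concrete, here is a toy sketch of ranking a fixed set of eight condition labels by probability. The labels, evidence scores, and softmax weighting are invented for illustration; the 93% accuracy figure comes from the paper, not from anything this sketch could achieve:

```python
# Toy probabilistic classifier over eight condition labels (labels and
# scores are illustrative assumptions, not Limbic's actual model).

import math

CONDITIONS = ["depression", "GAD", "social anxiety", "panic", "PTSD",
              "OCD", "health anxiety", "specific phobia"]

def softmax(scores: list[float]) -> list[float]:
    """Convert raw evidence scores into a probability distribution."""
    exps = [math.exp(s - max(scores)) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def classify(evidence: dict[str, float]) -> list[tuple[str, float]]:
    """Turn per-condition evidence (e.g. derived from questionnaire
    answers) into probabilities, ranked most to least likely."""
    probs = softmax([evidence.get(c, 0.0) for c in CONDITIONS])
    return sorted(zip(CONDITIONS, probs), key=lambda cp: cp[1], reverse=True)

# Example: screening answers yield stronger evidence for GAD than the rest.
for condition, p in classify({"GAD": 2.1, "depression": 0.8, "panic": 0.5})[:3]:
    print(f"{condition}: {p:.2f}")
```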

“There aren’t enough mental-health professionals, so we want to use AI to amplify what we do have,” adds Harper. “That collaboration between human specialists and an AI specialist—that’s where we’ll really solve the supply-demand imbalance in mental health.”
