Commentary

Chatbots can improve mental health in vulnerable populations
In this modern age of health care where telemedicine rules, conversational agents (CAs) that use text messaging systems are becoming a major mode of communication.

Many people are familiar with voice-enabled agents, such as Apple’s Siri, Google Now, and Microsoft’s Cortana. However, CAs come in different forms of complexity, ranging from a short message service–based texting platform to an embodied conversational agent (ECA).

ECAs allow participants to interact with a physical or graphical figure that simulates a person in appearance, behavior, and dialogue. These are essentially virtual humans, or avatars, who talk with participants. By taking greater advantage of these automated agents, some analysts have projected $11 billion in combined cost savings across a variety of business sectors by 2023.1 Health care is one sector in which CAs can play an important role. Because of their accessibility, CAs have the potential to improve mental health by combating health care inequities and stigma, encouraging disclosure from participants, and serving as companions during the COVID-19 pandemic.

CAs provide accessible health care for rural, low socioeconomic status (SES), and minority communities in a variety of advantageous ways. For example, one study found that long-term use of a text-based agent that combines motivational interviewing and cognitive-behavioral therapy (CBT) can support smoking cessation in adolescents of low SES.2

CAs can help vulnerable participants advocate for themselves and proactively maintain their mental health through access to health care resources. In specific cases, these agents equalize health care treatment across populations. Even when participants live in secluded areas or face barriers to in-person care, these text-based agents can still provide self-help intervention at any time on an individual basis, regardless of location or socioeconomic status. Furthermore, they serve as highly cost-effective mental health promotion tools for large populations, some of which might not otherwise be reached by mental health care.

Studies have found that CAs are effective tools for combating mental illnesses such as depression and anxiety. For example, participants in an experimental group who received a CBT-based self-help program from a text-based CA named Woebot experienced significantly reduced depression symptoms compared with a control group, who received only information from a self-help electronic book.3 CAs might therefore prove successful in treating younger populations, who find online tools more feasible and accessible. Often, this population self-identifies depressive and anxiety symptoms without consulting a health care professional. Such tools may thus be especially useful to those who are deterred by the stigma of seeing a mental health professional.

Virtual human–based CAs also encourage participants to disclose more information in a nonjudgmental manner, especially among people with stigmatized conditions. CAs use neutral language, which may be helpful when dealing with stigmatized issues such as HIV, family planning, and abortion care because it heightens confidentiality and privacy. When participants believe that the agent does not “judge” or evaluate their capabilities, they share more sensitive information. For example, one study found that military service members who believed they were interacting with a computer rather than a human operator reported lower fear of self-disclosure, displayed more sadness, and were rated by observers as more willing to disclose posttraumatic stress disorder symptoms.4 Additional findings show that participants prefer CAs when topics are highly sensitive and more likely to evoke negative self-admissions.

In what we hope will soon be a post–COVID-19 landscape of medicine, CAs are quickly being deployed on the front lines of health care technology. Empathetic CAs can combat the adverse effects of social exclusion during these pressing times. Etsuko Ishii, a researcher affiliated with the Hong Kong University of Science and Technology, and associates demonstrated that a virtual CA can be an effective COVID-19 companion because it uses natural language processing (NLP) and nonverbal facial expressions to give users the feeling that they are being treated with empathy.5 While minimizing the number of in-person interactions that could potentially spread COVID-19, these agents promote virtual companionship that mirrors natural conversation and provide emotional support with psychological safety as participants express their pent-up thoughts. Not only do these agents help users recover their mood quickly, but they also overcome geographic barriers, remain constantly available, and alleviate the high demand for mental health care. As a result, CAs have the potential to facilitate better communication and sustain social interaction within the isolated environment the pandemic has created.

CAs can predict, detect, and help determine treatment solutions for mental health conditions based on behavioral insights. Their natural language processing also makes them powerful therapeutic tools that can serve different communities, particularly populations with limited access to medical resources. As CAs become more integrated into telemedicine, their utility will continue to grow as their proven versatility expands the boundaries of health care technology.

Ms. Wong, a medical student at New York Institute of Technology College of Osteopathic Medicine in Old Westbury, conducts research related to mental health care services. She disclosed writing a telemental health software platform called Orchid. Dr. Vo, a board-certified psychiatrist, is the medical director of telehealth for the department of child and adolescent psychiatry and behavioral sciences at Children’s Hospital of Philadelphia. She is a faculty member of the University of Pennsylvania, also in Philadelphia. Dr. Vo conducts digital health research focused on using automation and artificial intelligence for suicide risk screening and connecting patients to mental health care services. She disclosed serving as cofounder of Orchid.

References

1. Chatbots: Vendor opportunities & market forecasts 2020-2024. Juniper Research, 2020.

2. Simon P et al. On using chatbots to promote smoking cessation among adolescents of low socioeconomic status. Artificial Intelligence and Work: Association for the Advancement of Artificial Intelligence (AAAI) 2019 Fall Symposium, 2019.

3. Fitzpatrick KK et al. JMIR Mental Health. 2017;4(2):e19.

4. Lucas GM et al. Front Robot AI. 2017 Oct 12. doi: 10.3389/frobt.2017.00051.

5. Ishii E et al. ERICA: An empathetic android companion for COVID-19 quarantine. arXiv preprint arXiv:2106.02325.
