Artificial intelligence (AI) is a burgeoning field with many applications. In recent years, AI-driven conversational agents have spread across many industries, and their use in healthcare is clearly on the rise. Web-based interventions for health conditions have existed for over two decades. The next evolution is leveraging app-based conversational agents, which are always on, to support people in managing their health through awareness, skill-building, and self-soothing.
A new paper by Judith Prochaska and colleagues from Stanford University and Woebot, published in the Journal of Medical Internet Research, explores the use of conversational agents in reducing behaviors associated with substance use. The authors note that fewer than 20% of people who need care for substance use receive it. Misuse of alcohol and drugs is prevalent in the USA, with estimated annual costs of $440 billion; these costs stem from lost lives, trauma, violence, reduced workplace productivity, and health care expenses.
Digital support offers an opportunity to deliver complex treatments at the moment of need. Early data suggest that digital interventions can change substance use behaviors, and most of these have been delivered through web-based programs. More recently, apps have come to market to help individuals with substance use disorder address the behaviors associated with substance use.
Conversational agents can model coach- and sponsor-like interactions, and they are available 24/7, which may free up clinicians’ time to offer more intensive interventions within a comprehensive plan of care. The current study examines Woebot, a conversational agent, as an intervention for substance use disorder (SUD). One hundred and one individuals with SUDs participated in this feasibility trial, which ran for eight weeks and used a pre/post single-group design. The average age was 36.8 years, and the sample was predominantly female (75.2%) and non-Hispanic White (78.2%).
Over the course of the study, participants sent an average of 600 messages, completed 12.1 psychoeducational modules, and engaged over 15 days. The majority of the psychoeducational content was rated positively (94%). The findings suggest that Woebot, as a conversational agent, was able to deliver engaging content and produced significant improvements in pre/post self-reported measures of substance use, depression, anxiety, cravings, and confidence. Like many mental health apps, engagement was high initially and dropped off over the eight weeks of treatment. There is no standard definition of engagement; I have addressed use-response curves in prior blog posts, and to date we lack clear-cut cutpoints for therapeutic thresholds in digital mental health apps.
This study provides early data on the value of conversational agents in providing support and delivering engaging psychoeducation. Previous studies suggest mental health apps lose 95% of users by day 15, so the ability of relational agents to keep someone participating is promising. In digital health, we often rely on crude measures of engagement, such as the number of sessions or texts sent. The path ahead is leaning into those data to tease out the use patterns associated with healing and a reduction in symptoms.
Supporting someone on their journey back from SUD with a conversational agent that can help manage triggers and cravings 24/7 is a promising new way to connect. This feasibility trial shows great opportunity for future development, which may translate into more people having access to solutions that can help them heal.
Thanks for reading – Trina
(Opinions are my own)
Prochaska JJ, Vogel EA, Chieng A, Kendra M, Baiocchi M, Pajarito S, Robinson A. A Therapeutic Relational Agent for Reducing Problematic Substance Use (Woebot): Development and Usability Study. Journal of Medical Internet Research.