Research shows that 90% of digital mental health tools are no longer used after two weeks. Many companies are working to understand engagement patterns in their apps more deeply in order to improve retention. What if specific user engagement patterns were associated with improvement in symptoms of depression or anxiety? Could we nudge people toward a particular use path so they could get relief sooner? Digital mental health tools are maturing, and new data suggests there are indeed patterns of use that lead to improvement.
A new paper in JAMA Network Open by Isabel Chien and colleagues from Microsoft Research Cambridge leverages machine learning methods to understand user engagement. One of the first challenges in the digital therapeutics space is understanding mechanisms of action; unlike a drug, whose clinical mechanism is mapped through the phased clinical trials required for approval, the mechanisms behind digital tools are less clear. This study looked at the internet-delivered Cognitive Behavioral Therapy (iCBT) program “Space from Depression and Anxiety” in 56,604 adult users. The program is one of a suite available on a platform developed by SilverCloud. User login data and interactions within the clinician-supported program were analyzed to explore user engagement over time. Clinical outcomes collected as part of the program include depression and anxiety assessments, the PHQ-9 and GAD-7, respectively.
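To make the idea concrete, here is a minimal sketch of how weekly engagement trajectories could be grouped into latent classes. This is not the authors’ model; the file name, column layout, and the choice of a Gaussian mixture with five components are assumptions for illustration only.

```python
# Illustrative sketch only -- not the model used in the paper.
# Assumes a hypothetical CSV with one row per user and one column per week
# (minutes of platform use in each of the 14 weeks).
import numpy as np
import pandas as pd
from sklearn.mixture import GaussianMixture

usage = pd.read_csv("weekly_usage_minutes.csv", index_col="user_id")

# Log-scale the heavy-tailed usage minutes before clustering.
X = np.log1p(usage.to_numpy())

# Fit a five-component mixture and assign each user a latent engagement class.
gmm = GaussianMixture(n_components=5, covariance_type="diag", random_state=0)
labels = gmm.fit_predict(X)

# Average trajectory per class, e.g. to plot engagement curves over the 14 weeks.
class_curves = usage.groupby(labels).mean()
print(class_curves.round(1))
```

However the clustering is done, the shape of the analysis is the same: group users by how their engagement evolves over time, then compare clinical outcomes across those groups.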
Findings indicate that, on average, users spent 111 minutes in the program and completed 230 tools that support CBT principles, including exercises on restructuring negative thoughts. On average, participants scored in the moderate symptom range for depression and anxiety at baseline.
The authors identified five subtypes of engagement among program users over the 14 weeks. Class 1, “low engagers,” makes up 36% of the sample; Class 2, “late engagers,” makes up 21%; Class 3, “higher engagers with rapid disengagement,” accounts for 25%; Class 4, “higher engagers with moderate decrease,” a further 5%; and Class 5, “highest engagers,” comprises 10% of total users. Symptom reduction varied by subtype, with Classes 3, 5, and 4 showing the greatest improvement; Class 2 had the lowest rate of symptom reduction. Recovery was not always proportional to the time spent within the program; put another way, the dose-response isn’t linear, and more use does not necessarily mean a better outcome. Looking more deeply at the use patterns of Class 4, we see they engaged in more goal-based and mood-tracking activities in addition to care modules. Class 5 accessed fewer care modules but used more of the mindfulness-based activities. Patients in Class 3, the group that showed a six-point drop in their depression score, completed tools reflecting core CBT concepts such as behavioral activation and cognitive restructuring within the first two weeks of accessing the program. The engagement curves and drop-offs can be seen in the graphs below.
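Once each user carries a class label, comparing symptom change across classes is a simple tabulation. The sketch below is purely illustrative; the file and column names are hypothetical and not drawn from the study’s analysis.

```python
# Illustrative sketch only -- hypothetical columns, not the study's analysis code.
# Assumes one row per user with an assigned engagement class and the first and
# last PHQ-9 scores recorded during the 14-week window.
import pandas as pd

outcomes = pd.read_csv("outcomes.csv")  # user_id, engagement_class, phq9_baseline, phq9_final

# A positive value means symptoms improved (PHQ-9 went down).
outcomes["phq9_drop"] = outcomes["phq9_baseline"] - outcomes["phq9_final"]

summary = (
    outcomes.groupby("engagement_class")["phq9_drop"]
            .agg(mean_drop="mean", n="size")
            .round(2)
)
print(summary)  # compare mean PHQ-9 drops across the five classes
```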
This paper sheds light on the fact that there are many paths to improvement, and that user preference for the content and tools accessed is vital for symptom improvement. It also suggests there may be a way to nudge participants toward the core tools of CBT within the first two weeks of starting the program while allowing flexible access to other modules, like mindfulness, in later weeks. It should also be noted that the largest improvement, in Class 3, was a 6.65-point drop in PHQ-9, compared with 5.39 points in Class 4, a difference of 1.26 points.
Allowing for personalization is both a novel and a significant element of digital therapeutics. Applying machine learning methods lets us see who gets better, at what rate, and by what means, which has implications for how a clinician can frame these tools for optimal use. It also shows that a one-size-fits-all approach would be too rigid, yet leaning into the tenets of CBT early in a treatment plan can support greater improvement. A user may not be ready to address their negative thought patterns from the outset, so having the flexibility to start with mindfulness is vital for engagement, even if some improvement is sacrificed. A more linear, rigid program would likely lose this user early, and no benefit would be derived.
Thanks for reading – Trina
(Opinions are my own)
References
Chien I, Enrique A, Palacios J, et al. A Machine Learning Approach to Understanding Patterns of Engagement With Internet-Delivered Mental Health Interventions. JAMA Netw Open. 2020;3(7):e2010791. doi:10.1001/jamanetworkopen.2020.10791