AI tools could ease caseload of therapists feeling burnt out


Image: metamorworks/Getty Images

In the US and around the world, the last few years have been especially intense, to say the least. Therapy is in high demand as more people, especially youth, suffer from mental health issues. The wake of the COVID-19 pandemic and an ensuing loneliness epidemic have left therapists stretched thin. The mental health industry is also significantly understaffed, making support even less accessible.

Direct-to-consumer (DTC) teletherapy companies like BetterHelp and Talkspace have emerged to fill in the gaps. While this shift has solved some problems, it has also created new challenges for therapists. As a May 2024 Data & Society report details, providers have had to learn how to conduct sessions virtually, navigate new patient portals, and adapt to new tools. The report also found that many therapists feel exploited by the platforms’ tendency to structure their labor like gig work.

Though these DTC options are designed to serve consumers, therapists need support, too. A 2023 American Psychological Association (APA) survey found that, amid increased workload during the pandemic, 46% of psychologists reported being unable to meet demand in 2022 (up 16 percentage points from 2020), and 45% reported feeling burnt out.

Also: Hooking up generative AI to medical data improved usefulness for doctors

Could artificial intelligence (AI) tools be a solution?

Notetaking and documentation 

A therapist’s day-to-day involves more than just conducting sessions: providers also manage scheduling and organization, including maintaining their patients’ electronic health records (EHR). Several therapists who spoke with ZDNET said EHR maintenance is one of the hardest parts of their job. 

Like most applications of AI for work and productivity, many AI tools for therapists aim to offload administrative work for stretched providers. Several tools also use AI to analyze patient data and help therapists notice nuances in progress or mental state. 

This is where Health Insurance Portability and Accountability Act (HIPAA)-compliant AI notetakers can come in. One such tool, Upheal, runs in a therapist's browser or on a mobile device and listens to sessions in person, virtually via platforms like Zoom, or in the Upheal app. Providers can select from templates for individual or couples sessions, and Upheal will record session notes in the appropriate format. Once the provider reviews and finalizes the notes, they can be moved into the therapist's existing EHR platform.

On top of basic transcription, Upheal's AI provides additional insights and data, and can suggest treatment plans based on what it overhears. The company's website states that the tool complies with multiple health data regulations, including HIPAA and GDPR.

While plenty of digital EHR services like TherapyNotes exist, AI streamlines the notetaking process. Rather than typing up and analyzing notes after a session, therapists using Upheal can dedicate all their attention to their clients. The tool also supports neurodivergent therapists, for whom paperwork can be especially challenging.

For Alison Morogiello, a licensed professional counselor based in Virginia, Upheal reduced her fatigue around writing session notes. “I love working with people, but not as much working with documentation,” she explains. “The way I collect information made it very difficult to conceptualize the therapy work that I had done, how the client was responding to the interventions — to condense it into a summary note was very challenging for me, and often very tedious.” 

Also: These 7 tech products helped us find inner peace

Morogiello is busy — she sees up to 30 patients a week. When she opened her own practice, her goal was to work more efficiently, maintain a better work-life balance, and ultimately be more present with her clients — all of which Upheal is making possible. After initially doubting how secure and effective it was, she has now been using Upheal for several years. 

“As a psychotherapist, you witness a lot of struggles — pain, grief, frustration, anxiety — so to sit back at the end of the day or after a session and conceptualize it from a therapeutic lens takes a lot of emotional effort,” she says. “To have a program do that emotional work for me, to synthesize the information, pull out what’s important — I don’t have to go back and relive sessions.” Upheal keeps her from depleting herself from one patient to the next.

Morogiello reviews all of Upheal’s notes to ensure they’re consistent with her assessment of the session. She added that Upheal’s AI helps her catch insights she might have missed, such as how much she talks relative to her client, or how quickly the client speaks, which can signal altered states like hypomania.
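
To make that kind of metric concrete, here is a minimal, purely illustrative sketch of how a talk-time ratio and a client's speaking rate could be computed from diarized transcript segments. The segment format, function name, and metric names are assumptions for the example, not Upheal's actual implementation.

```python
# Purely illustrative sketch: compute a therapist/client talk-time ratio and
# the client's speaking rate from diarized transcript segments. The data
# format and names here are hypothetical, not Upheal's API.

def session_metrics(segments):
    """segments: list of dicts like
    {"speaker": "therapist" or "client", "start": secs, "end": secs, "text": "..."}"""
    talk_time = {"therapist": 0.0, "client": 0.0}
    words = {"therapist": 0, "client": 0}
    for seg in segments:
        talk_time[seg["speaker"]] += seg["end"] - seg["start"]
        words[seg["speaker"]] += len(seg["text"].split())
    total = sum(talk_time.values()) or 1.0
    client_minutes = talk_time["client"] / 60 or 1.0
    return {
        # Share of total speaking time taken by the therapist
        "therapist_share": talk_time["therapist"] / total,
        # Client words per minute; an unusually high rate might prompt a closer look
        "client_wpm": words["client"] / client_minutes,
    }

print(session_metrics([
    {"speaker": "therapist", "start": 0, "end": 30, "text": "How has your week been?"},
    {"speaker": "client", "start": 30, "end": 150, "text": "busy " * 400},
]))
# {'therapist_share': 0.2, 'client_wpm': 200.0}
```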

Also: How Gen AI got much better at medical questions – thanks to RAG

Especially while juggling so many clients, Morogiello thinks of Upheal as an assistant that gives her feedback she can implement to improve her skills. She also says it’s improved her workflow without disruption. “I don’t take notes during sessions anymore, because the notes are kind of taken for me, unless I’m doing any kind of intervention that requires me to write something down,” she explains. “Me practicing in the therapeutic room hasn’t changed, other than me being more present.” 

Administrative support 

Therapy’s effectiveness isn’t limited to active sessions. AI tools can help maintain patient progress between appointments, allowing therapists to go deeper one-on-one. Conversational AI chatbots like Woebot and Wysa draw on psychology research to provide users with in-the-moment mental health support and homework exercises. Because of their on-demand availability, they are intended to supplement or precede provider-based care. Acting as a kind of triage for therapy, they can theoretically reduce the influx of session requests therapists receive.

Accessible to people already under the guidance of a provider, Woebot uses cognitive behavioral therapy (CBT) strategies to engage with and address whatever a user wants to discuss via its messaging app. Designed for clinicians, Woebot Health’s overall platform also collects patient-reported data and helps providers determine treatment plans. 

Wysa’s chatbot, also based on CBT techniques, specifically helps onboard people into therapy. Jumping directly into a session with a therapist might be intimidating for new patients; by contrast, a chatbot can feel a little less formal and, therefore, more accessible. Wysa can also connect users to therapists through its platform if and when they’re ready.

Matt Scult, a New York-based CBT therapist, thinks Woebot and Wysa are great homework tools for clients to use between sessions. “They do a really nice job of guiding people through cognitive exercises in a conversational way, helping people to identify cognitive distortions and reframe their thoughts in a way that’s much more engaging than the traditional thought log.” This might seem primarily beneficial for patients, but it also helps providers maximize their session momentum. 

Also: 3 ways AI is revolutionizing how health organizations serve patients. Can LLMs like ChatGPT help?

Scult says these tools can also help introduce new clients to foundational therapy basics, like the relationship between thoughts, emotions, and behaviors. “I often spend a fair amount of time in session introducing these concepts,” he says. With the time saved, he can ask specific questions about what tools a patient is using and the activities they engaged in that week. 

“Providers only have, typically, a 45 to 50-minute session per week,” Scult points out. “Most of people’s lives are happening outside of them. Especially those of us who are trained in the evidence-based approaches model, there’s a big emphasis on making sure you’re practicing and doing things that are aligned with what you’re working on in therapy outside of just those sessions.”

Therapists pour so much energy into helping their clients create long-lasting habits and changes, and better homework tools essentially streamline that effort.

Other AI tools like Limbic also focus on simplifying the onboarding process for new patients and self-referrals. By handling simpler admin and supporting providers in their assessments, these tools allow therapists to preserve emotional bandwidth. 

Patient reception

AI tools can give therapists their time and energy back. But how do patients react to them? 

HIPAA requires that patients provide written consent to have their sessions recorded by tools like Upheal. Morogiello says most of her clients have questions but are ultimately comfortable when they find out she uses Upheal. 

“Sometimes we’ll make jokes about it in session,” she says, adding that Upheal otherwise blends into her virtual sessions and looks like any other standard video conferencing interface. 

“I think most people, when they think AI, have a lot of mixed reactions to it,” Morogiello continues. She says her clients were most curious about the security of their data, but that they trust her to use only HIPAA-compliant tools with them. The counselor notes that some of her higher-profile clients were a bit wary at first, and she expects clients with conditions like OCD or paranoia would feel similarly. Overall, though, Upheal has been well received.

Also: This smart mirror uses AI to boost your confidence and mood

Morogiello lets potential new clients know that she uses Upheal. She says she has only had to pass on one potential client who was not comfortable with the idea; she referred them instead to a therapist who doesn't use AI.

By next year, she plans to integrate the tool across her entire workflow, including her couples counseling work. 

AI tools made by therapists

Several providers who spoke with ZDNET are also designing AI mental health tools of their own. In addition to running his practice, Scult is vice president of clinical science at Scenario, a wellness app designed to help users cope with everyday stressors — like first dates, conflicts, or interviews — using therapeutic techniques. In an effort to expand accessibility to mental health support, Scenario’s conversational AI can be used with or without the guidance of a provider.

Clay Cockrell, a New York City-based psychotherapist, is building an AI tool for couples interested in therapy. The model he's creating is designed to offer structured advice and responses similar to those he already gives clients. “In my work in marital counseling, so much of it is coaching-oriented — it’s teaching communication techniques and giving homework on how to improve intimacy. It’s not so much the inner work,” he explains, referring to the deeper reflection patients often do with a therapist.

While this isn’t true of all styles of couples therapy, Cockrell's approach lends itself to AI automation. Distilling it into a model could let the tool take on some of his would-be clients.

Also: FDA approves first prescription-only app for depression

“I’m seeing this as more of an on-ramp to in-person couples therapy,” Cockrell says of his tool, which is not yet in beta. He hopes it will coax couples into more advanced counseling once they get comfortable with the idea. “Perhaps this would lead you to say, ‘We’ve gotten so far with this, now, maybe we need to move into [an] in-person or live therapy situation.’”

Cockrell also anticipates that the availability of AI-powered coaches like his will allow him to do more of the harder, more personalized work of therapy, especially if patients can use them on-demand rather than waiting for an opening in his schedule. 

These technologies shouldn't be confused with AI companions, which aren't HIPAA-compliant or trained in CBT. By contrast, the tools these therapists are building are trained on higher-quality, more specific data and programmed with professionally set guardrails.

Even so, Scult and Cockrell don't go so far as to call the tools therapists, instead describing them as counselors or coaches. For these therapists, it's especially important to maintain the distinction between formal therapy, which involves a human practitioner, and tools that make mental health resources more accessible.

And for good reason: Blurring that line risks misrepresenting what therapy is. As the Data & Society report notes, digital options like DTC platforms can popularize the misconception “that therapy can be reduced/diluted to [any] forms of emotional support,” as opposed to an evolving process that builds on itself over time.

Ultimately, these tools are as much for therapists themselves as they are for potential clients — they’re meant to help therapists democratize their skills without taking on every person in need, which can lead to burnout.

Downsides and roadblocks

Even with demonstrated benefits, no AI tool gets it right every time. While the therapists ZDNET spoke to had few complaints about the tools they use, they also acknowledged their limitations. AI still lacks context — perhaps its greatest flaw at the moment, but also what makes it unlikely to replace most jobs anytime soon. 

For example, when taking notes during a session with one of Morogiello’s patients, Upheal mistakenly identified the client’s son as their spouse. Morogiello was able to correct it upon review and report it to Upheal, which lets users provide feedback to improve its model. 

“For me, that downside does not overshadow the positive,” Morogiello says. “I’m able to be fully present with the client knowing that I have documentation going in the background.” 

Also: Anxiety-free social media? Maven thinks it has a formula for it

Another weakness is AI's penchant for jumping to suggestions and advice more quickly than a therapist would. This makes sense, of course, given that popular large language models (LLMs) have primarily been designed to function as problem-solvers, search engines, and personal assistants that take commands. To correct for this, Cockrell has had to teach his tool to show curiosity instead.

“We created scenarios [in which] couples were having a hard time communicating, and she would give 10 lists immediately on how to improve their relationship,” he explains, referring to the chatbot as “she.” “I had to teach her a therapeutic approach. In my particular approach to therapy, I don’t talk a lot. I get you to speak, and the more you speak about your problem, the better you understand it. And then I know when to step in with a suggestion or a clarifying question.”

Cockrell hasn’t seen his bot offer any negative advice just yet, likely because of how controlled its training data is. But it’s certainly a possibility, especially for the less-than-clinically-trained bots out there. 

Given how narrow the current scope of use is, and how hands-on therapists remain with the final product, providers are largely unconcerned for now.

Scult noted that the AI tools he's encountered aren't as customizable as he'd like for his patients, which can leave clients feeling they're getting something less personal than therapy itself. “Sometimes people are thinking: ‘If you’re just giving me another app, it may be less tailored to that unique experience with a therapist,’” he notes.

He also runs a smaller practice, so he is less concerned with delegating tasks to AI tools at the moment.

The future of AI in therapy

If adoption increases among providers, AI tools could change the nature of therapy. 

“My colleagues and I always joke that therapists would be the last job replaced by AI,” Morogiello says. She likens therapists using AI tools to doing math with a calculator. “It’s like having technology give you time and energy that you can focus on what’s uniquely human to you and your practice — things that, at least at this point in time, AI cannot replicate.” She envisions having an AI tool in the future that gives her live prompts and feedback during sessions to enhance her practice. 

Cockrell isn't concerned that tools like the one he's building could replace him. When asked how he'd react if he came across a similar tool on the market without knowing how it was built, he says he wouldn't trust it.

“There’s nothing that I do that could potentially ever be automated,” he explains. “You can’t just take a person and 20 years [of experience] and put them in a bottle.” 

Scult agrees that AI tools used thoughtfully and built with clinical expertise and ethical principles can be effective without replacing therapy altogether. “We’re not in a place where everyone can work with a therapist, so we need to think more creatively about other ways to improve people’s mental health and wellness.” 

Also: How AI hallucinations could help create life-saving antibiotics

If how people access therapy is changing to fit the digital age, tools built explicitly for therapists need to evolve, too. In the current mental health landscape, even small support systems can supercharge providers otherwise at risk of burning out. Morogiello says she fully integrated Upheal into her practice for her wellbeing and workflow — it helps her business grow without stretching herself too thin.

“I’m able to see more clients,” she explains. “I’m able to be less burned out by the end of the week.” 

Morogiello's experience may be indicative of a larger sea change. Just last month, Alma, a platform that helps independent mental health care providers run their practices, partnered with Upheal to bring generative AI progress notes to its EHR system. The technology enables therapists “to be more present in-sessions and save hours on progress notes that meet clinical best practices,” a release explains.

Beyond big-picture goals like scalability, AI tools allow therapists to focus on the heart of their work: human connection. 

“I feel like I can actually make a larger impact on people’s lives more quickly, if I have a whole bunch of tools that I can recommend,” Scult says. 
