AI therapy is about to make therapy a lot cheaper

Hands on laptop / Photo by Matthew Henry via Burst / Used under license

“I’m in L.A. We have a lot of therapists,” Angelle Haney Gullett told the Washington Post in 2022. “So it’s just kind of wild to me that that many people are at capacity.” She had contacted 25 different therapists after her father passed away, knowing that she needed help. Even though she was willing to pay cash, not one would take her. No one would even put her on a waiting list.

She’s not alone. Tens of thousands of Americans struggle to access mental health care even when they know they need it, and even when their health insurance covers it. But for clients like Angelle, mental health care is about to get much easier to access, and a lot less expensive, thanks to artificial intelligence. AI therapy is already here, and it’s about to upend US mental health care.

AI-powered chatbots already possess the skill and interactivity necessary to approximate therapy. They already engage in therapy, having conducted millions of sessions around the world. In other health fields, they already perform routine health care tasks. They can already respond in real time to cues in the client’s voice, displaying what looks and sounds like empathy.

And they’ll never have a waiting list. 

=-=

AI therapists are about to have a major impact on the world of mental health care. Several companies, backed by millions in venture capital, are in a race to bring the technology to market. 

This is not a discussion about a distant, potential future. AI therapists are serving clients today. Clients, payors, and policymakers will all be eager to adopt the technology once it is widely available.

AI therapists are not able to duplicate all that human therapists do. In some ways, like providing immediate access to care, they’re better. In others, like exercising ethical judgment in their own work, they’re worse. But even when everyone involved is aware of the ways in which AI therapists are worse than human ones, research suggests many clients will choose an AI therapist anyway. Payors and policymakers will likely encourage them to do so.

AI therapists have already conducted millions of sessions

AI-powered therapy bots are poised to take over much of the day-to-day work that human therapists do. While chatbots cannot diagnose mental illness, they are fully ready to engage in brief, manualized interventions for common mental health concerns. (Consider, for example, instructing a client in challenging automatic thoughts in cognitive-behavioral therapy.) They can offer support and frontline interventions, referring clients to human therapists when crises or reporting issues arise.

The transition to AI-based care will happen faster than therapists would like to admit, or are ready for. Many of us comfort ourselves with the illusion that AI therapists are still in the realm of science fiction. But the technology not only exists; it is already actively serving clients.

Woebot is an AI-powered chat platform that delivers elements of cognitive-behavioral therapy under the supervision of a health care professional. It is registered with the FDA as a medical device. (Specifically, it has been designated a breakthrough device, so it is not yet widely available; it is on a structured pathway toward clearance as a treatment that can be prescribed and supervised by a doctor.) The platform has already provided more than 77 million minutes of support. According to the company, usage spikes between 8 and 11 pm, times when individuals may be most in need of care and least able to obtain it. The company hopes to achieve initial clearance for the device to be widely used, with a prescription, for the treatment of postpartum depression. In the meantime, several randomized studies have found Woebot to be as effective as in-person interventions such as CBT skills groups.

Hippocratic AI’s virtual nurses provide patient monitoring, discharge planning, and other common nursing tasks. Their services are available for $9 an hour or less. The company has used this as a selling point, comparing its AI nurses with $90-an-hour human ones. Noting the shortage of nurses in the workforce, it advertises AI professionals as a rapidly developing solution. According to the company, its nurses perform comparably to human ones, and for less than minimum wage.

Those companies are working within the existing health care system, taking a cautious approach. But that’s not the only approach to AI therapy that companies can take. Others are building systems that operate just outside health care and licensure regulations, advertising their services as “mental health support” and “coaching.” These companies are seeking to avoid regulatory hurdles in favor of more immediate, widespread access. Wysa, as one example, describes its services as mental health coaching, and as a first step before human-based care, if that care ever becomes necessary.

The company behind Wysa notes that within a week, the therapeutic relationship clients develop with its AI chatbot is as strong as the relationships they build with human therapists. Wysa has hosted more than 500 million conversations and delivered more than 2 million sessions of cognitive-behavioral therapy, according to the company’s website.

The technology is ready. And there are obvious financial benefits to the companies involved if these efforts succeed. But the companies developing AI mental health care are not the only ones eager to make AI therapy open to all.

Payors, policymakers, and clients all want AI therapy

The interests of payors, policymakers, and clients are all aligned in favor of making AI therapy widely available. It is only the interests of human therapists that may not be well served. 

For clients, the benefits of widely available AI therapy are obvious and significant. Access is the most important. Thousands of clients like Haney Gullett can’t access mental health care today, potentially leaving them at greater risk for suicide or other negative outcomes. Some wind up paying out of pocket for therapists who don’t take their insurance. Others skip care altogether. 

AI therapists promise to solve the access problem, and to do it at low cost. Much of the country is considered a mental health care shortage area. AI therapists can serve thousands of clients at once, and can work across state borders, since they are not bound by state licensure requirements. In addition, AI can work in multiple languages, and can appear as whatever age, gender, and race would make the client most comfortable. 

Insurers will likely be similarly eager to adopt AI therapy. At present, insurers struggle with network adequacy (in short, having enough in-network therapists to serve the needs of their customers). If they could make mental health care available to anyone in their plans, do it at $9 an hour, and scale the technology to serve however many clients need it, they could argue to regulators that they have solved the network adequacy problem.

That would represent a major win, and millions of dollars in cost savings, for insurers. In 2019, the total cost of US mental health care was estimated at $225 billion. Making AI therapy available would increase the number of people actually receiving care, but the arithmetic still favors insurers: if an AI session costs one-tenth of what a human session does, even tripling utilization would leave them paying only about 30 percent of what they pay now.

Policymakers similarly struggle with the access issue. They want therapy to be available to those who need it, but they recognize that they can’t just find more therapists. Scholarship and loan repayment programs meant to make it easier for people to become therapists reach far too few to make a real difference, and often impose burdensome requirements on the therapists they do reach. And neither policymakers nor professional groups have shown much appetite for the kinds of policy changes that would immediately increase the workforce, like eliminating clinical exams for mental health licensure or shortening education and experience requirements.

Policymakers understand the public need for better access to care. They also know that state governments themselves are major payors for therapy, through public mental health systems. 

Allowing and even encouraging the use of AI therapy will allow policymakers to boast that they have done something about the problem, and that they saved taxpayers money while doing it. 

Businesses, policymakers, and clients are all eager for AI-powered mental health care to become more widely available. But is that service actually “therapy”? The debate isn’t just academic. It affects legal issues like confidentiality and reporting, and the role of human therapists going forward. We’ll talk about that in the next article.

This article is Part 1 of a three-part series on artificial intelligence in mental health care. Check out Part 2 and Part 3.