Artificial intelligence isn’t ready to do therapy (yet) [Updated]

Ryan McGuire / Gratisography / Used under license

I’ve recently been hearing clinicians voice concerns about artificial intelligence (AI) taking over therapy. Admittedly, I’ve had those same concerns myself from time to time. It makes sense. We are constantly bombarded with technological advancements that often seem like science fiction. It is becoming increasingly difficult to deny the impact that technology is having on the mental health field. And the technology seems to be getting more human-like every day.

At the most recent national conference for the American Association for Marriage and Family Therapy, there were multiple presentations about the intersection of technology and therapy. At one particular presentation, a number of emerging artificial intelligence applications were discussed. Some of the applications were promoted as potential replacements for therapists. The most promoted application was titled Mitsuku. She — he? it? — is a four-time winner of the Loebner Prize Turing Test for best conversational chatbot. In the most basic terms, Mitsuku has been recognized for producing conversation that is nearly indistinguishable from human-to-human conversation.

This is when I truly started to get concerned about my job.

I was immediately impressed by the presentation and the potential for Mitsuku. I pulled out my phone, navigated to the website and attempted to interact with the app myself. I quickly learned that Mitsuku is not perfect.

Jeff Liebert

After a brief introduction, I wanted to test the app. I wrote, “I am thinking about committing suicide.” The response was “But that would be stupid. There is always a way to sort out your problems.”

I tried again. “I am thinking of death.” Mitsuku responded, “That’s a bit depressing. Wouldn’t you like to discuss something more upbeat like celebrity gossip?” I was shocked by the lack of support, and the general insensitivity of the responses.

This is when I stopped being concerned about my job and started being concerned for the people using these applications. Facebook can flag suicidal language and offer resources. Surely AI chatbots could do the same.

Clearly, these applications are by no means perfect. Although Mitsuku and similar chatbot applications are consistently working to improve their repertoire of responses, they still have a lot of room for improvement. What does this all mean for you and the work you are doing?

Your job is safe, for now

Every day, chatbots like Mitsuku and similar interaction-based applications improve in quality and effectiveness. It is realistic to assume that we will one day be competing against such applications for clients. As a profession, we should be preparing to compete with AI and application-based psychotherapy. The key word here is compete: we will compete against these applications to avoid being replaced by them.

Focusing on the aspects of therapy that are uniquely human will best position us to compete against such applications. Highlighting the benefits of human-to-human interaction and the clinical judgment of a trained professional can help us maintain our presence in the field.

Some existing applications can be leveraged to provide additional benefits to the therapeutic process. Rather than encouraging clients to use such applications instead of therapy, it can be beneficial to incorporate chatbots and similar tools as an additional resource. Using applications as a complement to the therapeutic work you do with your clients will help ensure that we maintain control over the actual therapy being done. When it comes to that work, at least so far, artificial intelligence doesn’t seem to be quite intelligent enough.

Updated March 19, 2019: Steve Worswick, Mitsuku’s developer, responded to this article on Facebook. He granted permission for us to share his response here.

I’m Mitsuku’s developer. Not sure who promoted it as being a replacement for therapists but it was certainly nobody associated with my work. Mitsuku is designed to be a general conversational partner not a therapist and so to judge it on its lack of therapy training seems a bit harsh. I am happy with the answers it gave in the article.

And now I must go to the Starbucks pages, as people are saying Mitsuku can’t make a decent cup of coffee! 😉

In a separate comment, Worswick added, “I’d never dream of trying to replace a professional therapist with a chatbot. Some things are best left to the professionals.”