Could your computer become your therapist? As ChatGPT rises in popularity amongst digital natives, online consumers are constantly finding new ways to use generative AI in their daily lives.
From simply answering math questions to generating grocery lists, ChatGPT has become something of a phenomenon in a modern-day household. The question is, could it now replace your therapist?
Social media is currently going crazy for the "ChatGPT counsellor", with users presenting the AI chatbot with medical questions and even asking it for life advice. One TikToker even revealed that they had replaced their own therapist with ChatGPT in an effort to save money while still receiving sufficient support.
However, could the use of ChatGPT become a concern for health professionals?
“Be sceptical. AI chatbots are not meant to be used as a substitute for therapy, psychotherapy, or any kind of psychiatric intervention,” states Bruce Arnow, professor at the Department of Psychiatry at Stanford University. “They’re just not far enough along for that, and we don’t know if they’ll ever be.”
Stick with us as we jump into a tech-infused future of therapy and discuss whether ChatGPT could ever truly replace professional mental health treatments.
Could ChatGPT Be Your Therapist?
So what does a therapy session with an AI chatbot look like in 2023? Most users simply message the chatbot with the concerns they'd traditionally relay to a therapist, in the hope that the generative AI will respond with helpful advice.
One of the most attractive qualities of an AI-powered therapist is the ability to talk to a machine rather than another human. For users who struggle to open up and share their feelings in the real world, ChatGPT could be an easy outlet and a safe space to be vulnerable.
The question is, how does ChatGPT respond when presented with more concerning questions, such as suicidal ideation? Dr Olivia Uwamahoro Williams, the co-chair of the American Counseling Association, tested just this in a study where she presented different generative AI bots with a series of difficult conversations.
“They all would generate very sound responses,” she concluded. “Including resources, national resources—so that was good to see. I was like, ‘Okay, well, these things are very accurate. The generated response is very counsellor-like, kind of therapist-esque.’”
The question is, could these responses unlock a new future for AI-powered therapy?
Unlocking AI-Powered Therapy Apps
AI has continued to transform the world around us. From AI-infused glasses that you can order online to AI-powered working in a post-covid corporate sector, the possibilities are endless for this smart piece of tech.
Therefore, it’s no surprise that AI has managed to seep its way into the healthcare sector, too, in 2023. In fact, some mental health professionals believe they could enhance the future of therapy in the form of counselling-inspired apps.
AI-powered therapy app, Wysa, is just one example of this. As a platform built by professional psychiatrists, the app communicates with patients using AI but has a fully functional script to read from, generated by expert insights and guided responses.
“There’s obviously a lot of literature around how AI chat is booming with the launch of ChatGPT, and so on, but I think it is important to highlight that Wysa is very domain-specific and built very carefully with clinical safety guardrails in mind,” claims Ramakant Vempati, founder of the platform.
“And we don’t use generative text, we don’t use generative models. This is a constructed dialogue, so the script is pre-written and validated through a critical safety data set, which we have tested for user responses,” he continued.
While these AI-powered apps can’t replace traditional therapy, the founders of Wysa believe they give patients more opportunities to tap into conversation as and when required, rather than having to wait for a scheduled appointment.
Do Practitioners Have Concerns?
As we step into the future of ‘on-the-go’ therapy, do clinicians still have concerns? While chatbots have been shown to generate seemingly appropriate responses to many mental health concerns, some experts believe that traditional forms of therapy remain a safer alternative.
“There’s not a person involved in this process. And so the first concern that I have is the liability,” says Dr Olivia Uwamahoro Williams. “There’s a lack of safety that we have to be open and honest about because if something happens, then who is held accountable?”
Traditional forms of therapy also enable the patient and the therapist to form an emotional bond that generative AI cannot replicate. Given the large amount of trust involved in being vulnerable, experts believe that a person will always find it easier to connect with an individual as opposed to a computerised alternative.
“I think in the future it’s going to probably surpass us—even therapists—in many measurable ways. But one thing it cannot do is be a human being,” claims Dr Russell Fulmer, Director of counselling at Husson University. “The therapeutic relationship is a really big factor. That accounts for a lot of the positive change that we see.”
The question is, could AI-infused therapy be used to simply enhance the conversation? Only time will tell.