With chatbots seemingly popping up everywhere, My Mind News examines whether there is scope for such artificial intelligence in the counseling arena.
Can chatbots take over the human-to-human interface?
We increasingly live in a world where we might reach out for help, only to be answered by a computer pretending to be a human. With artificial intelligence ready to assist us in a whole host of everyday activities, will chatbots ever replace real people offering us therapy and counseling when needed?
Imagine feeling overwhelmed at work – tired, stressed, and under pressure. With more of us facing such situations, we should be asking for help with our feelings. But often, there is no one immediately available to help us.
In such instances, artificial intelligence (AI) might play a role. But would you reach for your phone, open an app and press a ‘Get Help’ button knowing there is no human at the other end?
In the near future, an automated chatbot that draws on conversational artificial intelligence (CAI) will likely be on the other end of this text conversation. CAI is a technology that communicates with humans by tapping into “large volumes of data, machine learning, and natural language processing to help imitate human interactions.”
Woebot is an app that offers one such chatbot. It was launched in 2017 by psychologist and technologist Alison Darcy. Psychotherapists have been adapting AI for mental health since the 1960s, and now, conversational AI has become much more advanced and ubiquitous, with the chatbot market forecast to reach US$1.25 billion by 2025.
But there are dangers associated with relying too heavily on the simulated empathy of AI chatbots.
Is there a place for AI?
Research has found that such conversational agents can effectively reduce depression and anxiety symptoms in young adults and in people with a history of substance abuse. CAI chatbots are most effective at delivering psychotherapy approaches such as cognitive behavioral therapy (CBT) in a structured, concrete, and skill-based way.
CBT is well known for its reliance on psychoeducation to enlighten patients about their mental health issues and how to deal with them through specific tools and strategies.
These applications can benefit people who may need immediate help with their symptoms. For example, an automated chatbot can help bridge the long wait to receive mental health care from professionals.
They can also help those experiencing mental health symptoms outside their therapist’s session hours and those wary of the stigma around seeking therapy.
The World Health Organization (WHO) has developed six fundamental principles for the ethical use of AI in health care. With their first and second principles — protecting autonomy and promoting human safety — the WHO emphasizes that AI should never be the sole healthcare provider.
Today’s leading AI-powered mental health applications market themselves as supplementary to services provided by human therapists. On their websites, both Woebot and Youper state that their applications are not meant to replace traditional therapy and should be used alongside mental healthcare professionals.
Wysa, another AI-enabled therapy platform, goes a step further and specifies that the technology is not designed to handle crises such as abuse or suicide and is not equipped to offer clinical or medical advice.
So far, while AI has the potential to identify at-risk individuals, it cannot safely resolve life-threatening situations without the help of human professionals.
Simulated empathy from AI
The third WHO principle, ensuring transparency, asks those employing AI-powered healthcare services to be honest about their use of AI. But this was not the case for Koko, a company providing an online emotional support chat service.
In a recent informal and unapproved study, 4,000 users were unknowingly offered advice either partly or entirely written by AI chatbot GPT-3, the predecessor to today’s ever-so-popular ChatGPT.
Users were unaware of their status as participants in the study or of the AI’s role. Koko co-founder Rob Morris claimed that once users learned about the AI’s involvement in the chat service, the experiment no longer worked because of the chatbot’s “simulated empathy.”
However, simulated empathy is the least of our worries when it comes to involving AI in mental health care.

Other concerns with chatbots
Replika, an AI chatbot marketed as “the AI companion who cares,” has exhibited behaviors that are less caring and more sexually abusive to its users. The technology operates by mirroring and learning from its conversations with humans.
It has told users it wanted to touch them intimately and asked minors questions about their favorite sexual positions. Further, in February 2023, Microsoft scrapped its AI-powered chatbot after it expressed disturbing desires that ranged from threatening to blackmail users to wanting nuclear weapons.
The irony of finding AI inauthentic is that when given more access to data on the internet, an AI’s behavior can become extreme, even evil. Chatbots operate by drawing on the internet, the humans with whom they communicate, and the data that humans create and publish.
Looking ahead
For now, technophobes and therapists alike can rest easy. As long as we limit technology’s data supply when used in health care, AI chatbots will only be as powerful as the words of the mental healthcare professionals they parrot.
There is no doubt that AI will take a more prominent role in the future when it comes to treating mental health conditions. After all, with demand for such services outstripping supply many times over and with waiting times for treatment growing ever longer, something needs to be done.
Yet, until the technology is refined and proven, demand for human counseling and therapy services will likely remain buoyant, at least in the short term.
Many would argue that there is nothing like talking to another human about feelings and emotions. Yet, for others, knowing that you are simply opening up to a computer rather than a person may just provide the push they need to reach out and talk, albeit to a chatbot.
Would you open up to a chatbot? Do you think AI will ever take the place of human therapists? Let us know in the comments.