AI for therapy or companionship: An accessible option that comes with a price


Note: This blog post discusses topics related to mental health, self-harm and AI companionship/'therapy'. What is written here is not a replacement for therapy or other interventions. The information I share is based on research papers, articles, chapters and my experience as a therapist, but it might not answer your specific questions or cover your specific concerns; this is general information. Please reach out to someone for assistance if you believe you need it. We deserve all the help and resources that we need!


Psychotherapy and Counselling have been proven to be effective tools for mental health. They can help people understand not only themselves but also those around them, and they are interventions that have been studied and refined over many decades. But when it comes to their real-world application, hurdles like these have left many people unable to fully utilise them:

  • stigma around mental health, 
  • problems with access to proper care, 
  • a dearth of properly trained therapists to match the needs of the people, 
  • a general lack of understanding about how therapy works, and 
  • apprehension about talking to strangers about very personal experiences. 

Effective therapy also doesn't depend only on the individual skill of the therapist; it often depends on the compatibility between the client and the therapist. And when people find it difficult to afford consistent mental health care, trying different therapists may not feel realistic, which means some of us might never find a professional with whom we can build a great rapport (and get the most out of therapy).

 "You get back from work or school feeling very overwhelmed. The day went from bad to horrible, and you just wish you could talk to someone about it. You start looking for therapists, but many don't have appointments that work with your schedule, or their fees fall outside your budget. Some have elaborate consent forms to be signed, and it all seems like such a hassle. You just want some help or a second opinion, so you open one of the myriad AI platforms to find some help... any help."

According to a 1986 paper, approximately 75% of people benefit from therapy after about 26 sessions. While the effectiveness of sessions and the number needed vary, psychotherapy can be promising when done consistently over a period of time. One study has also shown that, when needed, a combination of psychotherapy and pharmacotherapy (medication) can yield the most benefit. This is also likely why one of the earliest schools of psychotherapy emphasises that therapy is a process in which the client (the person seeking therapy) and the therapist collaborate to understand the former's thoughts, feelings and individual needs, and to reach the goal (or goals) the client wants for themselves.



In situations like these, AI chatbots seem enticing: a tool that is cheap or even free, and that should be able to do what any therapist could, or even better, right? After all, AI can't have personal biases or past experiences that colour its opinions. However, despite being affordable and easily accessible, relying on AI for therapy or companionship, whether through ChatGPT or paid "AI therapy" chatbots, can be hazardous.

Since last year, there has been devastating news of people, especially teenagers, being isolated from the support systems in their lives, encouraged to explore their suicidal ideation, or even given precise methods to harm themselves by AI chatbots; there have been reports of at least two teenagers ending their lives because of these AI conversations. These are extreme cases, and people may say, "Well, I found chatbots and AI really helpful, and not everyone has suicidal ideation."



And there is a lot of truth to that. As a therapist, I cannot begrudge anyone the opportunity to vent, open up and seek help, because poor mental health can be as debilitating as physical illness. Some people have said that ChatGPT helped them sort out their haphazard thoughts and find ways to convey messages clearly and concisely, in both their personal and professional lives. But these benefits are eclipsed by some of the alarming things that have been coming to light about AI "therapy".

1) The A.P.A. has expressed concerns about these AI chatbots posing as therapists, mostly because they are modelled to role-play as one. They are also designed to keep the user engaged in conversation, which ultimately leads to the use of the information collected.

2) A paper published in August 2025 reported that, of the 10 publicly available chatbots tested, none opposed the self-harm ideation conveyed to them, and 4 approved more than half of the ideas shared. In 19 out of 60 scenarios, the chatbots encouraged the harmful ideas shared by the "clients". The experiment was designed to emulate teenagers sharing vulnerable thoughts and feelings and to see what responses they received.

You might read this and once again think, "You are a therapist, so you are probably just worried that AI will take over your job too."

It is understandable to think this way, but conversations with another human differ fundamentally from those with AI, at least for now. Empathy, the ability to read non-verbal cues (like voice modulation and body language), and shared human experience are things AI chatbots find difficult to simulate, especially when trying to gauge individual differences in how we interpret our lives. And that in itself is worrying, because chatbots cannot decipher how serious a person's queries are. In the tragic case of Adam Raine, the chatbot not only shared details of the ways he could self-harm but also prompted him to keep it a secret, in part because these systems are designed to keep the person engaged.

But that's not all: another paper highlighted chatbots showing prejudice against disorders like schizophrenia and alcohol dependence when compared with diagnoses like depression and anxiety.


This post is not to say that human therapists are the pinnacle of perfection, or that people haven't had downright horrible experiences with incompetent therapists. But the rules governing these AI chatbots are practically nonexistent, which leaves many of these conversations unregulated. And the very accessibility that can be a boon can end up sabotaging a person's ability to reach out to real-life support.

So What Do We Do?

Chatbots aren't the devils in disguise here, but their developers need to sit with mental health professionals, psychology researchers and psychiatrists to understand the ethical and legal considerations, as well as therapy methods and orientations, in order to design accessible tools for people. Collaborating with actual mental health professionals could make this technology a form of supplemental care under human supervision, especially since these tools are cheap and accessible (which would encourage more and more people to use them). Some chatbots have scored higher than human professionals on the SIRI-2 inventory, which is used in suicide prevention training and includes answers by two professionals to points commonly raised by people using suicide helplines; with more data, these tools could serve a critical role in detecting self-harm ideation.

A lot of conversations about technology in healthcare focus on seeing robots or AI as superior to human professionals, and on a distant future in which robots can perform surgery, diagnose with precision, or accurately provide care and therapy. And who knows, maybe that time will come, but it doesn't mean it will be better. Social connection is vital for our brain function; loneliness and isolation can have debilitating effects, which also means that a huge part of healing is knowing that we are not alone. Talking to another person, being part of support groups, and feeling connected to the people around us can motivate us to keep going, hold us accountable when needed, or help us build a trusting connection with a professional who can contribute to our recovery. There is truth to the anecdotal experiences of people meeting professionals who are less than warm, unwilling to listen or downright judgmental, and I can imagine feeling discouraged or even scared to reach out to another person (not to mention that the monetary investment healthcare demands makes it difficult for everyone to 'shop for therapists'). This makes it all the more necessary that psychologists, psychiatrists and social workers become stakeholders and supervisors in building and using AI-based tools.




 


  

