Should you rely on ‘AI therapists’ for mental healthcare?

Africa-Press – Rwanda. Observed annually on October 10, World Mental Health Day aims to increase awareness about mental health problems and combat stigma. With the growing global reliance on technology, chatbots are increasingly employed to provide guidance and a means of communication for individuals undergoing mental health treatment.

Reports indicate that ‘AI therapists’ can help individuals manage their symptoms, and can identify keywords in a conversation that may prompt a referral to, and direct interaction with, a human mental healthcare provider.
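In its simplest form, that referral mechanism amounts to scanning each message for risk-related phrases and handing the conversation to a human when one appears. The Python sketch below is purely illustrative, not the implementation of any product named in this article; the RISK_KEYWORDS list, the canned replies, and the handoff message are all hypothetical.

    # Minimal illustrative sketch of keyword-triggered referral.
    # The keyword list and responses are hypothetical examples,
    # not taken from any real chatbot.

    RISK_KEYWORDS = {"suicide", "self-harm", "hurt myself", "overdose"}

    def needs_referral(message: str) -> bool:
        """Return True if the message contains any risk-related phrase."""
        text = message.lower()
        return any(keyword in text for keyword in RISK_KEYWORDS)

    def handle_message(message: str) -> str:
        """Reply normally, or escalate to a human when risk is detected."""
        if needs_referral(message):
            # Hypothetical handoff: flag the chat for a human provider.
            return "I'm connecting you with a human counsellor now."
        return "Tell me more about how you're feeling."

    print(handle_message("Lately I've wanted to hurt myself"))

In practice, literal keyword matching misses paraphrases, so real systems typically pair such lists with trained classifiers before escalating to a human provider.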

However, some individuals have started to rely on such tools as their sole source of mental health support.

For instance, Blandine Iradukunda, a student based in Kigali, considers Snapchat’s AI chatbot “My AI” to be a valuable mental health advisor. She describes it as “the best”, as she confides in the tool about various issues in her life, whether they are related to school or personal matters.

She said she finds solace in how it eases her burdens and contributes to her overall well-being.

Similarly, Annah Gaella Muteteli turns to the AI chatbot when feeling down and seeking answers. She appreciates the comforting and positive words it offers, which help lift her spirits.

However, Muteteli also acknowledges that the chatbot is no substitute for professional care when it comes to severe conditions.

“What it does is provide automatic responses. It’s programmed, so it’s not suitable for people dealing with serious mental health conditions. They would require someone who can genuinely listen and provide guidance rather than merely responding,” she said.

Muteteli also noted that while the AI chatbot responds promptly to her concerns, it lacks the ability to remember past conversations.

Meanwhile, some AI tools have been refined to cater to specific mental healthcare needs. For instance, “Woebot”, a therapeutic chatbot, learns to adapt to its users’ personalities and can talk them through a number of therapies and talking exercises commonly used to help patients learn to cope with a variety of conditions.

Another chatbot, “Tess”, offers free 24/7 on-demand emotional support and can be used to help cope with anxiety and panic attacks whenever they occur.

According to Anna Mapendo, a psychologist at Imanzi Counseling and Rehabilitation Centre, no matter how well AI therapists perform, people need a human therapist because humans are inherently social beings who require a genuine connection with another person.

“In therapy, non-verbal communication often plays a crucial role, which is something a human psychologist can perceive, but AI cannot. Additionally, human therapists have the ability to challenge individuals to reflect on their past experiences, providing a perspective on their struggles. AI therapists, on the other hand, lack this capability and are susceptible to deception,” she elaborated.

Mapendo noted that while AI therapy may offer some degree of assistance, it should not be relied upon as a complete solution.

“It’s like a painkiller that alleviates symptoms but does not address the root cause. Therefore, individuals should still prioritise seeking professional help for their mental wellbeing,” she said.

A recent World Health Organization report on the challenges of using AI in mental health treatment and research found that there are still “significant gaps” in understanding how AI is applied in mental healthcare. It also identified flaws in how existing AI healthcare applications process data, and insufficient evaluation of the risks of bias.

However, the report also recognises that there are promising signs that AI has the potential to make a positive impact in many areas of mental healthcare. At the same time, it’s clear that progress must be made with care, and models and methodologies need to be thoroughly assessed for risk of bias before they are allowed to be used in situations where they could affect human lives.

James Mugambe, a counsellor at Safe Place Organisation, expressed concerns about the confidentiality of information and questioned who controls the data gathered by AI therapists.

He stressed the need for regulation to ensure the responsible use of AI tools. Mugambe acknowledged the potential of AI therapists in tackling the stigma surrounding mental health, suggesting that anonymity and accessibility could be advantages, offering people an alternative to traditional avenues of support.

However, he emphasised that AI tools should not replace the fundamental importance of human connectedness and relationships.

“Humans should focus on fostering healthy connections and relationships, as these are inherent to our nature,” he said.

Rulinda Kwizera, Chair of the Board at Lifeline Organisation, expressed his belief in a multidisciplinary approach and highlighted the importance of human interaction in addressing emotional issues.

“Human beings are emotional, and emotions often find solace in connections with others. AI therapists lack this crucial element, but they can serve as a valuable supplement to human therapists,” he said.

Addressing the fact that some individuals prefer AI therapy because of the cost of professional therapy, Kwizera pointed out that AI tools also come at a cost. However, he suggested that integrating these tools into a holistic approach that combines AI assistance with professional guidance can save time and enhance the overall service.

