Mental health: Can AI-powered chatbots enhance access to care, combat stigma?
Tuesday, February 20, 2024
Photo by Craish Bahizi

Clinicians, therapists, and researchers are increasingly recognising the potential of artificial intelligence (AI) as a powerful tool in mental health care provision.

Studies indicate that using an AI-powered chatbot can result in higher engagement and adherence compared to traditional methods. For instance, users of Wysa, an AI-powered mental health chatbot, have reported reductions in symptoms of anxiety and depression, with condition scores improving by an average of 31 per cent.

AI refers to computer systems that can perform tasks which normally require human intelligence, and a chatbot is a type of AI that a person can talk to, much like a friend, to get information or help.

For example, Wysa allows users to engage in conversations with it, as it can understand people’s emotional needs and provide suitable responses. The app is also capable of alleviating symptoms while managing up to 80 per cent of the mental health support workload, thus allowing human support to be allocated to critical situations.
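
Wysa has not published its internal logic, but the pattern described above, in which a chatbot handles routine conversations automatically and escalates critical cases to humans, can be illustrated with a minimal sketch. The keywords, replies, and function names below are assumptions for illustration, not Wysa’s actual implementation.

```python
# A minimal, hypothetical sketch of chatbot triage: handle routine
# support automatically and escalate critical cases to a human.
# Keywords, thresholds, and replies are illustrative assumptions.

CRISIS_KEYWORDS = {"suicide", "self-harm", "hurt myself", "end my life"}

def assess_risk(message: str) -> str:
    """Label a message 'critical' if it suggests a crisis, else 'routine'."""
    text = message.lower()
    if any(keyword in text for keyword in CRISIS_KEYWORDS):
        return "critical"
    return "routine"

def respond(message: str) -> str:
    """Answer routine messages automatically; hand critical ones to a human."""
    if assess_risk(message) == "critical":
        return "I am connecting you with a human counsellor right now."
    return "I hear you. Can you tell me more about how you are feeling?"

print(respond("I had a stressful day at school"))  # routine: automated reply
print(respond("I want to hurt myself"))            # critical: human handoff
```

A production system would rely on trained classifiers rather than a keyword list, but the division of labour is the same idea: automation for the bulk of conversations, with human attention reserved for crises.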

In Rwanda, the Rwanda Mental Health Survey conducted in 2018 found that 61.7 per cent of respondents were aware of mental health services, yet only 5.3 per cent reported using the services.

The study identified the factors behind the low uptake of mental health services in the country: beliefs and misconceptions that lead people to seek help from religious and traditional healers rather than health facilities, and a lack of accurate information, with many turning to unreliable sources.

The study team advocated for a comprehensive mental health approach and suggested tele-mental health as a potential solution that leverages Rwanda’s ICT infrastructure, a strategy that aligns with the National Artificial Intelligence Policy’s identification of healthcare as one of the sectors AI can help develop.

Using AI chatbots for mental healthcare in Rwanda has the potential to provide personalised interventions tailored to individual and culturally specific needs, as well as offer a safe environment for users to engage with mental health support.

One of the local tech start-ups, Osopox Ltd, whose solution, Bohokapp, emerged as the winner of the iAccelerator 4 competition organised by Imbuto Foundation, has begun exploring this possibility.

Bohokapp is a mobile application that offers accessible and affordable mental health information and treatment services, and the Osopox team is developing an AI-powered chatbot to help individuals manage mental health issues.

Gentil Rafiki, co-founder of the start-up, said the chatbot will use real-world examples and international techniques to offer helpful recommendations and connect users with psychologists when needed.

"As a learning model, it will learn user behaviours and interactions, suggesting articles based on their state and personalising responses as they progress,” he explained.

Rafiki highlighted the chatbot’s potential to provide a safe platform for expression and access to information, particularly for healing psychological wounds among Rwandans, and to help address the shortage of mental health professionals, which he said drives up costs and limits access to services.

Ethical considerations when employing AI chatbots

A World Health Organization report released in February 2023 shed light on challenges around using AI in mental health treatment and research. It found that there are still "significant gaps” in understanding how AI is applied in mental healthcare, flaws in how existing AI healthcare applications process data, and insufficient evaluation of the risks around bias.

Emmanuel Ndayisaba, an AI Engineer at Awesomity Lab, emphasised the importance of considering various factors when developing an AI chatbot for specific purposes like mental healthcare.

This includes identifying the target demographic, assessing the severity of the issues it will address, determining its diagnostic capabilities and potential for recommending prescriptions, and deciding whether it will function as an assistant to human therapists, among other considerations.

Ndayisaba further stressed the need for regulatory approval and licensing for such chatbots to ensure legitimacy and adherence to established criteria.

"An individual or company should also be responsible for any mishaps, such as incorrect recommendations. Measures to mitigate that should also be in place,” he added.

Risk of relying solely on AI therapists

While AI therapeutic chatbots are still relatively new, some individuals have begun to depend solely on them for mental health support.

For instance, Blandine Iradukunda, a student based in Kigali, considers Snapchat’s AI chatbot “My AI” to be a valuable mental health advisor.

Speaking to The New Times, she described it as “the best”, as she confides in the tool about various issues in her life, whether they are related to school or personal matters. She said she finds solace in how it eases her burdens and contributes to her overall well-being.

Similarly, Annah Gaella Muteteli turns to the AI chatbot when feeling down and seeking answers. She appreciates the comforting and positive words it offers, which help lift her spirits, but acknowledges that it is no substitute for professional help with severe conditions.

"What it does is provide automatic responses. It’s programmed, so it’s not suitable for people dealing with serious mental health conditions. They would require someone who can genuinely listen and provide guidance rather than just responding,” she said.

Balancing AI chatbots and human therapists

Anna Mapendo, a psychologist at Imanzi Counseling and Rehabilitation Centre, said that while AI therapy may offer some degree of assistance, it should not be relied upon as a complete solution.

"In therapy, non-verbal communication often plays a crucial role, which is something a human psychologist can perceive, but AI cannot. Human therapists can challenge individuals to reflect on their past experiences and provide a perspective on their struggles. AI therapists, on the other hand, lack this capability and can easily deceive,” she elaborated.

James Mugambe, a counsellor at Safe Place Organisation, acknowledged the potential of AI chatbots in tackling the stigma surrounding mental health, suggesting that their anonymity and accessibility could offer people an alternative to traditional avenues of support.

However, he emphasised that AI tools should not replace the fundamental importance of human connectedness and relationships, stating, “Humans should focus on fostering healthy connections and relationships, as these are inherent to our nature.”

Mugambe suggested that a holistic approach combining AI assistance with professional guidance can save time and enhance overall mental healthcare services.