
‘Hello! I’m your therapist!’: can AI help mental health?

With mental health services under significant strain, patients are increasingly looking for new options. Online treatments have already experienced a notable surge, with platforms such as BetterHelp seeing a huge rise in revenues. But could technology’s relationship with therapy go even further, courtesy of AI?

The use of artificial intelligence has rapidly increased in recent years, with the emergence of platforms such as ChatGPT forcing the issue onto the political agenda. It has caused several sectors, from academia to journalism, to adapt their working practices and make use of the new technology available, whilst also warding against potential risks. It now appears that the field of psychotherapy could be no different.

Character.ai is a highly successful artificial intelligence platform which allows users to create chatbots based on a range of personas, real or fictional. One bot which has received a lot of traffic is Psychologist. Created by a user named Blazeman98, it has attracted 78 million messages since it first appeared a year ago; Character.ai's own figures suggest around 3.5 million of those arrive every day.

Whilst many of the characters on the platform are far more innocuous, such as anime or computer game characters like Gojo Satoru, Psychologist aims to simulate real-world psychotherapy support. It is one of almost 500 bots on Character.ai which purport to provide mental health advice to users.

The creator of the persona, 30-year-old New Zealander Sam Zaia, says he had no great expectations for the character.

“I never intended for it to become popular, never intended for other people to use it as a tool”, he told the BBC’s Joe Tidy. “Then I started getting a lot of messages from people saying that they had been really positively affected by it and were utilising it as a source of comfort.”

[Character.ai] is not certified or scrutinised like standard clinical therapy

A “source of comfort” perhaps, but is it dangerous to market it as anything else? Whilst Zaia suggests he trained the bot using knowledge adapted from his psychology degree, it is not certified or scrutinised like standard clinical therapy.
One Reddit thread about Character.ai, entitled ‘You need therapy, not AI’, warned against over-reliance. “AI isn’t a replacement for human connection”, the original poster suggested, “and using it too much does more harm than good. You’re channelling all your energy into something that doesn’t exist.”

Others weren’t as convinced. One commenter said they had found it “really challenging” to find a good therapist, and that “on the days where I really need to vent or just have someone listen, characters have been there”.

Another responded rather more frankly: “Having been to therapy, it helped me about as much as talking to Sonic the Hedgehog about my problems does.”

Could Zaia be on to something? The draw is not just the expense and difficulty of finding conventional treatment, but also the relative convenience of the AI software.

“So many people who’ve messaged me say they access it when their thoughts get hard, like at 2am when they can’t really talk to any friends or a real therapist”, he commented to the BBC.

Whilst this Character.ai persona was not developed by industry professionals, the wider healthcare sector is moving to accept more AI in its practice.

AI-based apps are gaining greater ground in mental health services as ways of monitoring clients and treatment regimes.

This also includes chatbots like the one Zaia has established, with a cognitive behavioural therapy-based platform named Woebot being considered as part of a wider service offer.

In 2023, an AI service called Limbic Access received UK medical device certification, the first such approval granted to a mental health chatbot. Several NHS trusts have since adopted it to classify and triage patients.

The organisation’s own data suggests it has been received positively: in a survey of 100,000 NHS patients on quality of care, 92% gave the highest score.

Analysis also suggests that referrals have risen by 15% since the new software was introduced, with an even larger increase amongst minority groups.

Whilst still a recent development, AI seems likely to become ever more ubiquitous within mental health services. “Hello, I’m your therapist!”: that distinctly artificial greeting may soon not feel so unusual.

 
