Rise of the robot therapists and the psychiatry apps vying to replace humans


As a society, we are increasingly waking up to the stresses and strains that everyday life places on our mental health.

Research suggests that mental illness is no more prevalent than it was twenty years ago, but that the stigma around acknowledging mental distress is dissipating. Be it anxiety, depression or dementia, British charity Mind says that one in four people will suffer from mental distress in any given year.

Seeking help, then, is increasingly less taboo, but there is still work to be done: mental health care remains under-resourced across the world. This is perhaps why many are taking a different route to self-care during this mental health revolution: robot therapists.

This is not so much about you lying down on a couch while a C-3PO lookalike grills you about your feelings. Instead, a new wave of psychiatric ‘chatbots’ are finding a place on people’s phones. ‘AI life coach’ app Wysa boasts 1.2m users, while Woebot, a similar app backed by Stanford University, launched earlier this year and is already used by hundreds of thousands of people. Swedish medical company Flow is the latest to join the growing selection of AI therapists, with an app that offers users advice on sleeping better, self-care tips and meditation techniques.

“Different people will benefit from different treatments for mental health problems,” says Kerry McLeod, Head of Information Content at Mind. “Digital innovations like chatbots are increasingly popular and useful for people who feel unable to use other, more traditional, forms of talking therapies like counselling.”

Swedish-developed Flow is the latest chatbot therapy app and can be paired with a ‘brain stimulation’ headset CREDIT: FLOW

On opening the app, the chatbot may ask you questions about your day: what you are doing, how you are feeling, or what you are grateful for. It aims to encourage positive thinking. When Woebot asked me about my day and I said it was ‘okay’, the chirpy robot replied that sometimes we need okay days to make the others ‘shine brighter’.

Wysa, meanwhile, will ask you to list your goals and then break them down into achievable chunks. While the apps can feel like a breezy, if repetitive, conversation at times, they are designed to find ways to impart wisdom and encourage positive thinking.

The apps use well-researched ‘cognitive behavioural therapy’ – which Woebot describes as a ‘structured, effective way to challenge how you’re thinking about things’ – to build a profile of each ‘patient’ as they enter responses each day.

The apps then use machine learning to look for mood-affecting patterns that are ‘sometimes hard for humans to see’. Not only that, but these therapists live in your phone and – while they will check in with you at set times each day – are available 24/7.

It seems rather too good to be true. But there is a growing feeling among experts that these chatbots, while not able to replace a human doctor (a fact the apps are quick to establish themselves), are helpful for battling anxiety and depression.

In Flow’s case, the Malmö-based company is in talks with the NHS to make the app and an accompanying brain stimulation headset available on prescription.

Apps like Woebot will ‘check in’ with you and use machine learning to tailor its advice to your responses

“Sometimes it feels like it can’t get personal to you like a real therapist can,” reads one review of Wysa. “But I have found that it’s a great tool in thought control and awareness. It’s been helping me reframe my thinking to more positive thoughts.”

It is also suggested that people who are nervous about talking to a human about their mental health issues, or simply unable to, may be more open when interacting with an app. “It’s comforting to know that I can spend five or ten minutes at any moment of the day to work through a problem,” says a Woebot user. “Not being reliant on another human being is really very freeing.”

“It seems less likely to carry bias than an in-person therapist,” says another.

Broadly, experts agree that, used in the right way, psychiatry apps can be a force for good. A study into ‘embodied AI’ published by the Technical University of Munich in May, for instance, said that the applications have “enormous potential”.

“They can make treatment accessible to more people because they are not limited to specific times or locations,” it says. “In addition, some patients find it easier to interact with AI than with a human being.”

Wysa offers exercises for users to perform CREDIT: WYSA

However, the same study also expressed ethical concerns over the rise of AI chatbots, saying that there is an ‘urgent need for action on the part of governments, professional associations and researchers’ to draw up guidelines for the use of virtual psychiatrists. It adds that AI methods ‘cannot and must not be used’ as a cheaper substitute for treatment by human doctors.

“More research is needed to see the impact that using chatbots has,” says McLeod. “We wouldn’t advocate that chatbots should ever completely replace more traditional forms of face-to-face therapy.

“Any chatbot on the market should be thoroughly user tested, be properly responsive, use the right language and tone, and have appropriate accurate referral mechanisms for people in crisis. There needs to be clear regulatory responsibility around the use of chatbots to ensure they are safe for people using them, who may be vulnerable and in need of support.”

As it is, the current wave of mental health chatbots do have broad safeguards in place, and all data is anonymised. If a user types ‘SOS’ into Woebot, for example, or types messages that indicate self-harm or suicidal thoughts, the app will go into ‘Crisis Mode’. This triggers the app to say the situation is ‘beyond what it can do’ before providing a list of advice and contact details, such as a suicide prevention line.

It is in these situations that an emerging AI chatbot will naturally fall short. In such severe cases, an in-person therapist is legally obligated to intervene if a patient becomes a threat to themselves or others. This is not an area where a chatbot could, or should, become involved. Even if the technology improves to a point where it is capable of such intervention, the ethical guidelines that TUM and McLeod describe would need to be sufficiently robust to deal with life-or-death situations.

“Digital mental health technology is one way forward, especially for people who have more mild symptoms,” says Professor Dame Til Wykes of the National Institute for Health Research. “But any such interventions need to be based on evidence-based therapeutic principles, have transparent privacy principles and be able to spot when human intervention is vital for safety.”

So robot therapists may not be replacing human doctors any time soon. But if their focus on mood and mindfulness can help ease anxiety in an ever more stressful world, their good-natured chirping may well have a place in the mental health revolution.