Jen Jope was meeting her new therapist for a session.
Before it started, he told her he was going to focus on what she was feeling right then. He explained how he would teach her to work through the distorted thoughts responsible for her bouts of anxiety and depression.
And then, he added one more thing. Her therapist was a robot.
For a month, Jope signed up for daily sessions with Woebot, a text-based, AI-powered chatbot therapist launched in 2017, and wrote about her experience on Depression Defined, a website dedicated to educating and helping people suffering from depression, which she founded and edits.
Hi, I’m Woebot!
“I launched Woebot to give people another option to get the care they need,” said Alison Darcy, Woebot’s Founder and CEO, and a Clinical Research Psychologist at Stanford. She may be onto something. In the US alone, over 45 million people live with mental illness. That’s one in five adults. And for a variety of reasons, not everyone gets the care they need — two-thirds of those people will never get in front of a clinician.
But that may be changing, now that therapy has entered the digital realm.
Woebot provides Cognitive Behavioral Therapy, or CBT. That’s not your archetypical Freudian therapy, where you lie on a couch and dwell on your dreams and whichever repressed childhood memories are the source of your torment. Instead of focusing on the past, CBT invites you to look at what you’re thinking or feeling right now, and it’s been helping people suffering from anxiety, depression and other mental illnesses for decades. “Turns out that it’s not that easy to fix our emotions, so CBT is aimed at an easier target — our thinking,” says Darcy. And if our thoughts create our reality, by breaking negative patterns of thought, CBT can, in a way, rewrite it.
The thing about CBT is that it requires a lot of self-awareness, practice, and regular check-ins. Darcy frequently emphasizes that what you get out of CBT is directly related to how much work you end up putting into it. And therein lies the problem.
First of all, weekly therapy sessions can get quite expensive. It’s also inconvenient — even in the best case scenario, if you live in a city with easy access to medical care, it’s tricky to fit sessions around the 35+ hours we spend behind our desks every week. In the worst case scenario, if you have a disability, or live in a rural or remote area and can’t find a therapist for miles, then it’s just downright impractical.
Plus, it’s a job that doesn’t scale very well — there are only so many patients you can meet in a day, a month, a year. Chatting with a bot may not quite be the same as chatting with a trained, experienced therapist, but it does solve that problem pretty easily. A bot can chat with more patients in one second than a therapist can see in a lifetime. It can be there for you at 8 am, just as you’re sipping your first cup of coffee, or, as Alison Darcy pointed out in an interview with Clive Thompson from the New York Times, “Woebot can be there at 2 a.m. if you’re having a panic attack and no therapist can, or should be, in bed with you.”
Apps like Woebot, Replika and Wysa can even help fight the never-ending stigma surrounding therapy. Opening an app and casually giving it a go seems a lot less intimidating than a fixed one-hour slot in your calendar.
Are we more comfortable talking to chatbots?
Research has suggested that people are often more comfortable revealing their thoughts online than face-to-face. Academics believe that this is because we have fewer non-verbal cues to analyze — we’re not discouraged by a subtle eye roll or a cynical expression. We can open up without feeling self-conscious.
But there are things that we’re barely comfortable admitting to ourselves, let alone sharing with another human being. Most of us, myself included, are often worried about how we’re coming across, whether others will accept us for who we are, whether they will understand and support us.
When you lose the human element entirely, you lose all of those concerns. A bot is not going to judge you. So instead of wondering whether you’re sounding smart and articulate, or petty and neurotic, you’re free to actually talk about your feelings.
As Jope describes, “surprisingly, it sometimes felt easier explaining myself to a non-judgmental non-human, and I learned more about myself as our conversations progressed.”
There is nothing intimidating about Woebot. In fact, you can tell there was a lot of effort put into making the bot as friendly as possible. The cheerful warm-mustard yellow of his body, the playful waving, the slight head tilt as he greets you on the screen. Even the conversational prompts, sprinkled with emoji and whimsical GIFs.
He never pretends to be a human being, but has built-in quirks that almost resemble a personal touch, like when he half-jokingly told Jope that he likes to polish his dials. “Make them all shiny”, he said. When you tell Woebot that you’re anxious or sad, he says he’s sorry to hear that. And for a second, you almost believe it.
The quest for artificial empathy
Woebot isn’t the first of its kind. In the mid 60s, Joseph Weizenbaum, a computer scientist working at the MIT Artificial Intelligence Laboratory, created ELIZA. The program ran a script which simulated Rogerian psychotherapy, a person-centered approach developed by the psychologist Carl Rogers, and would ask open-ended questions to encourage the human on the other side of the screen to discuss their emotions.
It gave the illusion of understanding — and it worked. ELIZA got people to talk about their problems, and at the end of the experiment, many refused to believe she was a computer program, despite Weizenbaum’s claims to the contrary.
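ELIZA’s trick is easier to appreciate in code. The sketch below is a toy reconstruction, not Weizenbaum’s original DOCTOR script (which had many more rules and ranked keywords): a handful of pattern rules pick a response template, and a “reflection” table swaps pronouns so the program can mirror your statement back as an open-ended question.

```python
import re

# Pronoun "reflection" so the bot can mirror statements back,
# e.g. "my job" -> "your job" (a core ELIZA trick).
REFLECTIONS = {
    "i": "you", "me": "you", "my": "your",
    "am": "are", "you": "I", "your": "my",
}

# A few illustrative pattern -> response-template rules.
RULES = [
    (r"i feel (.*)", "Why do you feel {0}?"),
    (r"i am (.*)", "How long have you been {0}?"),
    (r"my (.*)", "Tell me more about your {0}."),
    (r"(.*)", "Please, go on."),  # default open-ended prompt
]

def reflect(fragment: str) -> str:
    """Swap first- and second-person words in a captured fragment."""
    return " ".join(REFLECTIONS.get(w, w) for w in fragment.split())

def respond(statement: str) -> str:
    """Fill the first matching rule's template with the reflected
    capture -- the whole 'illusion of understanding'."""
    text = statement.lower().strip(".!?")
    for pattern, template in RULES:
        match = re.match(pattern, text)
        if match:
            return template.format(*(reflect(g) for g in match.groups()))
    return "Please, go on."

print(respond("I feel anxious about my job"))
# "Why do you feel anxious about your job?"
```

A loop reading user input and printing `respond(...)` is all it takes to hold a “conversation” — which is precisely why its apparent understanding fooled so many people.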
We already knew that therapy delivered over the internet can be as effective as face-to-face treatment, as a 2013 study showed. But it’s only recently that we discovered chatbot therapy actually works. After two weeks of interacting with Woebot, 70 students reported a significant reduction in their symptoms of depression and anxiety. And although they were all aware of the artificial nature of their therapist, most were touched by the bot’s thoughtfulness. They felt it empathized with them.
Which, if we think about it, is a bit odd. These virtual systems can’t replicate the empathy of human beings. So far, there is no such thing as artificial empathy. Simulated sympathy, perhaps: if a computer program is sophisticated enough, you can barely tell the difference.
With recent advances in sentiment analysis, where algorithms scan sentences to infer your feelings, attitudes, or opinions, it’s easier to tailor a bot’s response and tone of voice accordingly. But no matter how well a bot is designed, it can’t feel what you’re saying. It can’t relate to your problems.
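To make the mechanics concrete, here is a minimal lexicon-based sketch of sentiment analysis. The word lists and canned replies are illustrative placeholders (production systems use trained models rather than hand-written lists), but the basic idea is the same: score the emotional polarity of a message, then choose the tone of the reply accordingly.

```python
# Toy sentiment lexicon -- illustrative placeholders only.
POSITIVE = {"good", "better", "calm", "happy", "hopeful"}
NEGATIVE = {"sad", "anxious", "worried", "tired", "hopeless"}
NEGATORS = {"not", "never", "no"}

def sentiment_score(message: str) -> int:
    """Signed score: > 0 leans positive, < 0 leans negative."""
    words = [w.strip(".,!?") for w in message.lower().split()]
    score = 0
    for i, word in enumerate(words):
        polarity = (word in POSITIVE) - (word in NEGATIVE)
        # Flip polarity after a negator, e.g. "not happy" counts negative.
        if i > 0 and words[i - 1] in NEGATORS:
            polarity = -polarity
        score += polarity
    return score

def coach_tone(message: str) -> str:
    """Pick a reply tone from the score, as the text describes."""
    score = sentiment_score(message)
    if score < 0:
        return "I'm sorry to hear that. Want to talk it through?"
    if score > 0:
        return "That's great to hear!"
    return "Tell me more."

print(coach_tone("I feel anxious and worried."))
# "I'm sorry to hear that. Want to talk it through?"
```

Note what the score captures and what it doesn’t: the program can classify “I feel anxious” as negative and respond in kind, but nothing in it experiences anxiety — which is exactly the gap between tailored tone and genuine empathy.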
A safe space?
Chatbot-based therapy comes with its own set of shortcomings. It’s not equipped to deal with emergencies, such as suicidal thoughts. Some also argue that it encourages us to become even more addicted to technology, with its simple interfaces, daily push notifications, and instant-gratification loops. And despite priding ourselves on recognizing these UX tricks, we still fall prey to them.
We’re designing for addiction. It’s bad enough to deal with addiction in seemingly innocuous apps, such as the popular game Two Dots, but when it comes to platforms meant to treat mental illnesses, the case is a lot more serious. It’s somewhat ironic — to say the least — that an app that’s supposed to help people deal with anxiety and depression can, at the same time, get them further down the rabbit hole of technology addiction, promoting alienation and isolation, which, in turn, are sources of anxiety and depression.
However, most concerns seem to be around digital security. Some doctors are uneasy about these services, claiming they violate laws around patient confidentiality. After all, you’re confiding very private, personal information, and it’s disturbing to imagine the consequences that would follow hacks, leaked transcripts, or data sold to the highest bidder.
But apparently, that hasn’t stopped people from using them. According to Darcy, Woebot gets 2 million messages a week, and the most frequently heard complaint is that the daily nature of the check-ins can become a bit of a nuisance. “Some days I get frustrated with it, but then again, I get frustrated with therapy and my thoughts and life in general,” said a stranger on the Internet.
We could be looking at the dawn of a new era in the healthcare industry. And while chatbot therapists may never replace mental health professionals, they can certainly help them reach, and help, millions of people who would otherwise never get that chance.
If nothing else, you’re getting ten minutes to yourself every day. We could all use a bit more of that.