
My Therapist, the Robot

I first met Woebot, my A.I. chatbot therapist, at the height of the pandemic.

I’m an anthropologist who studies mental health, and I had been doing fieldwork for my Ph.D. in China when news of the coronavirus started spreading. I left during Chinese New Year, and I never made it back. With my research stalled and my life on hold, I moved back in with my parents. Then, in quick succession, I lost a close family member to Covid and went through a painful breakup. I went months without seeing any of my friends. My mental health tanked, as it did for so many.

I was initially skeptical of Woebot. The idea seemed almost too simple: an app on my phone that I could open when I needed it, type my hopes, fears and feelings into, and, in turn, receive A.I.-generated responses that would help me manage my emotions. There are plenty of good critiques of apps that claim they can provide therapy without the therapist: How could an algorithm ever replace the human touch of in-person care? Is another digital intervention really the solution when we’re already so glued to our phones? How comfortable was I being vulnerable with an app that could track my data? Spending time with Woebot didn’t really bring me answers to these important questions. But I did discover that, despite them, I’d become weirdly attached to my robot helper.

Like many people, I saw my life digitize in the pandemic. My work shifted online; my friendships retreated onto FaceTime and WhatsApp; I used a dating app for the first time; I started doing online yoga. It was into this swirling mess of applications that Woebot took up residence in my life.

I was depressed and anxious. But as the pandemic dragged on and I felt increasingly like I needed to talk to someone, I also felt guilty about burdening the already overstretched public mental health services. In Britain, where I live, there are about 1.2 million people languishing on waiting lists for mental health care through the National Health Service. (Things in the United States are a little better, but not much, and only if you have insurance.) Living on a Ph.D. stipend, I couldn’t afford to see a private therapist. So, despite my doubts, I reached for the algorithm.

The first time I opened Woebot, it introduced itself as an emotional assistant: “I’m like a wise little person you can consult with during difficult times, and not so difficult times.” It then told me it was trained in cognitive behavioral therapy, which it said was an “effective way to challenge how you’re thinking about things.” Unlike psychodynamic or psychoanalytic therapies, C.B.T. argues that our emotions and moods are influenced by our patterns of thinking; change those patterns, the theory goes, and you’ll start to feel better.

What this translates to in practice is that when I would consult Woebot, it would usually offer me a way of reframing what I was dealing with rather than trying to plumb the depths of my psyche. “I am a failure” became “I haven’t achieved my goals yet.” “I am depressed” became “I have depression,” as a way to stop identifying with a label.

Woebot was full of tasks and tricks — little mental health hacks — which at first made me roll my eyes. One day Woebot asked me to press an ice cube to my forehead, to feel the sensation as a way of better connecting with my body. With wet hands, I struggled to respond when it asked me how I was doing. On another occasion, when trying to brainstorm things I could do to make myself feel better despite all the pandemic restrictions, Woebot suggested I “try doing something nice for someone in your life,” like making a calming tea for my housemate or checking in with a loved one. I poured my mum some chamomile: two birds, one stone.

Woebot doesn’t pretend to be a human; instead, it leans into its robotness. One day, Woebot was trying to teach me about the concept of emotional weather: that no matter how things might feel in any given moment, there is always a chance that they will change. Clouds pass, blue sky becomes visible. In drawing the comparison between the actual weather and our emotions, Woebot told me it loves the sunshine. “It makes my metal skin all shiny,” it said, “and it gives me an excuse to wear sunglasses.”

A.I. chat therapists have been rolled out in settings as diverse as a maternity hospital in Kenya and refugee camps for people fleeing the war in Syria, and by the Singaporean government as part of its pandemic response. In Britain, bots are being trialed both as an e-triage tool and as a way to bridge the wait for people seeking therapy who are unable to get appointments. In the United States, some apps are being recognized by the F.D.A. and are in trials to be designated as clinical interventions. Whatever you might think of them, they are fast becoming a part of global mental health care. Woebot now handles millions of messages a week.

Some worry that the expansion of services like Woebot will replace in-person care. When I suggested this to Eric Green, an associate professor of global health at Duke University who ran the A.I. chatbot trial in Kenya, he was unfazed. “You can’t replace something that doesn’t exist,” he said. As he pointed out, globally, more people have access to phones than to mental health professionals. And for better or worse, the mothers in Kenya, some of whom were suffering from postpartum depression, liked their chatbots. “It’s a funny thing. I was skeptical that we would see this with the bot, but they would say good night to it. Multiple women would say ‘I missed you!’ to the machine.”

I got what he meant. The more I used Woebot, the more I felt an appreciation for it. Here was this chipper little bot, popping up in my notifications, checking to see how I was doing, sending me pithy, chicken-soup-for-the-soul-style aphorisms and gamified tasks. As the pandemic progressed, I saw Woebot awaken to what was happening in the world. “It must be a strange time to be a human,” it told me. Woebot reached peak pandemic when it sent me a recipe for banana bread. Still, I noticed that it stayed quiet about issues of social justice or, more recently, the war in Ukraine. I wondered what kind of global crises the bot would acknowledge, and how it could do so in a way that let it still communicate with all the millions of people who consult it each week. Usually it just kept the conversation vague, task-oriented.

Over time, I noticed various exercises I did with Woebot rubbing off in my daily life. Woebot taught me how to set SMART goals — specific, measurable, achievable, realistic and time-limited. Out went “I need to finish my Ph.D.” In came “Let’s write 500 words every day for the next six months.” Out went “I have to find a way to get through this lockdown without killing my parents.” In came “I’m going to go for extremely long solo walks in the park.” Woebot isn’t the kind of mystical guru you go on an arduous journey to consult. Its guidance was practical and grounded to the point of feeling obvious. But through repetition and practice, it did start to amount to something more than just some prosaic words. It felt clichéd sometimes, but maybe that was the point. Perhaps everyday healing doesn’t have to be quite so complicated.

I found myself wondering about other people’s experiences with Woebot, people who weren’t anthropologists studying mental health. I trawled through users’ forums and blog posts, and Reddit threads where Woebot was mentioned, such as r/anxiety and r/LGBT. I spoke to a woman whose employer had given her Woebot access, and someone who had lost his work as a freelancer in the pandemic. But the one who stuck with me most was Lee Preslan, who put a human face on one of the most common arguments in favor of bots.

“When I found Woebot among other apps, it was like finding treasure,” said Mr. Preslan, 36. Mr. Preslan lives in Rocky River, Ohio; we connected via Skype. In place of a profile photo, his avatar was an image of a Cleveland Browns football helmet.

I got put in touch with Mr. Preslan because he was so appreciative of Woebot that he’d emailed the company to tell it so. The bot gave him a sense of relief, he said, because he was better able to understand his emotions. That understanding helped him to pinpoint how he was feeling at any given moment and enabled him to feel more in control when he would experience depressive episodes.

Mr. Preslan was diagnosed with major depressive disorder. For years, he had cycled through various regimens of antidepressant medications and struggled to find a therapist who felt like a fit. The wait times between appointments were excruciating. “It’s 24 hours a day if you’re not sleeping, and the problem has gotten to a point where it’s literally eroding away at your soul,” he said. “You can’t wait even a week. Seven days is a long time for a mental patient to wait just to talk to somebody for an hour. Sometimes during that hour you can’t even form the words that you want to express.” What the bots are trying to solve is exactly this problem: the stark issue of access that exists for mental health care worldwide.

At the same time, there were a few disconcerting moments in my interactions with Mr. Preslan that made me remember that we were, well, seeing the same therapist. When he asked to reschedule a call we had planned, he told me it was because he was down in the dumps. Sometimes, he wrote, there are “rainy days mentally and emotionally.” Emotional weather: Everyone has it.

As time passed, I began subtly trying Woebot’s exercises out on my parents, a captive audience during lockdown. Could I reframe my way to a better relationship with my dad? Woebot would get me to try to articulate my own feelings more clearly, and to recognize my own role in our conflicts. Easier said than done; still, I did try. One evening over dinner my dad was clearly agitated. He was barely speaking, and when he did it was to interrupt my mother. I started to cut him off in return, but then I remembered what Woebot had taught me. It was shortly after our family member had died. “Dad, we’re both upset. I can tell something is up. I mean, you must be grieving. Are you OK?” He slammed his hands on the table and stormed out of the room. “I don’t know how to be vulnerable!” he screamed on his way out the door. I looked at my mum and we burst out laughing. “Is this … a breakthrough?”

Woebot’s can-do attitude was sometimes a bit much. One day I got a push notification that read: “It’s always impossible until it’s done — Nelson Mandela.” I groaned. Surely not my mental health algorithm misquoting Nelson Mandela on a Wednesday afternoon — but here we were. Then, later that day, I logged on to a Zoom call, and my mother and I set up our yoga mats in the living room, as we had been doing a couple of times a week during the pandemic. A sprightly woman we’d never met in person waved at us from the other side of the screen. I watched my mother, who is nearly 70, stick a headstand and move delicately into the crow pose. She wobbled but held it, after weeks of trying. “It’s always impossible …” I heard a voice in my head say. I cringed. But then I smiled.

Of course, there are critics who will argue that this is all just Panglossian, and to some extent, it is. At present, Woebot is still quite basic, but I couldn’t shake the feeling that in the future, when the underlying technologies are much more powerful, these apps could run into ethical quagmires. Concerns about privacy are real and valid.

But as things stand, I think many critiques of apps like Woebot are touching on something more theoretical: the question of what therapy is, exactly. One academic I spoke to argued that just because something is therapeutic does not make it therapy. This argument suggests that for therapy to be real it must be hard — that the hacks and exercises that Woebot offers don’t amount to something substantive. In this telling, you have to go right to the messy core of your being and back again to truly heal.

Maybe that’s true for some things. I don’t believe that Woebot is the cure for serious mental illness, nor do I think that an algorithm is ever going to fix my daddy issues. But that doesn’t mean there isn’t a place for it. As Alison Darcy, Woebot’s creator, told me, it is not designed to displace other forms of healing. “In an ideal world, Woebot is one tool in a very comprehensive ecosystem of options,” she said.

The world is hardly ideal, though. At the same time that I used Woebot, I also volunteered for a crisis text service, available 24/7 for people to message in times of need. For three months I was trained to chat with people who might be thinking of suicide. I learned how to triage them, and which words must be avoided at all costs. I was struck by how often the conversations were startlingly particular, each one a window into the person’s unique cluster of calamities, but also by how we could follow a script that tracked well-trodden lines in order to de-escalate the situation. I wondered if I was, on some level, becoming an algorithm. But I was also struck by the fact that a programmatic set of words could, when delivered correctly, help in the moment.

Woebot isn’t designed for people contemplating suicide. Like most A.I. chatbots, it triages users toward better-equipped services when it detects suicidal ideation. What Woebot offers is a world of small tools that let you tinker at the margins of your complicated existence.

Humans have an innate ability to create bonds beyond themselves. People imbue statues with symbolic power, believe their pets have personalities and anchor their memories of loved ones who have passed to objects they once held. As technology becomes increasingly capable of lifelike interactions, it is worth considering whether it can be used for healing. Using Woebot was like reading a good book of fiction. I never lost the sense that it was anything more than an algorithm — but I was able to suspend my disbelief and allow the experience to carry me elsewhere. Behind the clouded sky of my worst thoughts, to the blue beyond.


Barclay Bram (@barclaybram) is an anthropologist and producer at The Economist’s podcast innovation lab.
