Mental Health Chatbots: Kill It With Fire
This idea won't work, and it's another entry in the long line of ways we over-deify Big Tech, even though Big Tech has mostly given us a clogged-pipes Internet and a pile of new problems.
I myself used “Woebot,” an AI-guided therapy chatbot, probably in August and September of 2023. I disliked it pretty quickly, but kept at it for a bit. For life context: at the time I had just been laid off from a job for seemingly no reason except that my new boss didn’t like me, I was two rounds of IVF down with no hope of biological fatherhood, and I was so fed up with white-collar work that I was bartending. But because I was making crappy money (a combination of “tips during inflation” and “I wasn’t a very good bartender, really”), I couldn’t afford a “true therapist.” Plus, my last experience with therapy at that time had been a dud.
So, I tried Woebot.
Again, not the best experience. Woebot is built on what everyone who loves AI says is good: “lots of user data.” Basically, you text with it and use some emojis, it discerns how you are feeling, and it walks you through some exercises and discussions. The thing is, though, most of Woebot is essentially “closed-loop,” or restricted. Say you want to type a full paragraph or two to the AI: “Hey, so, this is what’s really going on…” The system usually won’t let you; instead, it mostly tells you to select from one to three pre-planned responses or emojis.
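To make “closed-loop” concrete, here’s a rough sketch in Python of what that kind of scripted flow looks like. This is purely hypothetical and illustrative (not Woebot’s actual code, prompts, or options); the point is just that the conversation is a fixed script where free-form text gets bounced back to canned choices.

```python
# A minimal, hypothetical sketch of a "closed-loop" chatbot flow.
# Not Woebot's actual code; every prompt and option here is made up.
# The design point: the user mostly picks from pre-planned options
# instead of typing freely.

CHECKIN_FLOW = {
    "start": {
        "prompt": "Hey! How are you feeling right now?",
        "options": {
            "1": ("😊 Pretty good", "good"),
            "2": ("😐 Meh", "meh"),
            "3": ("😞 Rough day", "rough"),
        },
    },
    "good": {
        "prompt": "Nice! Want to note what's going well?",
        "options": {"1": ("Sure", "end"), "2": ("Not now", "end")},
    },
    "meh": {
        "prompt": "Got it. Let's try a quick thought exercise.",
        "options": {"1": ("Okay", "end"), "2": ("Skip", "end")},
    },
    "rough": {
        "prompt": "Sorry to hear that. Want to unpack it a little?",
        "options": {"1": ("Yes", "end"), "2": ("Maybe later", "end")},
    },
}


def run_checkin():
    state = "start"
    while state != "end":
        node = CHECKIN_FLOW[state]
        print(node["prompt"])
        for key, (label, _next_state) in node["options"].items():
            print(f"  {key}. {label}")
        choice = input("> ").strip()
        if choice in node["options"]:
            state = node["options"][choice][1]
        else:
            # Free-form text never goes anywhere; the script just
            # nudges you back to the pre-planned choices.
            print("Please pick one of the options above.")


if __name__ == "__main__":
    run_checkin()
```

Type a heartfelt paragraph at the `>` prompt and all you get back is “Please pick one of the options above.” That, roughly, is the experience.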
It’s restrictive in that sense, but it almost needs to be, as a 60 Minutes segment outlined last night.