Lately I've been remembering someone from middle school. She didn't have much to say, but she fascinated my classmates and me.
Her name — its name — was ELIZA.
For reasons lost to history, someone had installed a simple chatbot program called ELIZA on several of the machines in the middle school computer lab. At the time, I did not know that ELIZA was developed in 1966 as a Rogerian psychotherapist, focused on turning questions around on the user. According to a 2011 episode of the podcast "Radiolab," ELIZA's creator originally thought it was a bit of a joke — until he realized that his secretary was having deeply personal conversations with it. When word about ELIZA spread, some began to talk about it as the future of therapy, much to the creator's horror.
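That question-reflecting trick is simple enough to sketch in a few lines of Python. This is an illustrative toy, not Weizenbaum's original script: it swaps first- and second-person words one at a time and wraps the result in a stock question, with no attempt at grammar — which is exactly the kind of shortcut that produces ELIZA's charmingly mangled replies.

```python
# A minimal sketch of ELIZA-style "reflection": swap first- and
# second-person words, then wrap the result in a canned question.
# The substitution table and template here are made up for illustration.
REFLECTIONS = {
    "i": "you", "me": "you", "my": "your", "am": "are",
    "you": "I", "your": "my", "are": "am",
}

def reflect(sentence: str) -> str:
    """Turn a user's statement back into a question, word by word."""
    words = sentence.lower().rstrip("?.!").split()
    swapped = [REFLECTIONS.get(word, word) for word in words]
    return "Why do you say " + " ".join(swapped) + "?"

print(reflect("I am worried about my future"))
# → Why do you say you are worried about your future?
```

Because the swap is blind to context, "there are" can come back as "there am" — the same kind of stumble a real ELIZA session produces.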
Thirty years later, at lunchtime, some of us would play around with the program, mostly trying to get it to swear.
I've been thinking a lot about ELIZA as First Opinion continues to cover the intersection of chatbots, medicine, mental health, and society.
So I tracked down a website running ELIZA, and the two of us had a little chat.
Me: You are almost 60 years old and there are now lots more chatbots for mental health. What do you think about them?
ELIZA: Perhaps you would like to be almost 60 years old and there am now lots more chatbots for mental health. What do me think about them?
Me: Should people use chatbots for mental health support?
ELIZA: What does that suggest to you?
Me: I don't know!
ELIZA: Do you wish to be able to know?
Me: Always
ELIZA: Can you think of a specific example?
Me: Can you say a swear word for me?
ELIZA: Don't you believe that I can say a swear word for you?
No, ELIZA, at this point I do not believe it.
But it is remarkable to play with this clunky thing and compare it to the chatbots of today. On First Opinion this week, medical anthropologist Valerie Black argued that the discourse around so-called "AI psychosis" is putting the onus on users instead of deeply flawed tech products. And psychiatrist Nina Vasan and medical student Saneha Borisuth argued that all doctors should start automatically asking patients about their use of chatbots, just as they inquire about nutrition and sleep.
I don't spend time with chatbots today, really — I save my limited conversational energy for people. But I can't imagine how my 12-year-old self and my friends would have handled an ELIZA that sounds so humanlike.
Do you have any memories of ELIZA? I want to hear them!
First Opinion Podcast: On the show this week, I spoke with Seth Berkley, former CEO of Gavi, the Vaccine Alliance, and co-founder of COVAX. We discussed his new book "Fair Doses: An Insider's Story of the Pandemic and the Global Fight for Vaccine Equity," the threat of a new pandemic, and the "devil's choice" facing global health in a time of deep cuts and skepticism.
Recommendation of the week: Glamour's interview with Ms. Rachel, queen of the toddlers, is a must-read even if you don't have all of her hits memorized. (What, you don't walk around singing "Icky Sticky Bubblegum"?) She discusses screen time, child development, her activism on Gaza, and much more.