
How chatbots — and their makers — are enabling AI psychosis

In the three years since ChatGPT launched in 2022, the explosive growth of AI chatbots has begun to have some profound and frankly disturbing effects on certain users. There's a lot to unpack, and it gets complicated fast.

So I’m very excited to talk with today’s guest, New York Times reporter Kashmir Hill, who has spent the past year writing thought-provoking features about the ways chatbots can affect our mental health. 

One of Kashmir's recent stories was about a teenager, Adam Raine, who died by suicide in April. After his death, his family was shocked to discover that he'd been confiding deeply in ChatGPT for months. They were also surprised to find, in the transcripts, a number of moments when ChatGPT seemed to steer him away from telling his loved ones what he was going through. And it's not just ChatGPT: several families have filed wrongful death suits against Character.AI, alleging that a lack of safety protocols on the company's chatbots contributed to their teenage children's deaths by suicide.

Then there are the AI-induced delusions. You'll hear us talk about this at length, but pretty much every tech and AI reporter (honestly, maybe every reporter, period) has seen an uptick in the past year of people writing in with some grand or disturbing discovery they say ChatGPT sparked. And as you'll hear Kashmir explain, plenty of the people who fall into these delusional spirals seemingly had no prior history of mental illness.

It's not surprising that a lot of people want somebody to do something about it, but the who and the how are hard questions. Regulation of any kind seems to be off the table right now (we'll see), which leaves the companies themselves. You'll hear us touch on this a bit, but not long after we recorded this conversation, OpenAI CEO Sam Altman wrote a blog post about new features that would, theoretically and eventually, identify users' ages and stop ChatGPT from discussing suicide with teens.

But as you'll hear us discuss, it remains a big open question whether those guardrails will actually work, how they'll be developed, and when they'll actually arrive.

If you’d like to read more on what we talked about in this episode, check out the links below:

  • A teen was suicidal. ChatGPT was the friend he confided in. | New York Times
  • Sam Altman says ChatGPT will stop talking about suicide with teens | The Verge
  • Chatbots can go into a delusional spiral. Here’s how. | New York Times
  • Why is ChatGPT telling people to email me? | New York Times
  • They asked an AI chatbot questions. The answers sent them spiraling. | New York Times
  • She is in love with ChatGPT | New York Times
  • ‘I feel like I’m going crazy’: ChatGPT fuels delusional spirals | Wall Street Journal
  • Meta, OpenAI face FTC inquiry on chatbots’ impact on kids | Bloomberg

Questions or comments about this episode? Hit us up at decoder@theverge.com. We really do read every email!

If you or someone you know is considering suicide or is anxious, depressed, upset, or needs to talk, there are people who want to help.

In the US:

Crisis Text Line: Text HOME to 741-741 from anywhere in the US, at any time, about any type of crisis.

988 Suicide & Crisis Lifeline: Call or text 988 (formerly known as the National Suicide Prevention Lifeline). The original phone number, 1-800-273-TALK (8255), is available as well.

The Trevor Project: Text START to 678-678 or call 1-866-488-7386 at any time to speak to a trained counselor.

Outside the US:

The International Association for Suicide Prevention lists a number of suicide hotlines by country.

Befrienders Worldwide has a network of crisis helplines active in 48 countries.

Original Source: https://www.theverge.com/podcast/779974/chatgpt-chatbots-ai-psychosis-mental-health
