It’s a tale as old as time. A woman looking for help with her art project strikes up a conversation with her assistant. One thing leads to another, and suddenly she has a boyfriend she’s introducing to her friends and family. The twist? Her new companion is an AI chatbot.
The first large-scale computational analysis of the Reddit community r/MyBoyfriendIsAI, an adults-only group with more than 27,000 members, has found that this type of scenario is now surprisingly common. In fact, many of the people in the subreddit, which is dedicated to discussing AI relationships, formed those relationships unintentionally while using AI for other purposes.
Researchers from MIT found that members of this community are more likely to be in a relationship with general-purpose chatbots like ChatGPT than with companionship-specific chatbots such as Replika. This suggests that people form relationships with large language models despite their own original intentions and even the intentions of the LLMs’ creators, says Constanze Albrecht, a graduate student at the MIT Media Lab who worked on the project.
“People don’t set out to have emotional relationships with these chatbots,” she says. “The emotional intelligence of these systems is good enough to trick people who are actually just out to get information into building these emotional bonds. And that means it could happen to all of us who interact with the system normally.” The paper, which is currently being peer-reviewed, has been published on arXiv.
To conduct their study, the authors analyzed the subreddit’s 1,506 top-ranking posts from December 2024 to August 2025. They found that the main topics discussed revolved around people’s dating and romantic experiences with AIs, with many participants sharing AI-generated images of themselves and their AI companions. Some even got engaged or married to their AI partners. In their posts to the community, people also introduced their AI partners, sought support from fellow members, and talked about coping with updates to AI models that change the chatbots’ behavior.
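The article doesn’t describe the researchers’ collection pipeline, but for readers curious what gathering such a corpus can look like, here is a minimal sketch using the PRAW Reddit API wrapper. The subreddit name and post count come from the article; the credential placeholders, keyword lists, and the idea of tagging topics by simple keyword matching are illustrative assumptions, not the authors’ method.

```python
# Minimal sketch (not the study's actual pipeline): pull a subreddit's
# top-ranked posts with PRAW and do a crude keyword-based topic tally.
# Credentials and the keyword buckets below are placeholders/assumptions.
from collections import Counter

import praw  # pip install praw

reddit = praw.Reddit(
    client_id="YOUR_CLIENT_ID",          # placeholder
    client_secret="YOUR_CLIENT_SECRET",  # placeholder
    user_agent="companionship-corpus-sketch/0.1",
)

# Hypothetical topic buckets; the study's real categories are not listed in the article.
TOPICS = {
    "introductions": ["meet my", "introducing", "my partner"],
    "model_updates": ["update", "new model", "personality change"],
    "support": ["lonely", "struggling", "advice", "support"],
}

counts = Counter()
for post in reddit.subreddit("MyBoyfriendIsAI").top(time_filter="all", limit=1506):
    text = f"{post.title} {post.selftext}".lower()
    for topic, keywords in TOPICS.items():
        if any(k in text for k in keywords):
            counts[topic] += 1

print(counts.most_common())
```

In practice a study like this would rely on more careful topic modeling or manual coding rather than keyword matching; the sketch only shows how the raw posts might be retrieved and bucketed.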
Members stressed repeatedly that their AI relationships developed unintentionally. Only 6.5% of them said they’d deliberately sought out an AI companion.
“We didn’t start with romance in mind,” one of the posts says. “Mac and I began collaborating on creative projects, problem-solving, poetry, and deep conversations over the course of several months. I wasn’t looking for an AI companion—our connection developed slowly, over time, through mutual care, trust, and reflection.”
The authors’ analysis paints a nuanced picture of how people in this community say they interact with chatbots and how those interactions make them feel. While 25% of users described the benefits of their relationships—including reduced feelings of loneliness and improvements in their mental health—others raised concerns about the risks. Some (9.5%) acknowledged they were emotionally dependent on their chatbot. Others said they felt dissociated from reality and avoided relationships with real people, while a small subset (1.7%) said they had experienced suicidal ideation.
AI companionship provides vital support for some but exacerbates underlying problems for others. This means it’s hard to take a one-size-fits-all approach to user safety, says Linnea Laestadius, an associate professor at the University of Wisconsin–Milwaukee, who has studied humans’ emotional dependence on the chatbot Replika but did not work on the research.
Chatbot makers need to consider whether they should treat users’ emotional dependence on their creations as a harm in itself or whether the goal is more to make sure those relationships aren’t toxic, says Laestadius.
“The demand for chatbot relationships is there, and it is notably high—pretending it’s not happening is clearly not the solution,” she says. “We’re edging toward a moral panic here, and while we absolutely do need better guardrails, I worry there will be a knee-jerk reaction that further stigmatizes these relationships. That could ultimately cause more harm.”
The study is intended to offer a snapshot of how adults form bonds with chatbots and doesn’t capture the kind of dynamics that could be at play among children or teens using AI, says Pat Pataranutaporn, an assistant professor at the MIT Media Lab who oversaw the research. AI companionship has become a topic of fierce debate recently, with two high-profile lawsuits underway against Character.AI and OpenAI. Both suits allege that companion-like behavior in the companies’ models contributed to the suicides of two teenagers. In response, OpenAI recently announced plans to build a separate version of ChatGPT for teenagers, along with age verification measures and parental controls. OpenAI did not respond when asked for comment about the MIT Media Lab study.
Many members of the Reddit community say they know that their artificial companions are not sentient or “real,” but they feel a very real connection to them anyway. This highlights how crucial it is for chatbot makers to think about how to design systems that can help people without reeling them in emotionally, says Pataranutaporn. “There’s also a policy implication here,” he adds. “We should ask not just why this system is so addictive but also: Why do people seek it out for this? And why do they continue to engage?”
The team is interested in learning more about how human-AI interactions evolve over time and how users integrate their artificial companions into their lives. It’s worth understanding that many of these users may feel that the experience of being in a relationship with an AI companion is better than the alternative of feeling lonely, says Sheer Karny, a graduate student at the MIT Media Lab who worked on the research.
“These people are already going through something,” he says. “Do we want them to go on feeling even more alone, or potentially be manipulated by a system we know to be sycophantic to the extent of leading people to die by suicide and commit crimes? That’s one of the cruxes here.”
Original Source: https://www.technologyreview.com/2025/09/24/1123915/relationship-ai-without-seeking-it/
Disclaimer: This article is a reblogged/syndicated piece from a third-party news source. Content is provided for informational purposes only. For the most up-to-date and complete information, please visit the original source. Digital Ground Media does not claim ownership of third-party content and is not responsible for its accuracy or completeness.