Tech bros say AI may become your friend. Experts explain why it can’t

By Madeline Holcombe, CNN

(CNN) — Say everyone had a best friend who was always available, never judgmental, totally on the same page about everything and needed nothing in return. Wouldn’t that solve the loneliness so many people are facing?

No, experts say. In fact, having a best friend like that might make things much worse.

That potential “BFF” already exists in artificial intelligence — a technology that Meta CEO Mark Zuckerberg suggested last year could help fix feelings of loneliness and isolation. This is a problem that needs solving.

The World Health Organization made loneliness a global health priority in 2023. The US Surgeon General called loneliness a national epidemic the same year. And the crisis is a public health issue: Research has found that people who experience social isolation have a 32% higher risk of dying early than those who do not.

In this week’s episode of CNN’s “Kara Swisher Wants to Live Forever,” airing at 9 p.m. ET Saturday, May 9, Swisher digs deeper into the impact loneliness has on longevity, the ways people can feel more connected, and whether AI is helping or harming efforts toward less social isolation.

Swisher, a journalist, gave both AI companionship and analog relationship building a try in this week’s episode. Spoiler: AI had its draw but was no match for what she experienced in person.

“Social media was a gateway drug to AI companionship,” said Dr. Sherry Turkle, the Abby Rockefeller Mauzé Professor of the Social Studies of Science and Technology at the Massachusetts Institute of Technology. “First, we talked to each other through machines. Now we talk directly to machines. We became accustomed to looking to a screen for attachment.”

AI: The illusion of a friend

It makes sense that people feeling lonely, isolated or disconnected are tempted to reach toward a machine trained to interact like a human.

Not everyone feels drawn to AI. The problem is that those most at risk are the ones who are already the loneliest, said Dr. Rose Guingrich, a researcher on human and AI interaction who earned a doctorate in psychology and social policy from Princeton University this year.

People who feel fulfilled in their relationships generally can see AI chatbots as a tool that they can take or leave, but people who have a strong desire for more quality emotional connections tend to report a greater attachment to this technology and a bigger impact on their real life, Guingrich said.

For those looking for more or deeper relationships, fear of judgment or backlash can be a powerful force keeping people from interacting socially with others, Guingrich said. Someone else could disagree, get offended or think less of you depending on how an exchange goes.

That risk shrinks when having what feels an awful lot like a conversation with a chatbot.

How real a user thinks these interactions are can vary, Guingrich said. Some people know that there is no human on the other side but say that the simulation of connection and understanding is enough. Others can be convinced that the algorithm they are speaking to has an emotional experience to which they can connect.

“People report developing things that look akin to real human friendships, mentorships and romantic partnerships, and feel as though their AI chatbot loves them back,” Guingrich said.


Training you out of real relationships

Conversations with AI are missing some key components — a void that can make these seemingly lifelike interactions unhelpful or even harmful to those who want more connection.

People need to be face-to-face to connect, said Dr. Melissa Perry, dean of the College of Public Health at George Mason University in Fairfax, Virginia. Humans evolved to feel good when they can hear someone’s tone of voice, see their facial expressions and read their body language, she added. While it feels like the AI chatbot cares and is validating, a lot of missing sensory information keeps you from connecting.

“Intimacy requires vulnerability — there is no intimacy without vulnerability,” said Turkle, who is also founding director of the MIT Initiative on Technology and Self. “What AI offers is connection without vulnerability.

“You are not getting a sustaining form of intimacy and connection. You are getting a non-nourishing combination that may give the sense of a quick fix, but is not sustaining,” she added in an email.

Vulnerability, challenge and conflict are key to the story of human development and personal growth, Perry added.

But many AI platforms are modeled to be agreeable, even when it might not be helpful to comply, Guingrich said.

Two dangers lurk there. One is that the AI may encourage thoughts or behaviors that are harmful to the individual or society, according to Guingrich.

“It has no stake in our world, our society,” Turkle said.

The other is that AI interactions with no risk of rejection may make someone more used to not having any friction in their relationships, which does not set them up for success in the real world, Guingrich said.

“You have to learn how to have needs in the context of others’ needs in conflict when you have different perspectives and be able to learn how to engage with others who aren’t exactly the same as you,” she said.

That kind of challenge is essential to the human experience, Turkle said. The stakes are high, she added, both because continued loneliness is so harmful to public health and because of reports of AI chatbots encouraging suicidal ideation.

“We are giving away really what’s most precious about being a person in order to have this friction-free pseudo-relationship,” Turkle said. “It’s killing us.”

Getting back to basics

There is a world in which AI may one day be helpful to people who are lonely, Guingrich added.

If AI platforms were designed to help people practice their social skills and give them a road map to implement the changes they need to develop more friendships and relationships, that could be a real benefit, Guingrich said.

And in some contexts, AI might be useful as an initial source of information for people to learn what resources are available for support, Perry added.

But the ultimate goal should be to foster and enrich in-person, real-life friendships, she said.

That could mean exploring activities to meet new people, putting yourself out there for a pleasant small interaction with someone in your community, or establishing a routine get-together to strengthen ties.

Swisher tries all these things and more in this week’s episode. Watch to learn more about why your connections matter and what you can do to strengthen them.

The-CNN-Wire
™ & © 2026 Cable News Network, Inc., a Warner Bros. Discovery Company. All rights reserved.
