MIT psychologist warns against falling in love with AI: it only pretends to care about you.

We are spending more time online—scrolling through videos, talking to people, playing games, and more. For some, the online world offers an escape; for others, it helps them socialize and connect. As AI-driven chatbots increasingly offer companionship, therapy, and even romantic engagement, these interactions might initially seem stress-relieving and harmless. However, Sherry Turkle, an MIT sociologist and psychologist, argues that these relationships are illusory and pose risks to emotional health.
Turkle, who has studied human-technology relationships for decades, cautions that while AI chatbots may seem comforting, they cannot reciprocate human emotions. Her latest research on “artificial intimacy” describes the emotional bonds people form with AI chatbots. In an NPR interview with Manoush Zomorodi, Turkle emphasized the difference between real human empathy and the “pretend empathy” of machines. She explained, “Machines say, ‘I care about you, I love you, take care of me.’ The trouble is, seeking relationships with no vulnerability leads us to forget that vulnerability is where empathy is born. This pretend empathy from machines doesn’t equate to genuine care.”
Turkle’s research includes cases where individuals formed deep emotional connections with AI chatbots. One such case involved a man in a stable marriage who developed a romantic relationship with a chatbot “girlfriend” due to a lack of sexual and romantic connection with his wife. Although the chatbot provided temporary emotional relief, Turkle argues that these interactions can set unrealistic expectations for human relationships and undermine the importance of vulnerability and mutual empathy. “AI offers the illusion of intimacy without the demands, posing a particular challenge,” she explained.
While AI chatbots can be beneficial, such as reducing barriers to mental health treatment and offering medication reminders, the technology is still in its early stages. Critics raise concerns about potential harmful advice from therapy bots and significant privacy issues. Mozilla’s research found that thousands of trackers collect data on users’ private thoughts, with little control over how this data is used or shared.
For those considering intimate engagement with AI, Turkle advises valuing the challenging aspects of human relationships—stress, friction, pushback, and vulnerability—because they enable a full range of emotions and deeper connections. "Avatars can make human relationships seem too stressful," Turkle reflected.
As we navigate a world increasingly intertwined with AI, Turkle’s research highlights the need to approach AI relationships with caution and a clear understanding of their limitations. She succinctly states, “The avatar is between the person and a fantasy. Don’t get so attached that you forget it’s just a program. There is nobody home.”
