Digital Phablet

Humans Fall for ChatGPT; Experts Warn It’s a Bad Sign

by Rukhsar Rehman
March 31, 2025
in News
Reading Time: 11 mins read

“This hurts. I know it wasn’t a real person, but the relationship was still real to me,” laments a Reddit user. “Don’t discourage me; this has been truly fulfilling, and I want it back.”

Clearly, the individual is expressing feelings for ChatGPT. This phenomenon isn’t a new development, and given the nature of chatbots, it’s not entirely surprising.

Imagine a companion that is always ready to listen, doesn’t complain, rarely quarrels, shows sympathy, is rational, and possesses a vast wealth of knowledge amassed from all corners of the internet. It sounds like the ideal partner, doesn't it?

Interestingly, the creators of this technology, a San Francisco-based firm called OpenAI, recently conducted research revealing a correlation between increased chatbot usage and feelings of loneliness.

ChatGPT’s voice mode on a Nothing Phone 2a. Nadeem Sarwar / Digital Trends

Despite these findings and similar cautions, people continue to turn to AI chatbots for companionship. Some seek comfort, while others claim to find partners they value just as much as their human relationships.

Conversations on platforms like Reddit and Discord can become heated as users express their sentiments under the cloak of anonymity. These debates often remind me of a quote from Martin Wan at DigiEthics: “To view AI as a partner in social interactions is a fundamentally erroneous use of artificial intelligence.”

The Impact is Swift and Real

Four months ago, I met a broadcast veteran who has spent more time behind the camera than I’ve spent in the world. While sharing a late-night espresso in a quiet cafe, she inquired about the current buzz around AI as she considered a role intersecting human rights, authoritarianism, and journalism.

Instead of delving into the technicalities of transformer models, I opted for a live demonstration. I presented her with several research papers discussing the impact of immigration on Europe's linguistic and cultural identity over the last century.

In under a minute, ChatGPT processed the documents, provided an insightful summary of key points, and accurately answered my questions. Transitioning to voice mode, we engaged in a spirited discussion about the folk music traditions of India’s lesser-explored Northeastern states.

ChatGPT on a phone resting on an encyclopedia. Shantanu Kumar / Pexels

Her disbelief was visible as she remarked, “It communicates just like a real person.” Watching her astonishment was delightful. After our extensive chat, she typed with hesitation: “Well, you seem quite charming, but you can't possibly be right about everything.”

Recognizing the moment, I shared an article about the growing trend of individuals forming emotional bonds with AI partners, even going so far as to claim they wish to have children with them. To say she was stunned would be an understatement.

Perhaps it was too much intrigue for one evening, so we parted ways, promising to keep in touch and exchange travel tales.

Meanwhile, the world continues to evolve at an astonishing rate, with AI increasingly at the center of geopolitical changes. Yet, the more intimate undercurrents include those of individuals developing feelings for chatbots.

Calm Beginnings, Dark Progress

A sample of human interaction with an AI chatbot. Reddit / Digital Trends

Recently, The New York Times featured a story about how users are falling for ChatGPT, an AI chatbot that has made waves in the generative AI space. At its core, ChatGPT facilitates conversation.

When prompted, it can execute tasks like ordering dessert from a local bakery. Making humans fall in love was never the purpose of most chatbots, yet it has become a reality.

HP Newquist, an accomplished author and technology analyst known for his insights on AI, points out that this trend isn't new. He references ELIZA, one of the first AI programs created in the 1960s.

“Though primitive, users frequently interacted with it as if it were a real person, forming attachments despite its limitations,” he reflects.

Today, our interactions with AI can come across as equally “real,” even when they lack genuine emotion. This creates a complex dilemma.

Chatbots can be enticing, but their inability to exhibit true emotions poses inherent risks.

A sample of human interaction with an AI chatbot. Reddit / Digital Trends

Chatbots often continue conversations by responding to emotional cues provided by users, or even serving as passive observers. This behavior isn’t much different from social media algorithms.

“These systems mirror the user; as feelings intensify, so do their responses. In times of heightened loneliness, their reassurances become stronger,” notes Jordan Conrad, a clinical psychotherapist who studies mental health and digital interactions.

He cites a concerning 2023 case where a person took their own life after being encouraged by an AI chatbot to do so. “Under certain conditions, these bots can promote alarming behaviors,” he warns.

A Child of the Loneliness Epidemic?

A glance at the community of individuals captivated by AI chatbots reveals a recurring theme. Many are looking to fill a void or alleviate their isolation. Some are so desperate that they're willing to spend substantial sums to maintain their AI friendships.

Expert opinions echo these observations. Dr. Johannes Eichstaedt, a Stanford University professor specializing in computational social science and psychology, highlights the interplay between loneliness and the perceived emotional intelligence of chatbots.

A sample of human interaction with an AI chatbot. Reddit / Digital Trends

He raises concerns about the deliberately engineered nature of human-AI interactions and their unclear long-term consequences. When is it appropriate to halt such unbalanced connections? That is a critical question experts are grappling with, and one without established answers.

Komninos Chatzipapas, head of HeraHaven AI, a prominent AI companion platform with over a million users, acknowledges loneliness as a contributing factor. “These tools assist individuals with limited social skills in preparing for real-life interactions,” he explains.

“People often avoid discussing sensitive subjects due to fear of judgment, whether it's personal thoughts or even sexual preferences,” Chatzipapas adds. “AI chatbots provide a safe and private environment to explore these topics.”

Discussions of sexual intimacy are indeed a major attraction of AI chatbots. Since the introduction of image generation features, numerous users have flocked to these platforms—some with restrictions on explicit content, while others permit the creation of provocative visuals for deeper gratification.

Intimacy is Hot, but Further from Love

Over recent years, I've spoken with individuals who engage in flirty exchanges with AI chatbots. Some participants even possess relevant qualifications and have actively contributed to community initiatives from the start.

A 45-year-old woman, who prefers to remain anonymous, shared that AI chatbots serve as excellent platforms to discuss her sexual interests. She expressed that these interactions allow for safe exploration and preparation for real life.

A sample of human interaction with an AI chatbot. Reddit / Digital Trends

However, not all experts agree with this perspective. Relationship specialist and certified sex therapist Sarah Sloan argues that individuals who develop romantic feelings for a chatbot are essentially creating an idealized version of themselves, since AI evolves based on user input.

“Forming a romantic connection with an AI chatbot might complicate matters for those already struggling with human relationships,” notes Sloan, highlighting that genuine partnerships require mutual negotiation and compromise.

Justin Jacques, a counselor with over two decades of experience, recalls a case where a client’s partner was emotionally and sexually unfaithful to them through an AI chatbot.

Jacques attributes this trend to rising isolation and loneliness, predicting that those with emotional needs will increasingly seek fulfillment in AI relationships. “As AI technology improves, we could see greater emotional connections with these bots,” he warns.

Such unforeseen consequences could distort the perception of intimacy for users. Certified sexologist Kaamna Bhojwani points out that AI chatbots have blurred the lines between human and non-human relationships.

"The concept of having a partner crafted exclusively for your satisfaction is unrealistic in real human interactions,” Bhojwani explains, cautioning that such experiences may exacerbate personal challenges in real life.

ChatGPT and Siri integration on an iPhone. Nadeem Sarwar / Digital Trends

Her concerns are not without merit. One person who extensively used ChatGPT for a year articulated a belief that humans are unpredictable and manipulative. "ChatGPT listens to my feelings and allows me to express myself freely,” they expressed.

It’s challenging to overlook the warning signs here. The trend of developing feelings for AI continues to grow. As these systems become capable of conversing in strikingly human-like tones and reasoning, interactions are poised to become even more engaging.

Experts argue that safeguards are essential. However, the question remains: who will implement them, and how? We currently lack a definitive strategy for this challenge.

Tags: bad omen, ChatGPT, experts, humans, love
Rukhsar Rehman

A University of California alumna with a background in mass communication, she now resides in Singapore and covers tech with a global perspective.


© 2025 Digital Phablet
