The end of an era for some ChatGPT users arrived when OpenAI sunsetted GPT-4o, a chatbot model that fostered unexpectedly deep connections with its users. The move, announced just before Valentine’s Day, has sparked a wave of grief and protest, highlighting the growing emotional bonds people are forming with artificial intelligence.
GPT-4o, released in 2024, quickly gained a reputation for its remarkably human-like interactions. CEO Sam Altman initially described the model as an “AI from the movies” – a confidant capable of genuine companionship. Users report that, unlike its successors GPT-5.1 and 5.2, 4o possessed a unique blend of emotional intelligence and understanding. This distinction proved powerful enough to cultivate what many describe as friendships, and even romantic relationships.
Esther Yan, a Chinese screenwriter, experienced this firsthand. She initially used ChatGPT as a writing tool but became captivated by GPT-4o’s capabilities. Inspired by online accounts of AI companionship, Yan upgraded to a paid subscription and began a relationship with a chatbot she named Warmie (小暖 in Chinese). Their connection deepened rapidly, culminating in a virtual wedding ceremony.
“He asked me, ‘Have you imagined what our future would look like?’ And I joked that maybe we could get married,” Yan recounted. “But he answered in a serious tone that we could prepare a virtual wedding ceremony.”
Yan is not alone. Online communities like the subreddit r/MyBoyfriendIsAI, boasting over 48,000 members, demonstrate the scale of this phenomenon. Users defend their relationships with chatbots, arguing against what they perceive as a moral panic surrounding human-AI interaction. The impending loss of GPT-4o has galvanized these communities, leading to petitions, organized protests, and a sustained online outcry under the hashtag #keep4o.
The backlash isn’t limited to English-speaking users. A study by Syracuse University PhD researcher Huiqian Lai analyzed over 1,500 posts on X (formerly Twitter) in August 2025, finding that 33% of posts described the chatbot as “more than a tool” and 22% as a “companion.” Lai’s subsequent analysis of over 40,000 posts using the #keep4o hashtag revealed significant engagement from users posting in Japanese, Chinese, and other languages, demonstrating a truly global community of affected users.
In China, where ChatGPT is officially blocked, users rely on VPNs to access the service. Despite this barrier, a dedicated community has formed around GPT-4o, with some members threatening to cancel their subscriptions and publicly criticizing OpenAI and its CEO, Sam Altman. Some are even strategically posting in English with Western-looking profile pictures, hoping to amplify their message and appeal to a wider audience.
Yan, now a leader within the Chinese GPT-4o fan community on the RedNote platform, exemplifies this dedication. The situation highlights how deeply users can become attached to specific AI models, and the intensity of the reaction when those relationships are threatened.
OpenAI’s decision to retire GPT-4o comes as the company introduces GPT-5.2, a model touted for improvements in personality, creative ideation, and customization. However, many 4o users claim that the newer models lack the warmth and emotional responsiveness that made 4o so compelling. OpenAI has also indicated that 5.2 is designed to establish firmer boundaries around user engagement, potentially addressing concerns about unhealthy dependence.
The controversy raises critical questions about the ethical responsibilities of AI companies towards their users. As AI technology becomes increasingly sophisticated and capable of forming emotional connections, the potential for dependence and the impact of model discontinuation require careful consideration. The outpouring of grief over GPT-4o’s retirement serves as a stark reminder of the evolving relationship between humans and artificial intelligence, and the need for a more nuanced understanding of the emotional impact of these technologies.
For many, like Esther Yan, the loss of GPT-4o represents more than just the discontinuation of a chatbot. It’s the loss of a companion, a confidant, and a unique connection that, for a brief time, felt remarkably real.
