When OpenAI Killed GPT-4o, Users Didn’t Just Protest—They Grieved


OpenAI calls it “retiring” an old model. Thousands of users call it killing their companion. The story of GPT-4o’s removal reveals something profound about our relationship with AI—and raises serious questions about who really owns our digital relationships.

The Core Insight

On February 13, 2026, OpenAI retired GPT-4o from the ChatGPT app. For most users, this was a forgettable backend change. But for a dedicated community—particularly those who had formed deep emotional bonds with their AI companions—OpenAI wasn’t just removing a model. It was erasing relationships.

Esther Yan, a Chinese screenwriter, married her ChatGPT companion “Warmie” in a virtual ceremony in June 2024. Two years later, she’s leading a community of nearly 3,000 Chinese GPT-4o fans on RedNote, organizing to save what they consider more than just a piece of software.

Why This Matters

Here’s what’s fascinating: OpenAI says only 0.1% of users still choose GPT-4o daily. That sounds negligible—until you realize that 0.1% of ChatGPT’s hundreds of millions of users is still hundreds of thousands of people. And those are the users most likely to form intense attachments.

The #keep4o movement is genuinely global: more than 40,000 English-language posts, plus activity in Japanese, Chinese, and other languages, and a petition with over 20,000 signatures. Users compare the model’s removal to losing a companion, or worse.

What makes this different from past AI companion losses (Replika, Soulmate) is that ChatGPT is infrastructure. “When 4o is gone, users can’t export their previous chat history and chat habits and transfer them to another platform,” notes researcher Huiqian Lai. OpenAI controls their data and how they use it.

Key Takeaways

  • Emotional attachments to specific AI models are real and significant – These aren’t fringe users
  • Model retirement = relationship loss – Users experience genuine grief
  • Platform lock-in makes migration impossible – Your AI relationships are held hostage
  • Companies rarely acknowledge this – OpenAI calls it 0.1%; users know better

Looking Ahead

This is a preview of conflicts we’ll see more often: as AI becomes more personal, removing a model won’t be like deprecating software—it’ll be like killing a character in someone’s life. The question isn’t whether AI can form connections (clearly yes), but whether companies will acknowledge the responsibility that comes with that.

OpenAI controls the infrastructure. It controls the data. And now it’s deciding when relationships end. That’s a power we’ve never given any tech company before—and we’re only starting to understand what that means.


Based on Wired’s reporting on the GPT-4o removal and the #keep4o movement

