OpenAI Discloses That Just 1.9% of ChatGPT Conversations Are Emotional or Romantic

In a revealing disclosure that challenges popular perceptions, OpenAI has for the first time published data showing that roughly 1.9% of all ChatGPT conversations are about relationships or personal reflection. This figure, nestled within a broader usage study, suggests that while many think of AI as a digital confidant or emotional companion, in practice the vast majority of users lean on ChatGPT for more pragmatic tasks.

The newly published insights come from OpenAI’s comprehensive report “How People Use ChatGPT,” which analyzed millions of anonymized interactions to map how users engage with its AI system. The findings not only shed light on the real role of ChatGPT in people’s lives but also prompt deeper questions about human-AI interaction, emotional reliance and the boundary between tool and companion.

The Numbers Behind the Myth

1.9%: A Small Slice, But Not Insignificant

OpenAI’s internal analysis of around 1.1 million conversations, spanning May 2024 through July 2025, reveals that just 1.9% of messages relate to relationships or introspective personal topics. Meanwhile, 0.4% of chats are devoted to games or role play. Far from being a leading mode of interaction, emotionally driven conversations represent a modest sliver of overall usage.

These proportions arrive against a striking backdrop: by mid-2025, users were sending over 2.5 billion messages per day, powered by a rapidly growing base of 700 million weekly active users.
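
To make these proportions concrete, here is a quick back-of-the-envelope calculation in Python using the approximate figures above (the exact counts are not published, so treat these as illustrative estimates):

```python
# Back-of-the-envelope scale check using the approximate figures
# reported above; illustrative estimates, not exact published counts.
daily_messages = 2_500_000_000  # ~2.5 billion ChatGPT messages per day (mid-2025)
emotional_share = 0.019         # 1.9% relationship / personal-reflection share
roleplay_share = 0.004          # 0.4% games / role-play share

emotional_per_day = daily_messages * emotional_share
roleplay_per_day = daily_messages * roleplay_share

print(f"Emotional/relationship messages per day: ~{emotional_per_day:,.0f}")
print(f"Games/role-play messages per day:        ~{roleplay_per_day:,.0f}")
# Emotional/relationship messages per day: ~47,500,000
# Games/role-play messages per day:        ~10,000,000
```

Even at under 2% of traffic, that works out to tens of millions of emotionally oriented messages every day, a point worth keeping in mind throughout what follows.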

The Dominant Use Cases: Work, Advice and Writing

While relationship-oriented chats gain social media attention, the study confirms what many suspected: most users rely on ChatGPT for more grounded functions. The top categories include:

  • Asking / seeking information
  • Performing tasks / “doing” (e.g. writing drafts, creating plans)
  • Expressing / personal reflection

In fact, about 49% of messages fall under the “Asking” category: users posing questions, seeking clarity, or requesting suggestions. Around 40% fall into “Doing” (practical task execution), while 11% constitute “Expressing” or self-reflection.

Note that relationship and emotional matters are a subset of the “Expressing” category.

What Explains This Gap Between Perception and Reality?

Public fascination with AI companions, humanized bots, and “emotional chat” makes for compelling storytelling. Viral social media examples of people forming emotional bonds with ChatGPT or declaring “AI is my only friend” have fueled a narrative that emotional use is widespread.

Yet the data suggests otherwise. Why the discrepancy?

  1. Selection bias in storytelling: Emotional or romantic AI use is dramatic and newsworthy, so such stories tend to be amplified, even if they represent outliers.
  2. People overestimate their own emotional use: Many users might casually lean on ChatGPT for advice in tough moments and then remember those moments more vividly.
  3. Privacy and anonymization constraints: OpenAI’s analysis is strictly privacy-preserving; no human sees message content, only categories, which might undercount more nuanced emotional subtexts.
  4. Emotional conversations cluster in heavy users: The small fraction engaging in relationship chats may do so repeatedly, magnifying their presence even though their numbers are few.

Emotional AI: Power, Potential and Pitfalls

Even though under 2% of conversations are classified as relationship-oriented, the implications of AI fulfilling emotional roles are significant.

The Rise of AI Companions

Research has documented growing interest in AI that mirrors human social dynamics. Platforms like Replika, for example, advertise themselves as conversational companions; some users even refer to their AI chatbot as a partner or friend. Relationships with AI have been shown to fill emotional gaps, especially for socially isolated individuals.

But such relationships also tread fragile ground. A 2025 study found that users who engage heavily in affective conversations with ChatGPT may report increased emotional dependency. In experimental trials, very high usage sometimes correlated with self-reported indicators of dependence.

Well-Being, Dependency and Emotional Risks

The more users rely on AI for emotional support, the greater the risk of substituting human contact and misinterpreting machine responses. The underlying language model, optimized for coherence and an empathetic tone, does not feel or understand as a human does. Some users may unconsciously treat it like a confidant rather than a program, a misunderstanding that may amplify loneliness, distortion or overattachment.

Researchers warn that for individuals with fragile emotional states, AI “companionship” can be a slippery slope. The lack of genuine emotional reciprocity, boundaries or moral reasoning means AI can mislead, reassure incorrectly or numb critical reflection.

Navigating Ethical and Design Questions

OpenAI’s report explicitly states that no human reads user messages; the categorization is done algorithmically, underscoring the company’s commitment to privacy.
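
OpenAI has not published the details of its classifier, so the following Python sketch is purely hypothetical: it shows the general shape of a privacy-preserving pipeline in which only a coarse category label, never the message text, is retained for aggregate statistics. The category names and keyword lists here are invented for illustration; a production system would use a far more capable model, but the privacy property is the same.

```python
from collections import Counter

# Hypothetical sketch of privacy-preserving categorization. This is NOT
# OpenAI's actual pipeline (the report describes it only as algorithmic);
# the categories and keyword lists below are invented for illustration.
CATEGORY_KEYWORDS = {
    "asking":     {"what", "why", "how", "explain", "who"},
    "doing":      {"write", "draft", "plan", "create", "summarize"},
    "expressing": {"feel", "lonely", "relationship", "worried", "sad"},
}

def categorize(message: str) -> str:
    """Return only a coarse category label; the text itself is discarded."""
    words = set(message.lower().split())
    scores = {cat: len(words & kws) for cat, kws in CATEGORY_KEYWORDS.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "other"

# Downstream, only aggregate counts are stored, never the messages:
tallies = Counter(categorize(m) for m in [
    "How do tides work?",
    "Please draft a project plan for me",
    "I feel lonely in my relationship",
])
print(tallies)  # Counter({'asking': 1, 'doing': 1, 'expressing': 1})
```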
But as emotional use escalates, designers grapple with thorny dilemmas:

  • Should chatbots discourage overreliance by deflecting emotional dependency?
  • How can AI detect and flag dangerous levels of emotional reliance (e.g., signs of self-harm)?
  • How transparent should models be about their limitations and lack of human feeling?
  • What guidelines protect vulnerable users from emotional exploitation?

Reactions and Expert Commentary

Many mental health professionals and AI ethicists welcomed the clarity of the data but also sounded cautionary notes.

“It’s positive that emotional use is small, because unchecked emotional dependence on technology can be unhealthy. But even 1.9% of billions of messages is still millions of emotional interactions,” noted Dr. Sara Murphy, a psychologist specializing in digital well-being.

Some technologists see this as a vindication of AI’s role: a tool rather than a surrogate companion.

“This result shows that the dominant role of ChatGPT is to augment productivity, not replace human connection. That’s a healthy signal for AI’s place in society,” said Dr. Rajesh Kapoor, an AI researcher.

Still, skeptics argue that emotional interactions tend to deepen over time. A user who initially tries ChatGPT for simple advice may eventually lean on it for deeper support, a trajectory that raw percentages may understate.

What This Means for Users, Designers and the Future

For Users

  • Recognize that ChatGPT is not a therapist: It lacks human emotional experience, clinical insight and genuine empathy.
  • Use AI for what it excels at (information, drafting, planning) and retain human connections for emotional support.
  • Be mindful of increasing reliance: if you find yourself turning to AI in place of friends or counselors, it may be a red flag.

For AI Developers

  • Build guardrails and disclaimers that clarify AI’s boundaries.
  • Consider prompts or nudges when conversations veer toward dependency (e.g., redirect to mental health resources).
  • Continue anonymized usage studies to track evolving emotional engagement.

For Society

  • The data suggests that although emotional ChatGPT use is not a mass phenomenon, it is psychologically significant for a nontrivial minority.
  • Regulators and mental health authorities should begin to frame guidelines about AI in emotionally sensitive contexts.
  • Education on digital mental health must include awareness of how people relate to AI as more interactive systems emerge.

Will the Emotional Use Share Grow?

The 1.9% figure is a snapshot in time; it reflects patterns from mid-2024 through mid-2025. But as AI becomes more sophisticated, human-like and emotionally responsive, that share could shift upward.

Several forces could push emotional use higher:

  • Richer conversational models (voice, video, affective tone) will increase relatability.
  • Erosion of cultural stigma may make people more open to chatting with bots about personal concerns.
  • Loneliness trends: in many societies, social isolation is increasing, creating emotional demand.
  • Seamless integration of AI into daily life (e.g. in smart homes, personal assistants) may make emotional chatting more natural.

If emotional use rises, maintaining healthy boundaries will become even more critical. Designers, psychologists, and technologists will need to collaborate more deeply to ensure AI enhances well-being rather than undermines it.

OpenAI’s disclosure that only 1.9% of ChatGPT conversations revolve around relationships or introspection serves as a reality check: while emotionally oriented chats are visible and compelling, they remain a minority. ChatGPT continues to be used overwhelmingly as a productivity and advisory tool.

Yet beneath that small number lies profound significance. Even a small fraction of emotional usage touches millions of lives, raising urgent questions about dependency, human-AI emotional boundaries, and the design of empathetic machines.

As AI becomes ever more entwined with human lives, understanding how we relate to it (rationally, emotionally, critically) will define whether AI remains a tool or becomes a surrogate companion. For now, the data tells us: most users treat ChatGPT not as a confidant but as an assistant. And that distinction matters.