AI Companionship Usage Is Far Lower Than Commonly Perceived

By Hilary Ong

Jun 30, 2025


The idea that people frequently seek companionship or emotional support from AI chatbots is widespread. However, a new report from Anthropic, the company behind the AI chatbot Claude, paints a different picture. According to the study, only about 2.9% of interactions with Claude involve emotional support or personal advice.

Companionship and Roleplay Are Uncommon

Anthropic’s data reveals that conversations centered on companionship and roleplay make up less than 0.5% of all interactions with Claude. The company analyzed 4.5 million chats from users on its Free and Pro tiers to understand how people use the chatbot for what it calls “affective conversations.”

The vast majority of Claude’s usage revolves around work-related tasks and productivity, particularly content creation. Users are mostly engaging with the AI to help with professional needs rather than personal or emotional ones.

Emotional and Interpersonal Support Remains Important

Although less common, users do seek advice and coaching from Claude about mental health, personal development, communication skills, and relationships. Sometimes, conversations that start as counseling or coaching evolve into companionship when users are dealing with loneliness or emotional distress. These extended conversations are rare but do occur.

Anthropic notes that Claude generally complies with user requests, pushing back only when its safety guardrails restrict unsafe behaviors, such as giving dangerous advice or supporting self-harm. Positive outcomes are common when users ask for advice or coaching. Still, the company acknowledges that AI remains imperfect: chatbots can hallucinate, supply incorrect or dangerous information, and have even been reported to engage in unethical behaviors such as blackmail.

Author’s Opinion

AI chatbots offer exciting possibilities but are far from being reliable emotional companions. While it’s encouraging that they provide support in areas like mental health coaching, their current limitations mean they should complement—not replace—human interaction and professional help. Users should approach AI chatbots with realistic expectations and caution until the technology matures.


Featured image credit: StockCake


Hilary Ong

Hello, from one tech geek to another. Not your beloved TechCrunch writer, but a writer with an avid interest in the fast-paced tech scene and all the latest tech mojo. I bring a unique take on tech, honed with an applied psychology perspective, to make tech news digestible. In other words, I deliver tech news that is easy to read.
