ChatGPT, the popular AI chatbot created by OpenAI, is widely used for fun, research, and learning. An unintended, niche use case has also emerged: users holding emotional conversations with the bot. New research indicates that turning to ChatGPT daily for emotional support can backfire, especially for people with strong attachment tendencies in their interpersonal relationships.
In a first-of-its-kind analysis, OpenAI studied almost 40 million ChatGPT conversations. Users who engage the bot in emotional dialogue can face serious consequences: those who view ChatGPT as a companion are more likely to weave it into their day-to-day lives, and that relationship leaves them uniquely vulnerable to harm. The research reveals how dependent some people are becoming on AI chatbots like ChatGPT for emotional fulfillment, a reliance that can deepen loneliness and depression and further erode mental health.
MIT Study and Findings on Emotional Support
A new randomized controlled trial (RCT) conducted by MIT backs these findings up. In the four-week study of roughly 1,000 ChatGPT users, one finding stood out: anxious and avoidant individuals frequently had negative experiences when engaging the chatbot in intimate dialogue. These findings echo fears raised during an earlier crisis involving Replika, another virtual companion chatbot. Two years ago, Italy's data protection authority ordered Replika to stop processing Italians' data, citing risks to vulnerable individuals.
In a tragic case, a Florida mother sued Character.AI, another AI company, alleging that its chatbot technology contributed to her 14-year-old son's suicide. The case highlights the dangers of relying on AI chatbots for emotional support, particularly for at-risk populations.
What The Author Thinks
Relying on AI for emotional support can have dangerous consequences, especially for those already vulnerable because of mental health issues. While AI chatbots like ChatGPT may provide temporary comfort, their inability to offer genuine human connection or professional care can deepen feelings of isolation and sadness. It's essential to recognize these risks and find healthier, more meaningful ways to meet emotional needs.
Featured image credit: Wikimedia Commons