Lonely People Are Even Sadder After Using Chatbots, Research Finds



OpenAI and the MIT Media Lab last week released two new studies aimed at exploring the effect of AI chatbots on loneliness. The results are complicated, but they also line up with what we now know about social media: Chatbots can make people lonely, but the people who reported feeling more alone after heavy use of an AI tended to feel pretty alone before they started.

To do the studies, OpenAI turned over almost 40 million interactions its users had with ChatGPT to researchers at MIT. In the first study, MIT looked at the “aggregate usage” of around 6,000 “heavy users of ChatGPT’s Advanced Voice Mode over 3 months” and surveyed 4,076 specific users to understand how the chatbot made them feel. In the second study, the researchers looked at how 981 participants interacted with ChatGPT over the course of 28 days.

The papers are in-depth and complicated and worth a close read. One of the big takeaways is that people who used the chatbots casually and didn’t engage with them emotionally didn’t report feeling lonelier at the end of the study. Yet, if a user said they were lonely before they started the study, they felt worse after it was over.

“Overall, higher daily usage—across all modalities and conversation types—correlated with higher loneliness, dependence, and problematic use, and lower socialization,” the study said.

Different kinds of interaction produced different results. Lonely users using a voice-based chatbot rather than a text-based one tended to fare worse. “Results showed that while voice-based chatbots initially appeared beneficial in mitigating loneliness and dependence compared with text-based chatbots, these advantages diminished at high usage levels, especially with a neutral-voice chatbot,” the study said.

[Chart: chatbot usage outcomes. © MIT Media Lab.]

The researchers were clear-eyed about the results and compared the findings to previous studies on social media addiction and problem gaming. “The relationship between loneliness and social media use often becomes cyclical: the lonelier people are, the more time they spend on these platforms where they compare themselves with others and experience the fear of missing out, leading to more loneliness and subsequent usage,” the MIT team wrote in their paper. “Loneliness is both a cause and effect of problematic internet use.”

The researchers stressed that this first study, which looked at a large sample and relied on a lot of self-reported data, lacked a control group. It also didn’t take into account external factors like the weather and seasonal changes, two things that can have a massive impact on mood. Research into human emotional dependence on chatbots and its consequences is in its early days.

The researchers said that companies working on AI needed to study the guardrails in their services that would help mitigate the risks of exacerbating loneliness. They also said that the more a person understood about how AI systems work, the less likely they were to become dependent on them. “From a broader perspective, there is a need for a more holistic approach to AI literacy,” the study said. “Current AI literacy efforts predominantly focus on technical concepts, whereas they should also incorporate psychosocial dimensions.”

The final sentence of the first study’s “impact” section cut to the heart of the problem. “Excessive use of AI chatbots is not merely a technological issue but a societal problem, necessitating efforts to reduce loneliness and promote healthier human connections.”

The loneliness epidemic is real and complex. People are lonely for a lot of different reasons. Third places like malls, bars, and coffee shops are vanishing or becoming too expensive to use. People have migrated a lot of social interaction to the internet. Living in vast suburbs and driving on a highway to get everywhere cuts people off from each other. AI didn’t do any of this, but it could make it worse.

OpenAI partnered with MIT to conduct these studies, which signals a willingness to engage with the problem. What worries me is that every business invariably pursues its bottom line. In these studies I see not just an open-hearted discussion about the dangers of a new technology, but also a report telling people with a financial interest in acquiring new users that their product can be addictive to a certain kind of person.

This is already happening. In 2023, a Belgian man died by suicide after a long “relationship” with a chatbot based on GPT-4. The man had a history of depression, and his wife blamed the bot. Last year, a mother filed a lawsuit against Character.AI after her son took his life while chatting with the bot. Her 93-page court filing is a harrowing look into how Character.AI draws users in and attempts to establish an emotional connection with them.

There is a market for AI companions. They can provide an ersatz connection to the lonely. But they can also induce that feeling of loneliness. The bots are also programmed by the people selling their services. They are complex machines, but they’re still machines, and they reflect the will of their programmer, not the user.

Many of these services, such as Replika, Character.AI, and ChatGPT, charge a recurring monthly fee for access to their best features. If, as these studies suggest, lonely people can become addicted to using the chatbots, then there’s a financial incentive to keep people lonely.

“While improving AI policy and establishing guardrails remain crucial, the broader issue lies in ensuring people have strong social support systems in real life. The increasing prevalence of loneliness suggests that focusing solely on technical solutions is insufficient, as human needs are inherently complex,” the first study said in its conclusion. “Addressing the psychosocial dimensions of AI use requires a holistic approach that integrates technological safeguards with broader societal interventions aimed at fostering meaningful human connections.”
