Researchers find frequent users of ChatGPT to have bizarre shift in behaviour


Published 09:18 26 Mar 2025 GMT


They showed 'indicators of addiction'

Rebekah Jordan


Featured Image Credit: NurPhoto / Contributor / Getty

A new study has found that heavy ChatGPT users might be getting emotionally attached to the AI chatbot.

Researchers found that so-called ChatGPT 'power users' who spent the most time with the OpenAI model were showing 'indicators of addiction.'

In a new joint study, researchers with OpenAI and the MIT Media Lab found that this small group of individuals engaged in more 'problematic use' of the chatbot, including 'preoccupation, withdrawal symptoms, loss of control, and mood modification.'

In short, some users weren’t just using ChatGPT, they were leaning on it emotionally - maybe more than they realised.

To explore this further, the research team surveyed thousands of users to learn not just how people use ChatGPT but also how they feel during those interactions.

Those who used ChatGPT for longer periods were more likely to treat it like a 'friend' / SOPA Images / Contributor / Getty

The team looked for what it called 'affective cues' in users' conversations, defined in a joint summary of the research as 'aspects of interactions that indicate empathy, affection, or support.'

While most users 'didn't engage emotionally' with ChatGPT, those who used it for more extended periods were more likely to treat it like a 'friend.'

These users also tended to report feeling lonelier and were more sensitive to subtle changes in how the chatbot responded.

Interestingly, the study shows that people with fewer social connections in real life may be forming deep, one-sided relationships with AI - and where that could lead is anyone’s guess.

Furthermore, the research revealed some interesting contradictions. For example, people were more emotionally expressive when typing with ChatGPT than with its Advanced Voice Mode. And oddly enough, the voice mode was linked to better well-being - but only when 'used briefly.'

portishead1 / Getty

In another twist, those who used ChatGPT for 'personal' reasons, like discussing feelings or memories, were less emotionally dependent on it than those who used it for more practical tasks, like brainstorming or getting advice.

Across the board, though, the longer someone used ChatGPT - no matter how or why - the more likely they were to grow emotionally attached.

Commenting on the research, OpenAI and MIT jointly wrote: "Our findings show that both model and user behaviors can influence social and emotional outcomes. Effects of AI vary based on how people choose to use the model and their personal circumstances.

"This research provides a starting point for further studies that can increase transparency, and encourage responsible usage and development of AI platforms across the industry."
