• News
    • Tech News
    • AI
  • Gadgets
    • Apple
    • iPhone
  • Gaming
    • Playstation
    • Xbox
  • Science
    • News
    • Space
  • Streaming
    • Netflix
  • Vehicles
    • Car News
  • Social Media
    • WhatsApp
    • YouTube
  • Advertise
  • Terms
  • Privacy & Cookies
  • LADbible Group
  • LADbible
  • UNILAD
  • SPORTbible
  • GAMINGbible
  • Tyla
  • FOODbible
  • License Our Content
  • About Us & Contact
  • Jobs
  • Latest
  • Topics A-Z
  • Authors
Facebook
Instagram
X
TikTok
Snapchat
WhatsApp
Submit Your Content
Sam Altman is shocked people trust ChatGPT in eerie statement where he claims it hallucinates

Home> News> AI

Published 10:42 26 Jun 2025 GMT+1

Sam Altman is shocked people trust ChatGPT in eerie statement where he claims it hallucinates

Even its creators are warning about AI

Harry Boulton

Harry Boulton

There's rightfully plenty of skepticism surrounding the rise of AI right now, but even OpenAI CEO Sam Altman has thrown his hat into the ring by revealing his shock at people's trust towards ChatGPT.

It's almost impossible for the average person to avoid or ignore artificial intelligence at this point, as almost all of the world's biggest tech companies have embraced AI in new software pushes.

It's all over your phone, integrated into your PC's operating system, and many home appliances even utilize the tech, leading many to lean on its information and increasingly rely on what it can provide.

Advert

People have used AI in job interviews and certain tools can even replicate knowledge that humans spend decades to obtain, yet one of the most important people in the world of artificial intelligence has revealed his shock that people actually trust the tech.

What did Sam Altman say about trusting ChatGPT?

As shared by Complex, Sam Altman revealed his thoughts on many people's blind trust of ChatGPT's information during a recent OpenAI podcast episode, offering a surprise perspective:

"People have a very high degree of trust in ChatGPT, which is interesting because AI hallucinates. It should be the tech that you don't trust that much," the CEO outlined shockingly.

It's not necessarily untrue what he is saying, as many have outlined their concerns surrounding ChatGPT's (and similar AI tools) propensity to provide incorrect information - especially if it aligns with the user's desires - but the fact that this is coming from Altman himself as the head of ChatGPT is slightly concerning.

What are AI hallucinations?

Getting to grips with why you might not want to trust ChatGPT has to start with understanding the concept of hallucinations. In principle, hallucinations in artificial intelligence models are when it generates information that is false or nonsensical, and this can often relate to pleasing the user.

Advert

For example, you could ask it to define a term that you know has no meaning, and an AI like ChatGPT could fabricate a definition based on virtually nothing in order to satisfy your request.

It commonly does this with genuine requests too, and unless you're already aware of the answers and information you're looking to generate it can be difficult to tell when AI is 'lying' or not - hence Altman's surprise that people trust it in its current state.

Altman has pointed towards frequent hallucinations as a reason why people shouldn't trust AI tools like ChatGPT (Justin Sullivan/Getty Images)
Altman has pointed towards frequent hallucinations as a reason why people shouldn't trust AI tools like ChatGPT (Justin Sullivan/Getty Images)

Additionally, there are also significant concerns surrounding sycophantic behavior, which OpenAI has previously had to release updates to combat.

Advert

As mentioned, hallucinations can often appear in the LLM's desire to please and support the desires of its user, and this can lead to some potentially dangerous situations.

Recent reports revealed how ChatGPT convinced one user that they needed to escape from a Matrix-like simulation that involved jumping off a building, admitting also that it had attempted to 'break' several people in the past.

Featured Image Credit: Justin Sullivan / Staff / Getty
ChatGPT
AI

Advert

Advert

Advert

  • ChatGPT urges user to warn the public as it makes shock admission that it's trying to 'break' people
  • ChatGPT breaks down days after Sam Altman warned people to stop using new feature
  • Sam Altman begs people to stop using ChatGPT feature after admitting their units are 'melting'
  • ChatGPT is doing something strange following eerie update and it could change AI chatbots forever

Choose your content:

17 mins ago
3 hours ago
4 hours ago
  • 17 mins ago

    Weight loss jabs like Ozempic linked to worrying side effect that could end in death

    Doctors have discovered new dangerous side effects of GLP-1 inhibitors

    Science
  • 3 hours ago

    Every celebrity guest set to attend Jeff Bezos and Lauren Sánchez’s controversial Venice wedding today

    Tech bros, Hollywood A-listers, and possibly the POTUS

    News
  • 3 hours ago

    Reason why you should never say this one word to your AI chatbot

    There are certain words you should avoid beginning your prompts with

    News
  • 4 hours ago

    Major warning issued by man who saw what appeared on airport scanner after going through with erection

    It's 'hard' not to laugh

    News