
In 2025, we're already relying on artificial intelligence more than ever. While there's no denying the benefits of the likes of OpenAI's ChatGPT, there are obvious fears about whether these chatbots could one day pose a danger to the human race.
Beyond artificial intelligence continuing to come for our jobs, more serious concerns suggest it could wipe us off the face of the planet in just two years. The so-called 'Godfathers of AI' have already warned us about AI being used for nefarious means in the next American election, while there are also red flags about it exhibiting dangerous behavior, including "tendencies toward self-preservation and deception."
Still, there's been a massive uptick in people turning to AI in their hour of need. Whether it be leaning on it for emotional support or taking it on as a romantic partner, one AI expert has highlighted why we shouldn't trust it with our darkest secrets.
Speaking to Mail Online, Oxford University computer science professor Mike Wooldridge said it would be 'very dangerous' to trust AI.
OpenAI's Sam Altman recently hyped a 'killer' feature where ChatGPT develops its own memory of your previous chats, but not everyone might be happy with that. After we covered how AI admitted it could blackmail us if threatened, this could be just the start of the problems when spilling all to your BFF chatbot.
Warning that anything you tell ChatGPT will be stored to train future versions of itself, Wooldridge explained how it only "tells you what you want to hear." Pointing out further issues, he added: "It has no empathy. It has no sympathy. That's absolutely not what the technology is doing and crucially, it's never experienced anything.
"The technology is basically designed to try to tell you what you want to hear – that's literally all it's doing."
Humans are wired to look for consciousness, but Wooldridge says it's something we attribute 'far too often'.
Wooldridge continued: "You should assume that anything you type into ChatGPT is just going to be fed directly into future versions of ChatGPT."
He added that it would also be "extremely unwise to start having personal conversations or complaining about your relationship with your boss, or expressing your political opinions."
Because of how AI models work, Wooldridge claims it's almost impossible to get your data back once it's been typed into ChatGPT. This means you'll want to think twice before slagging off a significant other or making your true feelings about the current political climate known.
OpenAI had to deal with an issue earlier in 2025 when a ChatGPT bug meant some people were able to see parts of other users' chat histories.
Still, OpenAI promises that histories are only kept for 30 days and aren't used to train ChatGPT.