


Remember when we all used to think AI was just a bit of fun where we could turn ourselves into Studio Ghibli characters, and the only real fear was that students were using it to cheat on their essays?
We've seen AI come on in leaps and bounds in the past few years, with OpenAI's introduction of ChatGPT in November 2022 being seen as something of an 'inventing the wheel' moment for the industry.
Trust in where AI is heading has slowly eroded: if it's not fears about humans dating machines, it's warnings that AI could take our jobs, trigger World War III, or even be the last thing the human race ever invents.
While the latter few are obvious doomsday scenarios, there's a more immediate fear that your chat logs could land you behind bars. Back in 2025, we covered how OpenAI's Sam Altman warned that asking ChatGPT certain questions could lead to legal troubles. At the time, the tech mogul explained that while chat logs are supposed to be private between a user and an LLM, the company was still struggling to work out the kind of 'legal privilege' you might be used to from a flesh-and-blood therapist, lawyer, or doctor. Considering many use ChatGPT as a therapist, lawyer, or doctor, it immediately rang alarm bells.
Basically, if you ask ChatGPT how to run your crime empire or the best way to dispose of a body, and soon find yourself in the dock after your spouse mysteriously vanishes, expect ChatGPT to be a pretty damning witness.
As reported by the New York Post, US lawyers are reminding the general public that AI chatbots aren't trusted allies in cases of wrongdoing. The warning comes after a New York judge ruled that the former boss of a financial services company couldn't keep his chat logs away from prosecutors. Bradley Heppner is the former chairman of GWG Holdings, a bankrupt financial services company, as well as the founder of alternative asset firm Beneficient.
It's not just ChatGPT, with lawyers reiterating that everything from Anthropic's Claude to Google's Gemini could become unlikely witnesses when your chat logs are brought into a court of law.
When federal prosecutors charged Heppner with securities and wire fraud in November 2025, he pleaded not guilty, but he hit a stumbling block after using Claude to prepare reports for his defense.
Prosecutors demanded the chat logs, arguing that because Heppner's attorneys weren't directly involved in the conversations, the material wasn't protected by privilege. Judge Jed Rakoff agreed, ruling in February 2026 that Heppner must hand over 31 documents generated with Claude.

Unlike conversations with your lawyer, which are almost always considered confidential under US law, chats with a chatbot carry no such protection. You need to remember that chatbots aren't lawyers.
At the time, Rakoff wrote that an attorney-client relationship does not exist, "or could exist, between an AI user and a platform such as Claude."
This was hammered home by Alexandria Gutiérrez Swette, a lawyer at New York's Kobre & Kim, who stated: "We are telling our clients: You should proceed with caution here."
It comes as attorneys encourage clients to keep their chats with chatbots a little more private.
The outlet says that over a dozen US law firms have placed warnings on their websites, while some, like the New York-based Sher Tremonte, have added them to hiring agreements. One recent client contract stated that sharing a lawyer's advice or communications with a chatbot would likely erase the client's attorney-client privilege.
Los Angeles-based O’Melveny & Myers and other firms are looking at 'closed' AI systems that could be used to protect legal communications, although the idea is yet to be tested on a wider scale.
A notice on Debevoise & Plimpton's website suggests that anyone turning to AI for legal help should only do so at the advice of a lawyer, potentially starting with a prompt like: "I am doing this research at the direction of counsel for X litigation."
Either way, warnings about clients using AI are becoming increasingly common, as law firms are clearly getting twitchy about cases falling apart in a similar way to Heppner's.