Cyber experts warn of information you should never tell AI and what can happen if you do

You might want to be wary of sharing quite so much of your personal info with chatbots.

Even if you're a ChatGPT power user, there are some lines you should make sure to never cross.

Experts have warned that giving some personal details to an AI service could be a catastrophic mistake.

In a new blog post, Stan Kaminsky from cybersecurity firm Kaspersky suggested that there are some types of information that you should definitely not feed into AI generation platforms.

This includes, according to Kaminsky: "Passwords, passport or bank card numbers, addresses, telephone numbers, names, or other personal data that belongs to you, your company, or your customers".

If any of this makes it into one of your prompts or conversations, you will have put sensitive data (perhaps not deliberately) into a system that is under no obligation to keep it confidential.

This is because, if you read their terms of service closely, you'll see that many chatbots and other AI products reserve the right to record your conversations and use them to improve future responses.

It's even possible in some cases that a human will review your conversation to see how it went and whether the responses you got were appropriate.

Getting around this problem isn't hard, though, according to Kaminsky: pick out any sensitive data before you submit a prompt, then "replace these with asterisks or 'REDACTED' in your request".

This might mean that you have to do a little more work in the process of getting your requests ready, but given how bleeding-edge this technology is right now, you don't want to regret cutting corners later.
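For the more technically minded, the redaction step Kaminsky describes can be automated before a prompt ever leaves your machine. Here's a minimal illustrative sketch in Python; the patterns below are rough examples for demonstration only (they are our assumption, not Kaspersky's recommendation) and real PII detection needs far more care:

```python
import re

# Rough, illustrative patterns -- not a substitute for proper PII tooling.
PATTERNS = [
    re.compile(r"\b(?:\d[ -]?){13,16}\b"),      # bank-card-shaped digit runs
    re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"), # email addresses
    re.compile(r"\+?\d[\d\s-]{7,}\d"),          # phone-number-shaped digit runs
]

def redact(prompt: str) -> str:
    """Replace anything matching the patterns with 'REDACTED',
    mirroring the tip of scrubbing sensitive data before sending."""
    for pattern in PATTERNS:
        prompt = pattern.sub("REDACTED", prompt)
    return prompt

cleaned = redact("My card is 4111 1111 1111 1111, reach me at jo@example.com")
print(cleaned)
```

Running the redacted text through the chatbot instead of the raw prompt means that, even if the conversation is logged or reviewed, the sensitive values were never there to leak.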


As well as the chatbots' own terms of service and data usage, there are also plenty of scams and cyberattacks popping up that hinge around getting hold of people's chat logs, so that's another reason to be wary.

It's not all deliberate, either - there have been plenty of reports of bugs where people find that the responses to their requests appear to come from entirely different conversations.

If those responses contained something like your credit card details, you can easily imagine how quickly things could get messy.

Other tips that Kaminsky shared include never uploading a confidential document to any chatbot, even for help understanding it, and being really careful to review your privacy settings - making them as stringent as you can, where possible.

These will help you avoid any accidental leaks or bugs, and could be life-savers.
