
X blocks searches for Taylor Swift after explicit AI images of her go viral

Explicit deepfake images have been flooding the social media platform recently.

If you search for 'Taylor Swift' on X right now, you won't get any results.

That's because Elon Musk's social media platform (formerly known as Twitter) has temporarily blocked search terms linked to the pop star after an explosion in AI-generated explicit images - often called deepfakes.

Joe Benarroch, head of business operations at X, told the Wall Street Journal this would only be a "temporary action", one that has apparently been made out of "an abundance of caution".

That doesn't actually give us any clarity on how long the search block will last. Another statement, from the network's Safety account, indicated that X is taking proactive steps to detect and remove posts containing such images - although it didn't mention Swift by name.

The graphic, fake images of Swift began to appear online last week. This sparked fans to flood the platform with real images of the star, while also lobbying X to take stronger action against the images and the accounts accused of spreading them.

Since Elon Musk took over Twitter back in 2022, he's been criticized for cutting back the platform's content moderation teams, and pushing to allow more posts to remain on the site because of his belief in “absolute free speech”.

Meanwhile, the huge rise in interest in AI over the last couple of years has been matched by increasingly widespread access to image and text-generation tools.

Most of these tools have clauses in their terms of service forbidding the generation of images like those of Swift - and, in theory, most programs will block such images at the point of creation.


However, there are workarounds - leading to concern from experts that this is the start of a tidal wave of false imagery that platforms need to be ready for.

This episode is already attracting political attention, too, as demonstrated by a statement on X from Democratic Congressman Joe Morelle.

He said: "The deepfake images made of Taylor Swift are abhorrent. And deepfakes don't just happen to celebrities - they're happening to women and girls everywhere, every day. We need to put a stop to this by passing my legislation, the Preventing Deepfakes of Intimate Images Act."

The sharing of deepfake pornography is already illegal in the UK, which suggests we could soon see a similar policy come into effect in the US, too. That said, Morelle's bill was introduced back in May 2023 and hasn't progressed since then - sometimes these processes can move mighty slowly.

Featured Image Credit: Axelle/Bauer-Griffin / Contributor / NurPhoto / Contributor / Getty