Published 16:32 30 Jul 2024 GMT+1

12-year-old girl left 'traumatized' after a deepfake image is shared by school bullies on social media

Police have apologized for "mistakes" made while investigating the deepfake image shared by school bullies

Rikki Loftus

Featured Image Credit: Fiordaliso/Userba011d64_201/Getty


A 12-year-old girl has been left “traumatized” after a deepfake image was shared by school bullies on social media.

The family of the young girl have accused the police of not doing enough to protect their daughter after bullies posted a deepfake image of her online.

Deepfakes are synthetic media that digitally manipulate an image to replace one person's likeness convincingly with that of another.

The young girl was left traumatized after the image circulated online (Fiordaliso/Getty)


Deepfake pornography has become an increasing problem since the technology emerged.

The edited photo was shared on Snapchat and West Yorkshire Police in the UK initially told the family that nothing could be done because the social media platform is based in the US.

The parents of the 12-year-old contacted police via the non-emergency number earlier this year after finding out that the image was circulating online.

An officer visited their home but nine days later, the family received a text to say that the case had been closed and no suspect was identified.

Deepfake technology allows users to manipulate media (Userba011d64_201/Getty)

After a complaint, the force has now admitted that it made “mistakes” and has since apologized to the family.

They say that the incident is now being “thoroughly investigated”.

Speaking to the BBC, the girl’s mother said: “It has honestly been the most horrendous thing to go through.

“This image is being shared by children who presumably think it is funny but it is basically child pornography.

“How can someone make a fake pornographic picture of a 12-year-old girl for people to share again and again - and police do nothing at all?”

She went on to add: “It just felt to me that they [the police] just weren't bothered at all.

Deepfakes replace one person's likeness convincingly with that of another (Tero Vesalainen/Getty)

“We called back several times to find out why they weren't investigating and my husband went to the police station.

“At one point we were told the log said they didn't have our phone number - but we'd been sent that text message.

“We thought at least they might contact my daughter's school before the summer holidays to try and stop the image being spread further, but they didn't and now it is all over her social media.”

A West Yorkshire Police spokesperson said: “We acknowledge that this matter was not handled in a satisfactory manner and our method of communication does not reflect the appropriate level of victim care.

The incident is now being thoroughly investigated (Userba011d64_201/Getty)

“The officer in the case has been advised accordingly and we have since spoken with the victim’s family to assure them this is being thoroughly investigated.

“Further information about this matter has since come to light. Our inquiries remain ongoing.”

A spokesperson for Snapchat said: “Any activity that involves the sexual exploitation of a child, including the sharing of explicit deepfake images, is abhorrent and illegal, and we have zero tolerance for it on Snapchat.

“If we find this content through our proactive detection technology, or if it is reported to us, we will remove it immediately and take appropriate action. We also work with the police, safety experts and NGOs to support any investigations.”
