Man who posted deepfake images of prominent women faces brutal fine that could change his life

Published 11:39 30 May 2025 GMT+1

Australia's online safety regulator has called for the maximum penalty fine

Harry Boulton

Featured Image Credit: Westend61 via Getty
AI
Money
World News

One Australian man could face a life-changing fine in a first-of-its-kind case, as he is charged with the creation and distribution of AI deepfake images involving multiple prominent women.

It's never been easier to create fake pictures thanks to the rise of AI, and ChatGPT only recently revealed record-breaking numbers after introducing its newly improved image generation tools.

While most of the conversation surrounding these rapid enhancements has focused on people's job security, one under-discussed danger is the growing potential for similar tools to produce exploitative sexual images of people without their consent.

Known more commonly as 'deepfakes', these images previously worked by superimposing the face of a famous individual, or someone the creator knew, onto a sexual (typically pornographic) image or video, making it appear as if that person featured in it.

Deepfakes create fake - and typically pornographic - images of people without their consent (Getty Stock)

With advancements in AI, though, people can create entirely fictional and convincing deepfakes that breach consent and cause significant harm to the affected party - and it's growing harder and harder to distinguish between real and fake.

Deepfakes are now at the heart of a landmark legal case in Australia, where Anthony Rotondo faces a fine of up to $450,000 for failing to remove intimate AI deepfakes of several prominent Australian women from a pornography site, as reported by The Guardian.

The eSafety Commissioner - Australia's independent regulator for online safety - has pushed for courts to consider the maximum penalty fine, and has outlined the significant harm that Rotondo's actions have caused.

"The penalty will deter others from engaging in such harmful conduct," outlined a spokesperson for the online safety regulator, adding that it needs to reflect the seriousness of Rotondo's case "and the significant impacts on the women targeted."

It was initially difficult for Australian officials to do anything about Rotondo's actions, as he refused to comply with their demands to take down the images while he was based in the Philippines.

However, shortly after he returned to Australia, the case against Rotondo was launched, and he was also fined for contempt of court for failing to take down the images.

"Deepfake image based abuse is not only becoming more prevalent but is also very gendered and incredibly distressing to the victim-survivor," outlined eSafety commissioner Julie Inman Grant during the court case.

Deepfake production has dramatically increased in the last six years, and the majority involve sexual depictions of women and girls (Getty Stock)

She illustrated that the production of deepfake material has increased by 550% since 2019 - likely due to the aforementioned advancements in AI technology - and pornographic videos make up 99% of the deepfakes being produced. Additionally, within these videos, 99% of the imagery is of women and girls.

There have previously been reports of girls as young as 12 years old being 'traumatized' by deepfakes produced by school bullies, and it's a danger that manifests far beyond what you might initially expect.

"Shockingly, thousands of open-source AI apps like these have proliferated online and are often free and easy to use by anyone with a smartphone," Inman Grant continued. "So these apps make it simple and cost-free for the perpetrator, while the cost to the target is one of lingering and incalculable devastation."

US President Donald Trump has just signed the Take It Down Act into law, which officially criminalizes the spread of deepfakes and nonconsensual intimate images (NCII) on social media, but online safety experts have expressed concerns that this law isn't enough, and could end up doing more harm than good for victim-survivors of deepfake-related crimes.
