Man who posted deepfake images of prominent women faces brutal fine that could change his life


Australia's eSafety Commissioner has called for the maximum penalty

One Australian man could face a life-changing fine in a first-of-its-kind case, as he is charged with creating and distributing AI deepfake images of multiple prominent women.

It's never been easier to create fake pictures thanks to the rise of AI, and ChatGPT only recently revealed record-breaking user numbers after introducing its newly-improved image generation tools.

While most of the conversation surrounding these rapid enhancements has focused on people's job security, one under-discussed danger is the growing potential for these tools to produce exploitative sexual images of people without their consent.

Known more commonly as 'deepfakes', these images previously worked by superimposing the face of a famous individual, or someone the creator knew, onto a sexual (typically pornographic) image or video to make it appear that the person was genuinely depicted.

Deepfakes create fake - and typically pornographic - images of people without their consent (Getty Stock)

With advancements in AI, though, people can create entirely fictional and convincing deepfakes that breach consent and cause significant harm to the person depicted - and it's growing harder and harder to distinguish real from fake.

Deepfakes are now at the heart of a landmark legal case in Australian courts, as Anthony Rotondo faces a fine of up to $450,000 after failing to remove intimate AI deepfakes of several prominent Australian women from a pornography site, as reported by The Guardian.

The eSafety Commissioner - Australia's independent regulator for online safety - has pushed for courts to consider the maximum penalty fine, and has outlined the significant harm that Rotondo's actions have caused.

"The penalty will deter others from engaging in such harmful conduct," outlined a spokesperson for the online safety regulator, adding that it needs to reflect the seriousness of Rotondo's case "and the significant impacts on the women targeted."

It was initially difficult for Australian officials to do anything about Rotondo's actions, as he refused to comply with their demands to take down the images as he was at that point based in the Philippines.

However, shortly after he returned to Australia the case against Rotondo was launched, and he was also fined for contempt of court for failing to take down the images.

"Deepfake image based abuse is not only becoming more prevalent but is also very gendered and incredibly distressing to the victim-survivor," outlined eSafety commissioner Julie Inman Grant during the court case.

Deepfake production has dramatically increased in the last six years, and the majority involve sexual depictions of women and girls (Getty Stock)

She noted that the production of deepfake material has increased by 550% since 2019 - likely due to the aforementioned advancements in AI technology - and that pornographic videos make up 99% of the deepfakes being produced. Additionally, 99% of the imagery in these videos depicts women and girls.

There have previously been reports of girls as young as 12 years old being 'traumatized' by deepfakes produced by school bullies, and it's a danger that manifests far beyond what you might initially expect.

"Shockingly, thousands of open-source AI apps like these have proliferated online and are often free and easy to use by anyone with a smartphone," Inman Grant continued. "So these apps make it simple and cost-free for the perpetrator, while the cost to the target is one of lingering and incalculable devastation."

US President Donald Trump has just signed the Take It Down Act into law, which officially criminalizes the spread of deepfakes and nonconsensual intimate images (NCII) on social media, but online safety experts have expressed concerns that this law isn't enough, and could end up doing more harm than good for victim-survivors of deepfake-related crimes.

Featured Image Credit: Westend61 via Getty