

Despite supporting AI advancement for most of his second term as president, Donald Trump has now signed a new landmark law that curbs the 'rise of AI' in one key area.
Despite optimism towards artificial intelligence in the technology world, one of the biggest black marks attributed to the software is its potential to create illegal and nonconsensual images of people, especially now that image generation is far more capable than what was possible even a year ago.
Known officially as nonconsensual intimate images (NCII), or more commonly as 'AI deepfakes', these are fake, typically sexually explicit images of individuals created without their consent.
These images are then shared on social media and potentially even on black market sites, and with improvements to generative AI performance they appear near indistinguishable from the 'real thing', leading to claims of consent breaches, sexual exploitation, and harassment.
One major step has now been made by the United States government to fight back against AI deepfakes though, as President Trump has now signed the 'Take It Down Act' into law, as reported by The Verge.
This new act not only criminalizes the distribution of NCII - with a potential punishment of three years in prison alongside a fine - but also requires social media platforms to take down these images once a report has been submitted, alongside any potential copies that have been made.
While this is, in premise, a fantastic new law that fulfills what many civil rights and feminist organizations have been requesting for years, experts have pointed out some potentially major holes that might allow it to do 'more harm than good'.
Mary Anne Franks, president of the Cyber Civil Rights Initiative (CCRI), has been one of the more critical voices against the Take It Down Act despite broadly supporting what it claims to work towards.
She outlined on Bluesky that it "includes a poison pill," highlighting that the act "lacks adequate safeguards against false reports, is over- and under-inclusive, and gives false hope to victims."
A lengthy statement from the Cyber Civil Rights Initiative delves further into the potentially dangerous aspects of the Take It Down Act, drawing attention to a major 'loophole' that the law opens up.
"CCRI must, however, note its objection to the exception provided for a 'person who possesses or publishes an intimate visual depiction of himself or herself,' which creates a dangerous loophole that would seemingly allow a person to disclose intimate images without consent so long as that person also appears in the image."
In addition, the statement raises concerns about false reports in two different ways. Firstly, the act implies that sites might be obligated to remove sexually explicit content that doesn't qualify as NCII, and secondly, there is the fear that sites will be overburdened with false reports, so actual cases of nonconsensual images will be overlooked or take longer to be dealt with.
There also remains the danger that sites won't be pressured enough to comply with this ruling, especially "platforms that are closely aligned with the current administration," which potentially include Elon Musk's X and Mark Zuckerberg's Facebook and Instagram.
Finally, many also have a general worry that the act will serve as a widespread tool to remove or censor any content that the current administration disagrees with, and the suggestion of such came from the president's own mouth.
"I'm going to use that bill for myself too, if you don't mind," declared President Trump when signing the Take It Down Act, "because nobody gets treated worse than I do online. Nobody."
One comment on Reddit remarks that "the Patriot Act was enacted to protect America from terrorists. The Take It Down law was created to protect the President's feelings."
Another adds that the new law is "censorship, plain and simple," claiming that President Trump "wants to control all media and news. Welcome to dictatorship."