Trump Signed a Bipartisan Deepfake ‘Revenge Porn’ Bill, Which Claims to Offer Victims Greater Protections
The Take It Down Act criminalizes distribution of nonconsensual intimate imagery, including through deepfake “revenge porn.” But some advocates warn the bill is ripe for free speech abuses.
President Donald Trump, seated next to first lady Melania Trump and joined by lawmakers and victims of AI deepfakes and revenge porn, holds a copy of the Take It Down Act during a signing ceremony in the Rose Garden of the White House on May 19. Photo: Getty Images
On Monday, President Trump signed the bipartisan Take It Down Act, which criminalizes the distribution of nonconsensual intimate imagery, such as AI-generated, deepfake “revenge porn.” The new law makes distributing this material a federal crime punishable by up to three years in prison for pornographic images involving minors and two years for images involving adults. It also requires online platforms to establish a request-and-removal system allowing victims to have such photos of themselves taken down within 48 hours. The Take It Down Act’s criminal provisions take effect immediately, but platforms will have one year to set up the required request-and-removal systems.
The bill coasted through Congress with sweeping bipartisan support. In fact, Sen. Ted Cruz (R-Texas) emerged as a key champion of the bill, citing an underage constituent who was victimized by nonconsensual deepfake porn. First lady Melania Trump has also strongly advocated for the bill, as have Sen. Amy Klobuchar (D-Minn.), Rep. Madeleine Dean (D-Pa.), and Rep. María Elvira Salazar (R-Fla.).
Cyber sexual abuse content, frequently and problematically called “revenge porn,” has posed a massive threat for years, especially to women, children, and abuse victims. The relatively recent rise in the dissemination of AI-generated nude or sexual images of real people has been especially alarming; such images can affect victims’ employment or leave them vulnerable to traumatizing sexual harassment. While most states have adopted varying anti-cyber-exploitation laws in recent years to rein in nonconsensual nude images shared by former partners, before the Take It Down Act, few states had laws specifically addressing deepfake content.
The legislation builds on existing federal code involving nonconsensual sexual images, defining these images as including “the uncovered genitals, pubic area, anus or post-pubescent female nipple of an identifiable individual” as well as graphic sexual intercourse, masturbation, and “graphic or simulated lascivious exhibition” of anuses, genitals, or the pubic area, The 19th notes.
But concerningly, the law states that deepfake, sexual abuse imagery must be “indistinguishable from an authentic visual depiction” of an identifiable individual—a considerable threshold that could leave victims in the lurch. Trump himself has advocated for the law, saying it should be used to protect him: “Nobody is treated worse online than I am, nobody,” he said in support of the bill during his address to the joint session of Congress in March. The 19th notes that this comment from the president, who is notorious for his sweeping attacks on free speech and threats to his critics, has “made activists worry that the bill could be used to remove critical political speech, especially in the context of a wider crackdown by the current administration.”
Further, organizations like the Electronic Frontier Foundation have raised concerns about the new law and its potential impacts on sex workers. The law’s 48-hour takedown requirement, the organization says, may not give online platforms enough time to verify whether content is nonconsensual, leading them to quickly remove consensual sexual content, which could unfairly target and punish sex workers.
In April, when the bill first advanced in Congress, Evan Greer, director of the digital rights advocacy organization Fight for the Future, warned that the Take It Down Act “is a classic example of a bill that tries to solve a real problem, but would do more harm than good [because] lawmakers care more about SAYING they’re doing something than ACTUALLY doing something. As it stands, this bill threatens free expression.”
So… sentiment about the Take It Down Act in progressive spaces is pretty mixed at this time. Feminist groups have also been fairly quiet, which could *maybe* have something to do with Trump himself being a legally recognized sexual assailant and serially accused rapist, as well as a frequent, shameless purveyor of deepfake graphics, harmful AI slop, and online disinformation.
Nevertheless, one thing’s for certain: AI-based, deepfake sexual abuse is a huge issue, and thanks to the widespread availability of predatory, creepy apps, it’s never been easier to generate this material. Twitch streamers, celebrities like Emma Watson, abuse victims, and children have all already fallen victim to it. We do need action from lawmakers—but that action should take into account the concerns of digital rights and free speech advocates.
For now, keep resources like the Cyber Civil Rights Initiative—an advocacy organization that combats online abuse by offering a free, 24/7 hotline to victims of image-based sexual abuse at 1-844-878-2274—in your back pocket. CCRI also provides a list of contacts to reach out to if you’re trying to remove such images from major search engines, social media platforms, dating services, and pornography sites.
If you or someone you know is experiencing sexual abuse or domestic violence and seeking support, you can reach the National Domestic Violence Hotline online or at its free 24/7 phone line at 1-800-799-7233.