Trump Signed a Bipartisan Deepfake ‘Revenge Porn’ Bill That Purports to Offer Victims Greater Protections
The Take It Down Act criminalizes distribution of nonconsensual intimate imagery, including through deepfake “revenge porn.” But some advocates warn the bill is ripe for free speech abuses.
On Monday, President Trump signed the bipartisan Take It Down Act, which criminalizes the distribution of nonconsensual intimate imagery, such as AI-generated, deepfake “revenge porn.” The new law makes distributing this material a federal crime, punishable by up to three years in prison for pornographic images involving minors and two years for images involving adults. It also requires online platforms to establish a request-and-removal system allowing victims to have such photos of themselves taken down within 48 hours. The Take It Down Act’s criminal provisions take effect immediately, but platforms will have one year to set up the required request-and-removal systems.
The bill coasted through Congress with sweeping bipartisan support. In fact, Sen. Ted Cruz (R-Texas) emerged as a key champion of the bill, citing an underage constituent who was victimized by nonconsensual deepfake porn. First Lady Melania Trump has also strongly advocated for the bill, as have Sen. Amy Klobuchar (D-Minn.), Rep. Madeleine Dean (D-Pa.), and Rep. María Elvira Salazar (R-Fla.).
Cyber sexual abuse content, frequently and problematically called “revenge porn,” has posed a massive threat for years now, especially to women, children, and abuse victims. The relatively recent rise in the dissemination of AI-generated nude or sexual images of real people has been especially alarming; these images can affect victims’ employment or render them vulnerable to traumatizing sexual harassment. And while most states have adopted anti-cyber-exploitation laws in recent years to rein in the nonconsensual sharing of nude images by former partners, before the Take It Down Act, few states had laws specifically addressing deepfake content.
The legislation builds on existing federal code involving nonconsensual sexual images, defining them to include “the uncovered genitals, pubic area, anus or post-pubescent female nipple of an identifiable individual,” as well as graphic sexual intercourse, masturbation, and the “graphic or simulated lascivious exhibition” of the anus, genitals, or pubic area, The 19th notes.