Trump Signs ‘Take It Down Act’ Targeting Deepfake and Revenge Porn

The bill was championed by First Lady Melania Trump, who said she met with survivors to better understand the situation and how children are impacted.

President Donald Trump signed a bill into law on May 19 that aims to mitigate the spread of nonconsensual intimate imagery and AI-generated deepfakes.

“With the rise of AI image generation, women have been harassed with deepfakes and other explicit images distributed against their will,” the president said during the event. “It’s just so horribly wrong, and it’s a very abusive situation … and today we’re making it totally illegal.”

The bipartisan legislation, championed by First Lady Melania Trump and known as the “Take It Down Act,” was signed in a Rose Garden ceremony at the White House.

Introduced by Sens. Ted Cruz (R-Texas) and Amy Klobuchar (D-Minn.), the bill gained momentum through advocacy from the first lady, who tied it to her Be Best initiative focused on child well-being.

The bill criminalizes knowingly publishing, or threatening to publish, nonconsensual intimate images, including those generated artificially. Offenders face up to three years in prison.

Websites and social media platforms are also required to remove such material within 48 hours of a victim’s request, with penalties for noncompliance.

The Senate passed the bill on Feb. 13, followed by the House on April 28, reflecting broad agreement on the need to address digital exploitation.

The first lady expressed pride in the bill’s passage while cautioning about the dangers that children face online.

“Artificial intelligence and social media are the digital candy for the next generation: sweet, addictive, and engineered to have an impact on the cognitive development of our children,” the first lady said. “But unlike sugar, these new technologies can be weaponized, shape beliefs, sadly affect emotions, and even be deadly.”

She said she met with families of survivors over the past few months to better understand the situation and how children are affected.

The legislation was inspired by cases such as that of Elliston Berry, a 14-year-old victim of an AI-generated deepfake, who attended the ceremony and whose story was highlighted during the bill’s development.

Berry and her mother met with Cruz and described Snapchat’s yearlong delay in removing the content. The new law addresses such delays by enforcing strict timelines for removal.

Attended by lawmakers, advocates, and victims, the signing ceremony underscored the legislation’s broad support from both sides of the aisle.

Cruz called it “a major victory in the fight against digital exploitation,” while Klobuchar emphasized its role in “giving victims a voice.”

The president remarked to Berry on the partisan nature of national politics and credited her with helping bridge the divide.

“We’ve shown that bipartisanship is possible. It’s the first time I’ve seen such a level of bipartisanship, and it’s a beautiful thing to do,” Trump said. “I’m not sure you realize, honey, a lot of the Democrats and Republicans don’t get along so well. You’ve made them get along.”

Meta, which operates Facebook and Instagram, endorsed the bill.

“Having an intimate image—real or AI-generated—shared without consent can be devastating, and Meta supports efforts to prevent it,” Meta spokesman Andy Stone said in a statement.

While the law has garnered widespread support, some opponents said its language could be overly broad, potentially raising First Amendment concerns or leading to unintended censorship.

The president dismissed the objections and said the act protects the most vulnerable.

“People talked about all sorts of First Amendment, Second Amendment, any amendment they could make up, and we got it through,” he said.

Among the law’s supporters, the Information Technology and Innovation Foundation, a tech industry think tank, said in a statement that the bill “is an important step forward that will help people pursue justice when they are victims of nonconsensual intimate imagery, including deepfake images generated using AI.”
