President Donald Trump has signed the bipartisan Take It Down Act into law in an effort to combat non-consensual intimate imagery, including deepfakes and revenge porn.
First lady Melania Trump championed the bill as part of her renewed “Be Best” initiative. It passed both the House and Senate and was signed into law by her husband during a signing ceremony in the White House Rose Garden on May 19.
Introduced last year by Sens. Ted Cruz, R-Texas, and Amy Klobuchar, D-Minn., the Take It Down Act received overwhelming bipartisan support, passing the Senate in February and the House by a 409-2 vote on April 28.
While most states have laws protecting people from non-consensual intimate images and sexual deepfakes, those laws vary in how they classify the crime and in the penalties they impose. And victims have struggled to have images depicting them removed from websites, increasing the likelihood that the images keep spreading and that victims are retraumatized.
“The Take It Down Act will protect victims of digital exploitation, hold internet platforms accountable by requiring them to remove such imagery from their platform and provide justice for victims by allowing prosecutors to go after those who publish nonconsensual explicit images online,” White House Press Secretary Karoline Leavitt told reporters.
Here’s what to know about the Take It Down Act and what it means.
What is the Take It Down Act?
The Take It Down Act criminalizes the publication of non-consensual intimate imagery, also known as NCII.
That includes AI-generated images, also referred to as deepfake revenge pornography. The law requires social media platforms and similar websites to remove revenge pornography content within 48 hours of notice from a victim.
The key provisions in the Take It Down Act include:
Criminalizes non-consensual intimate imagery, making it a federal crime to knowingly publish or share it on social media and other online platforms, and clarifies that consent to create an image does not mean consent to share it. Non-consensual intimate imagery is defined to include realistic, computer-generated pornographic images and videos that depict identifiable, real people.
Requires websites and online platforms to take down non-consensual intimate imagery within 48 hours of a verified request from the victim, and to make reasonable efforts to remove copies and reposts of the images. (The Federal Trade Commission is charged with enforcing this section.)
Protects free speech by targeting the “knowing publication” of non-consensual intimate imagery and requires that the computer-generated content meet a “reasonable person” test for appearing indistinguishable from an authentic image.
Includes exceptions allowing medical professionals and law enforcement to handle digital forgeries of identifiable individuals when acting “reasonably and in good faith.”
Why was the Take It Down Act introduced?
The Take It Down Act was introduced by Republican Sen. Ted Cruz of Texas and Democratic Sen. Amy Klobuchar of Minnesota in 2024. It received overwhelming support from both sides of the aisle, passing the Senate in February and the House by a 409-2 vote on April 28.
Cruz said the bill was inspired by Elliston Berry and her mother after the popular social media platform Snapchat refused for nearly a year to remove an AI-generated “deepfake” of the then-14-year-old. He thanked the first lady and the bill’s bipartisan supporters “for locking arms in this critical mission to protect Americans from online exploitation.”
Klobuchar said in a statement after the bill’s passage that “we must provide victims of online abuse with the legal protections they need when intimate images are shared without their consent, especially now that deepfakes are creating horrifying new opportunities for abuse.”
“These images can ruin lives and reputations,” she continued. “But now that our bipartisan legislation is becoming law, victims will be able to have this material removed from social media platforms and law enforcement can hold perpetrators accountable.”
What are deepfakes?
Deepfakes are photos, videos or audio altered or created by AI to appear real, often without the consent of the person depicted.
Many creators of deepfakes digitally place people in compromising situations, depicting them inappropriately or putting them in settings that could spark controversy or embarrassment. The images have become a major cause for concern amid the explosion of AI technology.
What is Melania Trump’s involvement?
Melania Trump heavily lobbied for the Take It Down Act, framing it as a way to protect young people, especially girls, from the abuse of non-consensual intimate imagery and deepfakes.
On March 3, the first lady called the consequences of non-consensual sexually explicit images “toxic” in her first public comments since her husband returned to the White House.
“It’s heartbreaking to witness young teens, especially girls, grappling with the overwhelming challenges posed by malicious online content, like deepfakes,” Melania Trump said on Capitol Hill during a rare public appearance.
During the first Trump administration, Melania Trump advocated against cyberbullying through her “Be Best” campaign, which emphasized children’s “social, emotional, and physical health” and addressed issues such as social media use and opioid abuse.
Contributing: Savannah Kuchar and Swapna Venugopal Ramaswamy, USA TODAY
This article originally appeared on USA TODAY: Take It Down Act now law: How it targets revenge porn, deepfakes