In a historic bipartisan move, the United States has criminalized the distribution of non-consensual intimate images—including AI-generated deepfakes. The newly enacted Take It Down Act, signed by President Donald Trump, aims to curb the spiraling abuse of artificial intelligence in digital pornography and strengthen protections for victims of online exploitation.
A Federal Shield Against Digital Exploitation
In a digital age increasingly shaped by the rise of artificial intelligence, the U.S. has taken a landmark step to protect individual privacy and dignity. President Donald Trump has signed into law the Take It Down Act, a sweeping federal measure that makes it a crime to “knowingly publish” or even threaten to publish sexually explicit content without the consent of the person depicted, including deepfakes generated by AI.
The legislation requires online platforms, including social media companies, websites, and file-sharing services, to remove such content within 48 hours of receiving a formal notice from the victim. It also compels these platforms to actively prevent re-uploads of the same content, a provision aimed at breaking the cycle of viral re-victimization.
“This is a monumental advancement in protecting the digital rights and personal autonomy of individuals,” said Senator Martha Reynolds (R-TX), one of the co-sponsors of the bill. “It sends a clear message: consent is not optional in the digital space.”
From Personal Tragedy to National Legislation
The bill’s passage was deeply influenced by the story of Elliston Berry, a 14-year-old girl targeted by an AI-generated deepfake that circulated on Snapchat, leading to months of trauma. Despite repeated pleas, the platform failed to take down the video for nearly a year. Berry and her mother took their fight to Capitol Hill, meeting lawmakers face-to-face, including members of the House Judiciary Committee, to make the case for stronger protections.
First Lady Melania Trump, who personally lobbied for the bill in March, called the issue “heartbreaking.” Her involvement brought a rare first-lady voice to the legislative push, amplifying the voices of young women like Elliston and moving the bill to the top of the congressional agenda.
Tech giants like Meta (parent company of Facebook and Instagram) have come out in support of the act, stating that clear federal guidelines will enhance platform accountability and victim support mechanisms. “It’s time the law caught up with the technology,” said Meta’s Chief Policy Officer.
Deepfakes, AI, and the Rising Toll on Teens
The Take It Down Act emerges against the backdrop of a disturbing surge in AI-generated explicit content, often referred to as “deepfake porn.” New AI tools allow users to generate hyper-realistic images that digitally undress individuals or create false sexual content—without ever touching a camera. What once required professional editing software now takes minutes on a smartphone app.
The mental health consequences are staggering. Experts say victims, especially teenage girls, often endure harassment, bullying, and self-harm, and in extreme cases have been driven to suicide. A wave of AI porn scandals has rocked high schools across multiple states, where students have used these tools to target classmates.
“It’s not just celebrities like Taylor Swift or public figures like Rep. Alexandria Ocasio-Cortez being targeted,” said Dr. Lisa Kramer, a digital ethics scholar. “Ordinary girls are being preyed upon by people they know—sometimes by classmates sitting next to them in class.”
With AI technology evolving rapidly and often outpacing regulation, the Take It Down Act offers a critical first line of defense. While many U.S. states have already implemented laws against revenge porn and deepfake dissemination, this is among the first major federal laws to impose strict obligations on internet platforms themselves.
A Template for the World?
While the U.S. takes this unprecedented step, many countries still lack coherent legal frameworks to address AI-generated sexual exploitation. Digital rights advocates argue that the Take It Down Act could serve as a model for global legislation, setting benchmarks on both criminal liability and corporate responsibility.
Still, challenges remain: identifying anonymous perpetrators, managing cross-border content hosting, and ensuring compliance by platforms that claim to be “neutral” intermediaries.
“This law is a significant first step, but enforcement, victim support, and international cooperation will be key to its long-term success,” said ACLU Legal Director Rachel Monroe.