The landscape of deepfake regulation changed dramatically when President Trump signed the TAKE IT DOWN Act into law on May 19, 2025. The law gave covered platforms one year to stand up a notice-and-removal process, and that deadline has now passed: as of May 2026, every major platform hosting user-generated content is legally required to comply. For victims of AI-generated non-consensual content, this changes everything.
What Is the TAKE IT DOWN Act?
The Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks (TAKE IT DOWN) Act is the first substantial federal law in the United States that directly criminalizes the publication of non-consensual intimate imagery, including AI-generated deepfakes.
Passed with nearly unanimous bipartisan support, the law establishes two critical pillars:
Criminal penalties for anyone who knowingly publishes intimate images, real or AI-generated, without the depicted person's consent. Violations carry up to two years in prison when the victim is an adult and up to three years when the victim is a minor.
Platform obligations requiring covered online platforms to remove reported content within 48 hours of receiving a verified request. Platforms must also remove identical copies and take reasonable steps to prevent the content from being re-uploaded.
What This Means for Victims
If you've discovered AI-generated intimate content depicting you online, the TAKE IT DOWN Act gives you significantly more leverage than you had before:
1. Platforms Must Act Within 48 Hours
Previously, platforms could take weeks or months to respond to removal requests, if they responded at all. Now, covered platforms (broadly, public-facing sites and apps that host user-generated content) must remove reported non-consensual intimate imagery within 48 hours of a valid request. This includes AI-generated deepfakes realistic enough that a reasonable person, viewing them as a whole, could not distinguish them from an authentic image.
2. Copies Must Also Be Removed
Platforms aren't just required to remove the specific reported content. They must also remove identical copies of the material and take reasonable steps to prevent it from being re-posted.
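The statute doesn't prescribe how platforms are supposed to find those copies, but in practice exact duplicates are typically caught by hashing. The following is a minimal sketch of that idea, assuming a hypothetical blocklist seeded from verified reports; it is not any platform's actual pipeline. A cryptographic hash only catches byte-identical files, which is one reason the law separately requires "reasonable steps" against re-uploads: even trivial re-encoding changes the digest, so real matching systems layer perceptual techniques on top.

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Return the SHA-256 hex digest of a file, read in 64 KB chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Hypothetical blocklist seeded from a verified removal request.
blocked_hashes = {sha256_of(Path("reported_image.jpg"))}

def is_identical_copy(upload: Path) -> bool:
    """True only for byte-identical files. A resized or re-encoded
    copy hashes differently, which is why exact matching alone can't
    satisfy the obligation to prevent re-uploads."""
    return sha256_of(upload) in blocked_hashes
```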
3. Criminal Prosecution Is Now Possible
For the first time at the federal level, creating and distributing deepfake intimate imagery is a criminal offense. This gives law enforcement a clear legal framework to pursue perpetrators.
4. The FTC Enforces Compliance
The Federal Trade Commission is tasked with enforcing platform compliance, treating violations as unfair or deceptive practices. This means platforms face real consequences for failing to act.
Key Exceptions to Know
The law includes important carve-outs. Images that the depicted person voluntarily exposed in a public or commercial setting are not covered, though privately sharing an image with another person does not, by itself, count as consent to publish it. The law also exempts disclosures made for lawful law enforcement investigations, legal proceedings, medical treatment, and the reporting of unlawful conduct.
How to Use the TAKE IT DOWN Act
If you need to file a removal request under this law:
- Document everything: take screenshots, save URLs, and note timestamps before filing your report (a simple logging sketch follows this list)
- Use the platform's official reporting process: every covered platform must now have a clear, accessible complaint mechanism
- Verify your identity: platforms may require identity verification to process requests
- Track the 48-hour deadline: if the platform fails to act within 48 hours of a valid request, it may be in violation of federal law (the sketch below also computes this deadline)
- Consider professional help: specialized removal services can coordinate multi-platform takedowns and ensure compliance
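For the first and fourth steps, a lightweight script can keep your records consistent. The sketch below is a minimal, hypothetical example (the file names and log format are ours, not anything a platform or the FTC requires): it records each URL with a UTC timestamp and a SHA-256 hash of the saved screenshot, so you can later show the file hasn't been altered, and it computes the 48-hour deadline from the moment you filed your report.

```python
import hashlib
import json
from datetime import datetime, timedelta, timezone
from pathlib import Path

LOG_FILE = Path("takedown_evidence.json")  # hypothetical log location

def log_evidence(url: str, screenshot: Path) -> dict:
    """Record a URL, the current UTC time, and a SHA-256 hash of the
    saved screenshot so the file can later be shown to be unaltered."""
    entry = {
        "url": url,
        "captured_at_utc": datetime.now(timezone.utc).isoformat(),
        "screenshot": str(screenshot),
        "sha256": hashlib.sha256(screenshot.read_bytes()).hexdigest(),
    }
    entries = json.loads(LOG_FILE.read_text()) if LOG_FILE.exists() else []
    entries.append(entry)
    LOG_FILE.write_text(json.dumps(entries, indent=2))
    return entry

def removal_deadline(report_filed_utc: datetime) -> datetime:
    """The Act gives covered platforms 48 hours from a valid request."""
    return report_filed_utc + timedelta(hours=48)

# Example: a report filed right now must be acted on by...
print(removal_deadline(datetime.now(timezone.utc)).isoformat())
```

Keep the log file somewhere backed up. If a platform misses its deadline, the timestamped entries are exactly what you would hand to an attorney or include in an FTC complaint.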
The Bigger Picture
The TAKE IT DOWN Act is a watershed moment, but it's not a complete solution. Content hosted on platforms outside US jurisdiction, encrypted messaging apps, and the dark web may fall outside its reach. This is why a comprehensive removal strategy — combining legal frameworks, platform reporting, and professional monitoring — remains essential.
At AIFakeRemoval, we've updated all our processes to leverage these new legal requirements. If you're dealing with non-consensual AI-generated content, you now have more tools than ever before. Don't wait — the sooner you act, the more effective removal efforts will be.