An important compliance deadline under the TAKE IT DOWN Act (Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks Act, Pub. L. No. 119–12) is less than two weeks away.
By May 19, 2026, every covered platform must have a functioning system in place to receive and act within 48 hours on requests to remove nonconsensual intimate images (NCII), including AI-generated digital forgeries (also known as “deepfakes”).
The Federal Trade Commission (FTC) is tasked with enforcing these provisions, and we’re hearing strong signals that the agency is gearing up for an immediate crackdown. If you’re not prepared, the risk isn’t hypothetical.
Here’s a look at who’s covered and what’s required.
What is a “Covered Platform”?
The requirement to create an NCII notification program applies to “covered platforms,” defined in the statute as any website, online service, or mobile application that:
- Serves the public; and
- Either primarily provides a forum for user-generated content, including messages, videos, images, games, and audio files; or, in the regular course of trade or business, publishes, curates, hosts, or makes available NCII content.
That test covers a lot of ground. Sites that routinely host nonconsensual intimate imagery are squarely within scope. But the user-generated-content prong could also sweep in any widely available app or website that gives users a forum to publish content, well beyond the categories you might expect. Coverage is broader still because the statute extends FTC jurisdiction to nonprofit organizations, a departure from the FTC Act’s general framework.
That said, the word “primarily” is where much of the interpretive action will occur. How do we figure out when a site or app “primarily provides a forum for user-generated content”? What about ecommerce resale sites that rely on sellers to upload content? Even with respect to major social media platforms, is there an argument that they “primarily” exist to generate advertising revenue, and not to provide a content-sharing forum? Where’s the line?
There are similar open questions around what it means to "serve[] the public." Internal company sites and proprietary apps likely aren’t covered. But what about platforms with age or geographic restrictions, or those requiring paid membership? If any member of the public can pay for access, does that qualify?
The FTC will need to resolve these questions through enforcement actions or guidance. Until it does, the lowest-risk approach is to implement a program if there is a reasonable argument that your site or app is covered.
What does the law require of a “covered platform”?
The statute is prescriptive. Covered platforms must do three things:
1. Establish a removal process. Platforms must establish a process by which individuals or their authorized representatives can submit a removal request if they believe their images have been uploaded without consent. The process must require requests to include an electronic or physical signature, information sufficient to locate the content, a good-faith statement that the content is nonconsensual, and contact information for the person submitting the request. (A rough sketch of what such an intake record might look like appears after this list.)
2. Give people clear and conspicuous notice of that process. What good is a process if nobody knows it exists? Each covered platform must publish an easy-to-find, plain-language description of the removal process. That notice can be provided directly or through a prominent link.
3. Follow through. Just establishing a process and telling people about it isn’t enough. Platforms must actually follow through and respond to valid requests by removing the content within 48 hours. And removing the content includes making reasonable efforts to remove known identical copies.
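
For teams standing up the intake side of such a process, the statutory elements map onto a fairly simple record. The sketch below is illustrative only: the field names, the facial-completeness check, and the deadline calculation are our assumptions about one reasonable implementation, not language drawn from the Act or from FTC guidance, and whether a given request is legally valid remains a judgment for counsel.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta, timezone

# Hypothetical sketch of a removal-request intake record reflecting the
# statute's four required elements and the 48-hour response window.
# Names and logic are illustrative assumptions, not statutory text.

RESPONSE_WINDOW = timedelta(hours=48)

@dataclass
class RemovalRequest:
    signature: str               # electronic or physical signature of the requester
    content_locators: list[str]  # URLs or identifiers sufficient to locate the content
    good_faith_statement: str    # statement that the content is nonconsensual
    contact_info: str            # how the platform can reach the requester
    received_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

    def is_facially_complete(self) -> bool:
        """Check that all four required elements are present (a facial check only)."""
        return all([
            self.signature.strip(),
            self.content_locators,
            self.good_faith_statement.strip(),
            self.contact_info.strip(),
        ])

    def respond_by(self) -> datetime:
        """Deadline to remove the content: 48 hours after receipt of the request."""
        return self.received_at + RESPONSE_WINDOW
```

A production system would also need to capture the follow-through obligations in item 3, including tracking the reasonable efforts made to locate and remove known identical copies.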
What happens if you don’t comply?
The statute treats noncompliance as a violation of an FTC trade regulation rule, authorizing the FTC to seek civil penalties, restitution, and other equitable relief like injunctions. With penalties of more than $53,000 per violation available, exposure can escalate quickly depending on how the FTC counts individual violations. If, for example, each impression of an image a platform failed to remove were treated as a separate violation, a single image viewed 100,000 times would imply more than $5 billion in theoretical exposure, and the numbers for a major platform could be staggering.
The FTC has made it clear it intends to enforce aggressively from day one. FTC Chairman Andrew Ferguson has touted the TAKE IT DOWN Act as a major legislative accomplishment and said the agency is preparing for “robust enforcement.” The agency has also said it is building a dedicated complaint-intake site to help detect platform noncompliance and is actively hiring personnel with the skills necessary to support enforcement.
The mechanics of detecting noncompliance are straightforward. On the front end, the FTC can simply check whether a covered platform has published the required notice; no notice means a violation. On the back end, a single consumer complaint showing that a valid removal request was submitted and the content was still live more than 48 hours later is enough to establish noncompliance.
Given how easy violations are to detect and prosecute, and how significant the penalties can be, erring on the side of implementing a compliant program is the prudent approach to managing risk. There are no affirmative reporting obligations for covered platforms, but the FTC has authority to investigate and compel documents at any time.
All indicators suggest this statute will not sit on the shelf. If your platform does not yet have a compliant program in place, or you aren’t sure whether it does, we can help.