In May 2025, President Donald Trump signed the federal “Take It Down Act.” This law prohibits the publication of intimate images, including deepfake content, without the consent of the person depicted. The law received broad support both in Congress and among technology companies, such as:
- Meta
- TikTok
First Lady Melania Trump has been a strong supporter of the initiative, emphasizing its importance in protecting minors from harmful online content.
What Does the “Take It Down Act” Do?
The law criminalizes the intentional distribution of intimate images without the consent of the person depicted, covering both real images and images created by artificial intelligence. A violation is punishable by up to three years in prison. This applies both to photos and videos that were originally obtained legally and to content created using deepfake technology. For example, it is now a crime to publish a sexually explicit video generated from a photo of a victim without their consent.
Obligations of Internet Companies
The law also requires technology platforms to remove such content within 48 hours of a victim’s request. This creates a mechanism to quickly protect victims from online exploitation. Companies that fail to meet this deadline may face administrative and civil liability, including fines and legal action by victims.
Melania Trump’s Role and Public Response
The U.S. First Lady gave a speech in support of the bill, stating that “this legislation is a national victory that will help families protect their children from online exploitation.” It was noted that teenagers, especially girls, are often victims of abusive digital content, including pornographic deepfakes. The law also received support from prominent public figures. Businesswoman and DJ Paris Hilton called it “a decisive step in the fight against the unauthorized distribution of images on the Internet.”
Legal Consequences of Violating the Law
Criminal liability. Violators of the law will face prosecution, with penalties including:
- Imprisonment for up to 3 years
- Fines
- A criminal record, which may affect employment and the ability to obtain licenses
Special attention is paid to the protection of minors: if the distributed content involves a child, charges under child pornography laws may be added.
Ability to File a Civil Lawsuit
Victims may also file a civil lawsuit for damages, including:
- Moral damages
- Loss of reputation
- Lost income
- Legal costs
You can read more about this in our article Defamation Claim in California.
Critics’ Views
Despite widespread support, the law has raised concerns from digital rights organizations. The Electronic Frontier Foundation has stated that the current version of the bill could threaten free speech and privacy rights and undermine encryption mechanisms.
The Internet Society, for its part, emphasized that the requirement to remove content within 48 hours could create a risk of over-moderation, meaning platforms may automatically remove content without proper legal assessment. However, the law does include procedural safeguards, including the possibility of an appeal and the obligation to notify the alleged infringer.
Child Protection As a Key Objective
The Take It Down Act is particularly important in the context of combating cyberbullying and the online exploitation of youth. According to reports from the National Center for Missing and Exploited Children (NCMEC), complaints about the distribution of intimate content involving minors continue to rise. Melania Trump, speaking at the March 2025 hearing, stated: “It’s heartbreaking to see what teens, especially girls, are going through when faced with the devastating effects of deepfakes and revenge porn.” This legislation is the result of her vigorous campaign to create a safe digital environment for future generations.
How Does the Law Apply in California?
In California, privacy has traditionally been a high priority. The new federal law supplements existing state laws, including California Penal Code Section 647. In addition, individuals in California who are injured by a violation of the new federal law may be able to bring additional claims under state law. You can learn more about the options for filing a civil lawsuit in California in our article The Law in California Involving Intentional Infliction of Emotional Distress.
Potential Impact on the Technology Industry
The passage of the law also requires Internet companies to review their content moderation policies. Companies will be required to create a mechanism to respond quickly to user complaints and implement systems to track deepfakes. Google, Meta, and TikTok supported the initiative, recognizing the need for new rules amid the development of AI. However, they cautioned that a balance must be struck between protecting the rights of victims and preserving free speech.
The Problem of Deepfakes and Children: The Legal Challenge of The New Era
In recent years, AI-based image-generation technologies have reached a remarkable level of sophistication. They make it possible to create realistic deepfake videos that are sometimes almost indistinguishable from the real thing. This has created a new layer of legal issues, especially when minors are involved. The Take It Down Act was introduced in response to the rapidly growing threat of digital abuse, including the abuse of children.
It is important to note that deepfakes depicting minors may fall not only under the provisions of the new law but also under the stricter provisions governing the distribution of child pornography. Whereas earlier criminal proceedings required proof of an image’s authenticity, the new law explicitly criminalizes deepfakes as well. This makes the law a more versatile tool in the fight against child exploitation in the digital space.
In addition, the parents or legal guardians of the minors concerned can now seek compensation through civil proceedings. They may rely on provisions related to invasion of privacy and emotional harm. When the dissemination of intimate material is intentional, the action may also include a claim for moral damages.
Enforcement Complexity and Technology Challenges
Despite the law’s lofty goals, effective enforcement requires significant resources from both the government and technology platforms. A key question is how to determine whether an image is a deepfake. Companies will need to implement automated systems to detect fake content, which carries a risk of false positives and the potential censorship of legitimate material.
In addition, the lack of a consistent appeals system can lead to restrictions on free speech. For example, a person falsely accused of posting prohibited content risks being blocked without the ability to recover their data. In this regard, the legal community calls for the creation of a transparent and fair dispute resolution process. This will help balance the goals of the law with the fundamental rights of users, including:
- Freedom of speech
- The presumption of innocence
- The right to a fair trial
Conclusion
The Take It Down Act is an important step in the fight against digital abuse, especially of children and young people. Despite the controversy over potential restrictions on freedom of expression, the Act gives victims a real and effective tool to protect themselves. It is crucial for parents, educational institutions, and digital platforms to be aware of the changes in the legislation and to take proactive measures to protect minors from exposure to harmful online content.
If you have been a victim of revenge porn or deepfake, contact KAASS LAW. Call 844-522-7752 for a free consultation. We can help you evaluate your case, identify possible legal options, and pursue justice.