- The law criminalizes sharing intimate images without consent.
- The First Lady calls it a victory for protecting families and children.
- Social media platforms must remove flagged content under the law.
WASHINGTON: President Donald Trump has signed into law legislation that makes it illegal to share intimate images, whether authentic or AI-manipulated, without the subject’s consent.
The law specifically targets “deepfakes”: digitally manipulated images and videos, often created with artificial intelligence, that are used to harass or humiliate people online.
Under the new rules, anyone who shares this type of content without permission could face up to three years in prison.
The Take It Down Act, which received strong bipartisan support, makes the non-consensual distribution of intimate images a criminal offense and requires online platforms to remove them.
“As AI-generated images become more common, many women have suffered harassment from deepfakes and other explicit content shared against their will,” Trump said at the bill-signing ceremony in the White House Rose Garden.
“Today, we are making this behavior completely illegal,” the President added. “Anyone who intentionally shares explicit images without the individual’s consent will be subject to a three-year prison sentence.”
First Lady Melania Trump publicly backed the bill in early March and made a rare appearance at the signing ceremony.
Since her husband took office on January 20, she has largely kept a low profile, spending limited time in Washington.
During her remarks, the First Lady described the legislation as a “national victory for parents and families striving to protect children from online threats.”
“This law marks a significant advancement in our mission to ensure that every American, especially young people, can feel more secure against the misuse of their images or identities,” she noted.
Deepfakes use artificial intelligence to produce realistic but fabricated videos, including explicit content distributed without the consent of those depicted.
Some states, including California and Florida, already have laws against sharing sexually explicit deepfakes, but there are worries that the federal Take It Down Act may give authorities excessive censorship power.
The Electronic Frontier Foundation, an organization dedicated to defending free speech, warns that the law could allow those in power to pressure platforms into censoring legitimate content they disapprove of.
The Act requires social media companies and websites to remove non-consensual intimate imagery within 48 hours of being notified by a victim.
Addressing Harassment and Exploitation
Fueled by a surge in AI tools, including apps that can digitally remove clothing from images, the spread of non-consensual deepfakes is outpacing efforts around the world to regulate the technology.
High-profile victims of deepfake videos include celebrities like Taylor Swift, but experts caution that women who are not in the spotlight are equally at risk.
Reports of deepfake scandals have emerged in schools across the U.S., with many teenagers targeted by their peers.
These unauthorized images can lead to harassment, bullying, and blackmail, potentially resulting in severe mental health issues, experts warn.
Renee Cummings, an expert in AI ethics and criminology at the University of Virginia, described the new law as a “crucial move” in combating the risks posed by AI-generated deepfakes and non-consensual content.
“Its success hinges on prompt enforcement, harsh penalties for offenders, and adaptability to evolving digital dangers,” Cummings told AFP.