
Which state laws protect children against AI-generated deepfakes?

(NewsNation) — The rapid rise in sexually explicit AI-generated deepfakes depicting children in recent years has led to calls in many states to enact laws to protect children.

Lawmakers in more than a dozen states have passed legislation ensuring that local prosecutors can bring charges under state law for AI-generated “deepfakes” and other sexually explicit images of children.

Deepfakes are videos, photographs or audio recordings that appear real but have been manipulated with artificial intelligence. A deepfake can make it look like someone is saying or doing something they never actually said or did.

Many of these laws target sexually explicit or pornographic images and videos, and some expand existing nonconsensual intimate imagery laws, according to the National Conference of State Legislatures.


States with laws protecting children against deepfakes

Governors in more than a dozen states have signed laws this year cracking down on digitally created or altered child sexual abuse images, according to a review by the National Center for Missing and Exploited Children.

According to an analysis by MultiState Partners shared with NewsNation, 14 states have laws with specific references to children that protect them against deepfakes and other AI-generated content.

These include Utah, Idaho, Georgia, Oklahoma and Tennessee.

Laws in five other states will take effect at the beginning of 2025.

In September, California closed a loophole regarding AI-generated child sexual abuse images, making clear that child pornography is illegal even if it is created by AI.

The previous law did not allow district attorneys to go after people who possessed or distributed AI-generated child sexual abuse images unless they could prove the materials depicted a real person; under the new law, those offenses qualify as felonies.


In July, South Dakota updated its laws against child sexual abuse images to include AI-generated images. The law includes mandatory minimum sentences of one, five, and 10 years for first-time possession, distribution, and production offenses, respectively.

There are currently no federal laws addressing nonconsensual deepfake pornography, but proposed bills would address the issue for adults.

The Disrupt Explicit Forged Images and Non-Consensual Edits Act, or DEFIANCE Act, would allow victims of deepfake pornography to file lawsuits as long as they can prove the deepfakes were made without their consent.

The Take It Down Act would require platforms to remove both revenge porn and nonconsensual deepfake porn.

But Justice Department officials say they already have the tools under federal law to go after criminals who use such images.

A federal law signed in 2003 bans the production of visual depictions of children engaging in sexually explicit conduct that are deemed “obscene,” including drawings. The Justice Department has used that law to charge cases involving cartoon images depicting child sexual abuse, noting there is no requirement that “the minor depicted actually exists.”

Will deepfake laws work to protect children?

Justin Patchin, a criminal justice professor at the University of Wisconsin-Eau Claire and co-director of the Cyberbullying Research Center, said that while the law is an important tool for criminal prosecutions, it likely won’t prevent the behavior, especially when the deepfakes are created by other students.

“Young people are not deterred by the threat of official punishment,” he said. “Informal punishments, such as what their friends might think, what their parents might do, or how their teachers might feel about them, deter them more.”

He added that laws are a “necessary but not sufficient response” to nonconsensual explicit deepfakes.

While technology has outpaced legislation and likely will continue to do so, many argue that laws are necessary to help law enforcement and prosecutors go after perpetrators.

“We need to signal early and often that this is a crime, that it will be investigated and prosecuted when the evidence supports it,” Steven Grocki, head of the Justice Department’s Child Exploitation and Obscenity Section, said in an interview with The Associated Press. “And if you sit there and think otherwise, you are fundamentally wrong. And it’s only a matter of time before someone holds you accountable.”

“These laws exist. They will be used. We have the will. We have the resources,” Grocki said.

After California expanded its law, Ventura County District Attorney Erik Nasarenko said it paved the way for his office to pursue eight cases involving AI-generated content between last December and mid-September.


Patchin said it is more important to focus on education and awareness about the dangers of deepfakes, both in schools and among parents.

The Associated Press contributed to this story.

Copyright 2024 Nexstar Media, Inc. All rights reserved. This material may not be published, broadcast, rewritten or redistributed.

For the latest news, weather, sports and streaming video, go to Queen City News.