California governor signs bills to protect children from fake AI nudes

SACRAMENTO, Calif. — California Governor Gavin Newsom signed two proposals Sunday aimed at protecting minors from the increasingly widespread misuse of artificial intelligence tools to generate harmful sexual images of children.

The moves are part of California’s concerted efforts to strengthen regulation of the AI industry, which increasingly affects Americans’ daily lives but faces little or no oversight in the United States.

Earlier this month, Newsom also approved some of the toughest laws to combat election deepfakes, although the laws are being challenged in court. California is widely seen as a potential leader in regulating the AI industry in the United States.

The new laws, which received overwhelming bipartisan support, close a legal loophole around AI-generated images of child sexual abuse and make clear that child pornography is illegal even if generated by AI.

Current law does not allow district attorneys to prosecute people who possess or distribute AI-generated images of child sexual abuse if they cannot prove that the materials depict a real person, supporters of the bills said. Under the new laws, such an offense would qualify as a felony.

“The creation, possession and distribution of child sexual abuse material must be illegal in California, whether the images are generated by AI or of real children,” said Democratic Assemblymember Marc Berman, who authored one of the bills. “The AI that is used to create these horrific images is trained on thousands of images of real children being abused, re-victimizing those children all over again.”

Newsom also signed two other bills earlier this month aimed at strengthening laws against revenge porn, in an effort to protect more women, teens and others from the sexual exploitation and harassment made possible by AI tools. Under state law, it is now illegal for an adult to create or share sexually explicit AI-generated deepfakes of a person without that person’s consent. Social media platforms are also required to let users report such content for removal.

But some of the laws don’t go far enough, said Los Angeles County District Attorney George Gascón, whose office sponsored some of the proposals. Gascón said the new penalties for sharing AI-generated revenge porn should also have applied to people under 18. State lawmakers narrowed the measure last month to apply only to adults.

“There have to be consequences, you don’t get a pass because you’re under 18,” Gascón said in a recent interview.

The laws come after San Francisco filed a first-in-the-nation lawsuit against more than a dozen websites running AI tools that promise to “undress any photo” uploaded to the site within seconds.

The problem of deepfakes is not new, but experts say it is getting worse as the technology to produce them becomes more accessible and easier to use. Researchers have been sounding the alarm over the past two years about the explosion of AI-generated child sexual abuse materials using depictions of real victims or virtual characters.

In March, a Beverly Hills school district expelled five middle school students for creating and sharing fake nudes of their classmates.

The issue has prompted swift bipartisan action in nearly 30 states to help combat the proliferation of AI-generated sexually abusive materials. Some of those measures protect everyone, while others prohibit only materials depicting minors.

Newsom touted California as an early adopter and regulator of AI technology, saying the state could soon deploy generative AI tools to combat traffic congestion and provide tax advice, even as his administration considers new rules against AI-related discrimination in hiring practices.