Minnesota bill seeks to disable AI tech used to create nude or sexual images of people

Minnesota lawmakers are working to close off access to artificial intelligence technology that allows people to create fake nude pictures or pornography of real people.
The Senate Judiciary and Public Safety Committee held extensive discussion Wednesday on a bill that would outlaw the use of “nudifying” artificial intelligence technology. The bill was held over for further debate on how fines should be structured and whether people depicted in the fakes would be eligible for restitution.
The proposal would require AI companies to disable features that let apps or websites make someone appear nude in photos or videos.
The bill’s author, Sen. Erin Maye Quade, said it’s meant as an extension of the state’s relatively new law barring nonconsensual sexual deepfakes. Right now, AI technology is readily available that can create photos or videos depicting people naked without their consent.
People have taken advantage of the loophole, she said.
“For these AI-generated photos and videos, the harm begins at creation,” said Maye Quade, DFL-Apple Valley. “Dissemination currently in Minnesota of non-consensual, sexual deepfakes is illegal. But they can download these apps on their phones, and they’re doing that. They’re nudifying their teachers, their classmates, their siblings, friends.”
While Maye Quade said sites like Facebook don’t currently allow nudification features, some platforms still cater to users who want to transform images of unsuspecting people into explicit content.
Under the bill, owners of AI websites or applications that fail to remove the feature in Minnesota would face a $500,000 civil fine. The bill also makes it easier for people to sue.
Megan Hurley, a massage therapist, said she was depicted in an AI-generated fake that placed her in a hypersexualized situation. She said the perpetrator also depicted dozens of other women in photos and videos without their consent. Hurley said the experience was traumatizing.
“I cannot overstate the damage this technology has done. It broke me open to be violated this way,” Hurley said. “I missed two months of work, depleted my savings and went into debt to try and recover as well as identify and notify the other women who had been victimized.”
She said the videos could be circulating online forever.
“I don’t understand why this technology exists and I’m appalled there are companies out there making money in this manner,” Hurley said.
Sen. Warren Limmer, R-Maple Grove, said it’s an important issue to tackle, but he cautioned that the AI landscape is rapidly evolving and that it could be difficult to head off future problems.
“In a few short years, for producers, it may not be such a high technical challenge to create images like this. And we’ve seen that progression, that evolution of technology, where only commercial producers could create anything of a technical nature,” Limmer said. “And now everyone’s making videos and things on their own home computer.”
The committee postponed a vote to allow later consideration of an amendment that would direct money from the civil fines to survivors of deepfake offenses. A corresponding House bill has yet to be introduced.