Minnesota Close to Outright Ban on AI Nudification Apps

Minnesota lawmakers are reportedly close to outright banning AI nudification apps that create explicit images of people, mostly women, without consent.

According to a report by Fox 9, there is bipartisan support in the Minnesota Legislature to ban the undressing apps, and a bill is headed for a vote in the House soon.

The bill’s lead author, Democratic Senator Erin Maye Quade, calls it a “slam dunk” issue. “The consequences are so devastating that we’re saying that this technology should not be accessible to any Minnesotan at all,” she tells Fox 9.

A ban would be landmark legislation, and it comes after dozens of Minnesota women have fallen victim to the seedy apps. Molly Kelley tells Fox 9 that it’s like a “shadow version of yourself out there, doing things that never happened.”

Jessica Guistolise says she felt physically unwell after discovering someone had nudified her photos, and it took weeks for her to feel okay again. Fox 9 notes that scientific studies have shown these incidents can affect the brain in the same way as if someone had been physically harmed.

“This is widely available to happen to anyone. Anyone is a potential victim, and then it does follow you for the rest of your life,” says Guistolise. “It was so much worse than my imagination had come up with.”

The Minnesota ban has been in the works for at least a year, and it has been amended to protect professional photo editors who use apps like Photoshop.

“We want to make sure we’re adjusting, obviously, for technical skill that can be added and respect art and the First Amendment,” Democratic House Representative Jessica Hanson tells Fox 9.

There’s a wave of anger toward AI nudification apps, including Elon Musk’s Grok, which was used to create undressed images of hundreds of people at the beginning of the year. The scandal has led to lawsuits and threats of regulatory action from the U.K. and elsewhere.

This week, Business Insider reported that TikTok runs ads for apps that undress people. The investigation discovered more than 50 “sexually aggressive” ads for nudification apps.

“We have removed content and banned accounts that breach our strict rules against sexual activity, including material created using third-party apps,” a TikTok spokesperson says.

Image credits: Header photo licensed via Depositphotos.