Minnesota's legislature has advanced a bill banning artificial intelligence tools that generate non-consensual intimate imagery. The legislation grants victims the right to sue creators of deepfake nude generation software. The bill now heads to Governor Tim Walz for signature.
The measure targets a growing problem. Bad actors use AI image generators to create fake sexually explicit content of real people without consent. Victims face harassment, blackmail, and reputational damage. The bill establishes legal recourse by allowing civil suits against developers and distributors of these tools.
This represents state-level action on AI regulation before federal frameworks solidify. Minnesota joins other jurisdictions implementing explicit bans on non-consensual deepfake pornography. The bill does not appear to target legitimate AI image generation broadly, but rather specific tools designed for sexual abuse.
The move reflects broader pressure on tech companies and AI developers to implement safeguards. Platforms hosting generated content face growing scrutiny from regulators and lawmakers. Minnesota's approach creates liability for tool creators themselves, a stronger enforcement mechanism than content moderation alone.
The bill's passage signals momentum for anti-deepfake legislation across U.S. states before Congress acts. The governor's signature remains the final hurdle, though no veto signals have emerged.
