The Digital Strip-Search: Is Your Family Legally Naked?

Watching the harvest: Minnesota's new laws target the technology that turns our private lives into synthetic data.

by Gull Sunderland

7 May 2026

One lawmaker stands alone as Minnesota moves to ban AI apps that undress victims — leaving New Zealand's "synthetic" loophole in the spotlight.

The headline out of the United States sounds like progress: on 28 April 2026, the Minnesota House passed House File 1606 (HF1606) by a near-unanimous 132–1 margin. If the Governor signs it, Minnesota will become the first state to flat-out ban the "access, download, or use" of AI nudification technology — the so-called "nudify" apps that can strip the clothes off any victim with a single click.


But while Minnesota builds a digital firewall, a deeper look at the legal terrain closer to home reveals a disturbing reality. In New Zealand, your digital likeness — and that of your children — is currently in a state of legal undress that should shock every parent in The Naked Truth community.


The Lone Outlier’s Warning

The Legislative Chamber in the Minnesota State Capitol, where the vote took place.

The Minnesota vote wasn't without friction. The sole "no" vote, Representative Drew Roach, sparked a debate worth paying attention to. He didn't defend the tech — he called it "vile" — but he argued the law targets the tools rather than the perpetrators.


For an investigative team, that’s the smoking gun. If the tool itself is the problem, then the manufacturer is the arms dealer. Minnesota’s bill carries a massive $500,000 civil penalty for companies that facilitate these digital assaults. It’s an aggressive play to bankrupt the "nudify" industry before it becomes too big to fail.


The Predator’s Toolkit

The debate in Minnesota wasn't just about privacy; it was about protection. Representative Jessica Hanson, the bill's author, didn't mince words when she addressed the House:


"This technology has empowered and enabled paedophiles and sexual predators around the globe," Hanson warned. She noted that these tools are being weaponised not just by professional criminals, but against women in positions of trust and, most chillingly, against children victimised by "cruel peers."


For parents, this is the nightmare scenario: a child’s innocent social media photo being fed into an app by a schoolyard bully to create a "nude" image that looks indistinguishable from reality. In Minnesota, the law is finally catching up to that cruelty.


The NZ Loophole: "Synthetic" Isn’t "Intimate" (Yet)

Now, let’s look at the gap in our own backyard. If someone in a New Zealand school or workplace uses one of these apps on you or your teenager today, the Harmful Digital Communications Act (HDCA) 2015 might leave you stranded.


The current law defines an "intimate visual recording" as a recording of a real person in an intimate context — think "revenge porn" or a hidden camera in a changing room. But deepfakes are synthetic. They are pixels and math, not light hitting a sensor. Because the image doesn't feature the victim's actual physical body, our current courts struggle to fit it into the existing definition of "intimate."


Victim advocates have called this gap "woefully behind." Right now, to get a prosecution, you often have to prove the perpetrator had a specific "intent to cause harm" — a notoriously high bar if the defence claims it was "just a prank."


McClure holds up the A3 print of her AI-generated naked deepfake image.

The Race to Close the Gap

A rescue mission is underway in Parliament as we speak. ACT MP Laura McClure's Deepfake Digital Harm and Exploitation Bill is designed specifically to kill this loophole. It would expand the definition of "intimate visual recording" to include images that are "created, synthesised, or altered" to appear intimate.


McClure famously demonstrated the urgency of this by holding up a realistic, AI-generated nude deepfake of herself, created in mere minutes. "What started as a problem for celebrities can now happen to anyone," she warned.


Our Verdict

As a community that values body positivity and consent, we need to be the first to call out this digital hypocrisy. We celebrate the human form in its natural state by choice and within agreed social boundaries. These apps are the antithesis of everything we stand for — they are tools of non-consensual digital exposure that strip away agency.


Minnesota's 132–1 vote is a signal that the "Wild West" era of AI is ending. But until McClure's bill passes our own House, your family's digital privacy in New Zealand remains a work in progress. You might choose to be naked on the beach, but the law shouldn't leave your children exposed on the internet.

