from Hacker News

MP Maria Miller wants AI 'nudifying' tool banned

by unklefolk on 8/4/21, 11:11 AM with 70 comments

  • by JulianMorrison on 8/4/21, 11:54 AM

    More precisely, she wants the distribution of nudes without consent, real or faked, to be considered an offense. This isn't about the tool, but what's being done with it.
  • by Nextgrid on 8/4/21, 12:02 PM

    I think outlawing this tool would be counter-productive.

    One of the reasons a leaked nude is damaging is that it's a rare and noteworthy event. If, because of this tool, everyone had nudes of themselves floating around, it would become a normal thing, and it would actually remove most of the damage from real nudes leaking by providing plausible deniability (assuming anyone even cares at that point - if the world is drowning in nudes of everyone, the real thing will probably go unnoticed anyway).

    Outlawing the tool wouldn't actually stop malicious use of it, but because only criminals would then use it, its (rarer) use would be more damaging than if anyone could legally use such a tool and nudes stopped being a noteworthy event.

  • by isamuel on 8/4/21, 12:08 PM

    In America, the federal child pornography law applies only to depictions of an actual child (and, for possession offenses, you have to know it, though that’s another matter). But the Justice Department has long taken the position that an image of a clothed child that’s altered to make the child look nude (they used to call these “morphed” images) counts. I don’t think it’s ever been definitively resolved by the Supreme Court, and I don’t know what the courts of appeals have said, but tools like DeepSukebe have made that argument way more appealing. I’d bet that this is where regulation will begin: images of children. That has always been a domain where American courts have been extremely reluctant to intervene; for example, any visual depiction of a seventeen-year-old engaged in sex is proscribable without resort to the ordinary inquiry into whether the work as a whole is “obscene,” etc.

    But under prevailing American First Amendment law, it gets a lot harder to explain why a law like the one being proposed here would be acceptable. The Supreme Court has, for example, held that the distribution of animal-cruelty videos cannot be forbidden. And it’s not clear to me how one could proscribe the distribution of an imaginary visual depiction of an adult who was nude. You could call it defamatory, I suppose, but if it’s concededly fictional… I don’t know.

  • by basisword on 8/4/21, 11:45 AM

    It will be interesting to see how this kind of thing plays out. I’m sure it’s quite distressing if a tool like this is used on your photo and the result is then potentially shared in your friendship group. Hopefully we very quickly get to the point where nobody can tell whether a photo is real or fake, and therefore it’s just not considered an issue. Policing it seems like it would be extremely difficult. Maybe we police the intent? In other words, you can produce the images, but if you use them maliciously against a person, then there is a crime.
  • by prepend on 8/4/21, 11:57 AM

    Looking forward to seeing how this resolves.

    I remember when deepfakes were first released, there was a group who would deepfake coworkers, Facebook friends, etc. for a really low cost (like $100), as long as the target had a few hundred public photos.

    This is without consent as well, but it’s also not real. It seems like the equivalent of imagining people nude. Kind of creepy if I know it’s happening but not truly a violation of my privacy.

  • by Joakal on 8/4/21, 12:15 PM

    Wow, talk about technology making the hijab and similar clothing obsolete, which will be a massive culture shock.
  • by sorokod on 8/4/21, 12:09 PM

    There is a continuum there from harmless to deeply offensive. The exact location on that continuum will, at the very least, depend on the person being subjected to this treatment and on the cultural context.

    The "AI" aspect will amplify the offense because of how life-like the end result can be.

  • by knipster on 8/4/21, 12:23 PM

    So... is the Streisand effect here an accident that makes this capability more prevalent and disturbing?

    OR

    Is it intentional, to make this so common that it no longer draws attention?

  • by Tycho on 8/4/21, 12:04 PM

    Just wait till the AI visual paternity tests get here.
  • by dannyw on 8/4/21, 12:10 PM

    How effective could it be short of an international convention?

    The reality is, unless most countries ban it, it's gonna be on the internet.

  • by villgax on 8/4/21, 12:07 PM

    Wow, next up Photoshop filters/brushes too?