from Hacker News

Google and Microsoft Are Supercharging AI Deepfake Porn

by imichael on 8/24/23, 8:20 PM with 13 comments

  • by zb3 on 8/24/23, 10:16 PM

    Another bullshit article. Thanks to articles like this, companies waste resources on "AI safety", which in practice means the model refuses basic requests because "oh, it might offend someone and some journalist might smear our company".

    Meanwhile, open-source models are already there and are here to stay. So instead of trying to uphold the unsustainable status quo that images are trustworthy, it's time to realize they aren't.

  • by 8f2ab37a-ed6c on 8/24/23, 9:25 PM

    If deepfake porn becomes trivial to generate, doesn't that by definition make all revenge and extortion porn worthless because now anybody can make porn of anybody else?

    It seems like a natural continuation of being able to Photoshop people's faces onto strangers in compromising situations. Is this not just a better version of that?

    Seems like fighting it would only make fake porn more valuable and dangerous.

  • by chrisjj on 8/24/23, 8:35 PM

    > “deepfakes” — videos made with artificial intelligence that fabricate a lifelike simulation of a sexual act featuring the face of a real woman.

    What?? AI doesn't work on men?? ;)

  • by ChrisArchitect on 8/24/23, 8:24 PM

    Related recent article:

    Inside the AI porn marketplace https://news.ycombinator.com/item?id=37248949

  • by deterministic on 8/26/23, 3:20 AM

    I f***ing hate it when tech companies try to play the role of moral authorities, just because some journalist tries to score easy clicks by talking about porn.