by scarmig on 9/30/23, 4:33 PM
> For years, deepfakes – highly convincing fake videos made using AI – have been used to put women’s faces into often aggressive pornographic videos, without their consent. The videos often appear so real it can be hard for female victims to deny it is really them.
Isn't the better approach to make deepfake technology so ubiquitous that everyone can plausibly deny that it's really them?
by Jigsy on 9/30/23, 5:11 PM
I was thinking about this a while back: people claim a model has to be trained on CSAM/CSEM in order to generate these kinds of images, which I don't buy, because that would make every single model illegal to possess by default.
However, if we take something like bestiality as an example, the AI is going to know what a naked woman looks like and what a dog looks like without ever having been trained on bestiality.
But if you combine those two concepts in a sexual context, it suddenly becomes illegal (at least in certain countries) because of... flimsy reasons.
by iammjm on 9/30/23, 5:33 PM
Why is this illegal? The case I am making:
1. There are people who are broken and attracted to kids
2. These people have urges that they want to fulfill
3. It is clearly better if they fulfill said urges without any kids involved
4. So give them AI-generated kiddie stuff to fulfill their urges
5. Having seen AI-produced bee-pug hybrids, I don't think the training data needs actual kiddie stuff
6. Seems like a simple, suffering-free solution for a nasty problem
7. I don't see any valid alternatives except for sterilization, which is not ideal either
What am I missing?
by HayBale on 9/30/23, 5:29 PM
It's a crazily disturbing and uncomfortable topic, and it kinda reminds me of the interview with a company that creates realistic dolls that can be "hurt", bleed, etc. That changed my opinion about this topic. Do I wish that this stuff did not exist? Yeah, but if somebody finds it a safe outlet for his disturbing urges, without inflicting any harm or trauma on others? I would say that it's worth it.
And I am not buying the argument that this could enable people to move on to the "real" stuff; we had the same arguments in the case of violent video games, movies, and music.
But Jesus, life and people are sometimes disturbing.
by Tade0 on 9/30/23, 5:38 PM
As a parent I would like to know which approach demonstrably reduces the harm done to children.
Is there any reliable data on that?
by laurentlassalle on 9/30/23, 4:30 PM
> The images were not distributed
How did they find him?
by xboxnolifes on 9/30/23, 9:02 PM
Isn't all porn banned in South Korea? Is the child aspect actually relevant to the case here, or is it just socially relevant?
by ushtaritk421 on 9/30/23, 5:26 PM
I don’t think I have a strong opinion on whether computer-generated CSAM increases or decreases actual child abuse (I could see it going either way, or even different ways depending on the time scale).
That aside, sometimes we have laws that exist not just for their direct good or bad effects but because they exclude someone we think should be excluded. And if you’re picking people to exclude, people who constitutionally want to rape your kids don’t seem like the worst choice.
by pengaru on 9/30/23, 7:45 PM
Is it illegal to draw child porn in South Korea?
What I'm curious about is how this will play out in Japan, where there's already a thriving industry of animated cheese pizza.
by Jigsy on 9/30/23, 4:20 PM
Had to truncate the title slightly because it was too long.
by throwaway4836 on 9/30/23, 4:45 PM
I'm open to being convinced to change my mind here because I'm obviously in the minority: I don't understand why this is illegal. Yes, I agree it's incredibly creepy and weird and uncomfortable, but there is no victim here. I'm cool with non-AI CSAM being illegal because it implies the existence of a real victim who was severely harmed and then continuously exploited. If it's all AI-generated, though, then who's getting hurt? I don't think it should be illegal just because it's weird.