

Hm. I wasn’t expecting the pro-child porn argument. All I can say is that’s absolutely legally and morally CSAM, and you’re fuckin nasty. Oof. Not really gonna bother with the rest because, well, yikes.
Yeah there’s some nasty shit here. Big yikes, Lemmy.
Are you OK with sexually explicit photos of children taken without their knowledge? They’re not being actively put in a sexual situation if you’re snapping photos with a hidden camera in a locker room, for example. You ok with that?
The harm is:
I don’t know where you’re getting this “thought crime” stuff. They’re talking about boys distributing deepfake nudes of their classmates. They’re not talking about individuals fantasizing in the privacy of their own homes. You have to read all of the words in the sentences, my friend.
Yes, it’s sexual abuse of a child, the same way taking surreptitious locker room photos would be. There’s nothing magical about a photograph of real skin vs a fake. The impact to the victim is the same. The impact to the viewer of the image is the same. Arguing over the semantic definition of “abuse” is getting people tangled up here. If we used the older term, “child porn,” people wouldn’t be so hesitant to call this what it is.
How is it different for the victim? What if they can’t tell if it’s a deepfake or a real photo of them?
They may be little sociopaths, but they don’t run around murdering each other. Our culture hammers it into their brains that murder is wrong and they will be severely punished for it. We need to build a culture where little boys are afraid to distribute naked photos of their classmates. Where their friends will turn them in for fear of repercussions. You do that by treating it like a crime, not by saying “boys will be boys” and ignoring the issue.
Treat it like a crime, and address separately the issue of children being tried as adults and facing lifelong consequences. The reforms needed to our juvenile justice system go beyond this particular crime.
Thank you. Focusing on the harm to the victims is the right way to understand this issue. Too many people in here hunting for a semantic loophole.
If someone put a camera in the girls’ locker room and distributed photos from that, would you consider it CSAM? No contact would have taken place so the kids would be unaware when they were photographed, is it still abuse?
If so, how is the psychological effect of a convincing deepfake any different?
He looks like Zuckerberg fucked Ellen DeGeneres.
Wealth redistribution I can get behind, even if it doesn’t stop him.
Yeah, I’m happy for AI to take this particular horrifying job from us. Chances are it will be overtuned (too strict), but if there’s a reasonable appeals process I could see it saving a lot of people the trauma of having to regularly view the worst humanity has to offer without major drawbacks.
Hmm, space is a little different because so many products are one-offs. It’s hard to design checklists and detailed procedures when you’re making what are essentially prototypes each time. So you make more general processes and then your engineers apply them as needed to each unique build. It can end up looking like a bit of a mess. Space builds rely a lot on expert techs, good modular documentation, and multiple layers of engineering oversight because things change along the way and you can’t always plan for it.
I’m a process engineer at a different aerospace company. I standardize as much as I can and work hard to make instructions clear but man it’s a struggle. Boeing’s space group needs to pay people enough to retain good talent, because they’re all making decisions all day long.
These are all worn voluntarily. This issue isn’t the equivalent of scandalously clad young girls; it’s as if girls were being involuntarily stripped of their clothing by their classmates. It’s not about modesty standards, it’s about sexual abuse.