Schools and lawmakers are grappling with how to address a new form of peer-on-peer image-based sexual abuse that disproportionately targets girls.
I don’t fully understand how this technology works, but if people are using it to create sexual content of underage individuals, doesn’t that mean the LLM would need to have been trained on sexual content of underage individuals? Seems like going after the company and whatever its source material is would be the obvious choice here.
I agree with the other comments, but wanted to add how deepfakes work to show how simple they are, and how much less information they need than LLMs.
Step 1: Basically you take a bunch of photos and videos of a specific person, and blur their faces out.
Step 2: This is the hardest step, but still totally feasible for a decent home computer. You train a neural network to un-blur all the faces for that person. Now you have a neural net that’s really good at turning blurry faces into that particular person’s face.
Step 3: Blur the faces in photos/videos of other people and apply your special neural network. It will turn all the blurry faces into the only face it knows, often with shockingly realistic results.
My mama always told me, that if someone makes a deepfake of you, then you make a deepfake of them right back!
this advice might get you locked up
My mama also told me that if someone locks you up, then you just lock them up right back.
Honestly I think we need to understand that this is no different to sticking a photo of someone’s head on a porn magazine photo. It’s not real. It’s just less janky.
I would categorise it as sexual harassment, not abuse. Still serious, but on a different level.
“Schools” generally means underage individuals are involved, which makes any content using them CSAM. So in effect, the “AI” companies are generating a ton of CSAM and nobody is doing anything about it.
Disagree. Not CSAM when no abuse has taken place.
That’s my point.
There’s a thing that was happening in the past; I’m not sure it’s still happening, since there hasn’t been news about it lately. It was something called “glamour modeling,” I think, or an extension of it.
Basically, official/legal photography studios took pictures of child models in swimsuits and revealing clothing, at times in suggestive positions, and sold them to interested parties.
Nothing untoward directly happened to the children. They weren’t physically abused. They were treated as regular fashion models. And yet, it’s still csam. Why? Because of the intention behind making those pictures.
The intention to exploit.
Except, you know, the harassment and abuse of said deepfaked individual. Which is sexual in nature. Sexual harassment and abuse of a child using materials generated based on the child’s identity.
Maybe we could have a name for it. Something like Child-based sexual harassment and abuse material… CSHAM, or maybe just CSAM, you know, to remember it more easily.
For example, Louisiana mandates a minimum five-year jail sentence no matter the age of the perpetrator.
That’s just stupid on its face. A thirteen year old boy is absolutely gonna wanna see girls in his age group naked. That’s not pedophilia. It’s wanting to see the girls he fantasizes about at school every day. Source: I was a thirteen year old boy.
It shouldn’t be treated the same as when an adult man generates it; there should be nuance. I’m not saying it’s ok for a thirteen year old to generate said content: I’m saying tailor the punishment to fit the reality of the differences in motivations. Leave it to Louisiana to once again use a cudgel rather than sense.
I’m so glad I went through puberty at a time when this kind of shit wasn’t available. The thirteen year old version of me would absolutely have gotten himself into a lot of trouble. And depending on what state I was in, seventeen year old me could have ended up listed as a sex predator for sending dick pics to my gf, because I’d have produced child pornography. God, some states have stupid laws.
As a father of teenage girls, I don’t necessarily disagree with this assessment, but I would personally see to it that anyone making sexual deepfakes of my daughters is equitably and thoroughly punished.
There is a difference between ruining the life of a 13 year old boy for the rest of his life, with no recourse and no expectations, versus scaring the shit out of him and making him work his ass off doing an ass-load of community service for a summer.
ruining the life of a 13 year old boy for the rest of his life with no recourse
And what about the life of the girl this boy would have ruined?
This is not “boys will be boys” shit. Girls have killed themselves over this kind of thing (I have personal experience with suicidal teenage girls, both as a past friend and as a father).
I don’t think it’s unreasonable to expect an equivalent punishment that has the potential to ruin his life.
God I’m glad I’m not a kid now. I never would have survived.
In my case, other kids would not have survived trying to pull off shit like this. So yeah, I’m also glad I’m not a kid anymore.