Schools and lawmakers are grappling with how to address a new form of peer-on-peer image-based sexual abuse that disproportionately targets girls.

  • lath@lemmy.world · 2 months ago

    “Schools” generally means the people involved are underage, which makes any such content CSAM. So in effect, the “AI” companies are generating a ton of CSAM and nobody is doing anything about it.

      • Lka1988@lemmy.dbzer0.com · 1 month ago

        Except, you know, for the harassment and abuse of the deepfaked individual, which is sexual in nature: sexual harassment and abuse of a child, using material generated from that child’s identity.

        Maybe we could have a name for it. Something like Child-based sexual harassment and abuse material… CSHAM, or maybe just CSAM, you know, to remember it more easily.

      • lath@lemmy.world · 2 months ago

        There’s a thing that used to happen; I’m not sure it still does, given the lack of news about it. It was called “glamour modeling”, I think, or an extension of it.

        Basically, official/legal photography studios took pictures of child models in swimsuits and revealing clothing, at times in suggestive poses, and sold them to interested parties.

        Nothing untoward happened directly to the children. They weren’t physically abused; they were treated like regular fashion models. And yet it’s still CSAM. Why? Because of the intention behind making those pictures.

        The intention to exploit.