Schools and lawmakers are grappling with how to address a new form of peer-on-peer image-based sexual abuse that disproportionately targets girls.

    • Lka1988@lemmy.dbzer0.com · 25 days ago

      Except, you know, the harassment and abuse of said deepfaked individual. Which is sexual in nature. Sexual harassment and abuse of a child using materials generated based on the child’s identity.

      Maybe we could have a name for it. Something like Child-based sexual harassment and abuse material… CSHAM, or maybe just CSAM, you know, to remember it more easily.

    • lath@lemmy.world · 25 days ago

      There’s a thing that was happening in the past. I’m not sure it still goes on, given the lack of news about it. It was called “glamour modeling,” I think, or an extension of it.

      Basically, official/legal photography studios took pictures of child models in swimsuits and revealing clothing, at times in suggestive positions, and sold them to interested parties.

      Nothing untoward happened directly to the children. They weren’t physically abused; they were treated like regular fashion models. And yet it’s still CSAM. Why? Because of the intention behind making those pictures.

      The intention to exploit.