• Madison420@lemmy.world · 6 months ago

    Yeah, that’s toothless. They decided there’s no definitive way to put an age on a cartoon character; it could be from another planet and simply look younger while actually being older.

    It’s bunk. Let them draw or generate whatever they want; totally fictional events and people are fair game, and quite honestly I’d rather they stay occupied doing that than get active actually abusing children.

    Outlaw shibari and I guarantee you’d have multiple serial killers BTK-ing some unlucky souls.

    • Mike@lemmy.ml · 6 months ago

      I think the challenge with generative AI CSAM is the question of where the training data originated. There has to be some questionable data in there.

      • erwan@lemmy.ml · 6 months ago

        There’s also the issue of determining whether a given image is real or AI-generated. If AI images were legal, prosecutors would have to prove that an image is real and not AI, with the risk of letting real offenders go.

        The case for banning AI CSAM is even clearer than the one for cartoon CSAM.

        • Madison420@lemmy.world · 6 months ago (edited)

          And in the process you’d force non-abusers to seek their thrill through actual abuse. Good job; I’m sure the next generation of children will appreciate your prudish, factually inept effort. We’ve tried this with so much shit: prohibition doesn’t stop anything, it just creates a black market and an abusive power system to go with it.

    • ZILtoid1991@lemmy.world · 6 months ago

      My main issue with generation is the ability to make it close enough to reality. Even in the more realistic art scene, some artists outright referenced or even traced CSAM. The other issue is the lack of any easy way to differentiate reality from fiction, which muddies the water. “I swear officer, I thought it was AI” would become the new “I swear officer, she said she was 18.”

      • Madison420@lemmy.world · 6 months ago

        That’s not an end-user issue, that’s a dev issue. You can’t train on CSAM if it isn’t available, so doing it is a tacit admission of actual possession.