A 13-year-old student was expelled from a Louisiana middle school after hitting a male classmate who she said created and shared a deepfake pornographic image of her, according to her family’s lawyers.

  • Mouselemming@sh.itjust.works · 6 hours ago

    If it was a realistic-looking image of a nude 13-year-old, isn’t it child porn?

    Sounds like the school and the sheriff’s office only started to investigate once the family got lawyers.

    • Perspectivist@feddit.uk · 4 hours ago

      Technically speaking there is no such thing as child porn - it’s abuse material, i.e. evidence of a crime. However, when the content is AI-generated no crime has actually taken place, so it would be categorized as simulated abuse material.

      Child porn as a term shouldn’t really be used at all. It downplays what said content actually is. It’s similar to calling female genital mutilation a “female circumcision”.

      • ObjectivityIncarnate@lemmy.world · 3 hours ago

        Child porn as a term shouldn’t really be used at all.

        This is, linguistically, an unwinnable fight, imo. People understand what “porn” is(/is meant to be), and ‘child’ is just a descriptor. People are never naturally going to start saying “abuse material” instead of “porn” in instances like these.

        We can’t even get people to consistently say STI instead of STD after all this time. You’ve got to pick your battles, lol.

      • AxExRx@lemmy.world · 3 hours ago

        US federal law explicitly uses and defines the term “child pornography” in 18 U.S.C. §§ 2251–2256.

        From justice.gov:

        Section 2256 of Title 18, United States Code, defines child pornography as any visual depiction of sexually explicit conduct involving a minor (someone under 18 years of age). Visual depictions include photographs, videos, digital or computer generated images indistinguishable from an actual minor, and images created, adapted, or modified, but appear to depict an identifiable, actual minor.

        So no on all counts. Child pornography is an actual legal term, and AI generation does not get around it if it depicts a real, existent person.

        • astreus@lemmy.ml · 4 hours ago

          This is a false comparison. Circumcision has actual medical uses (e.g. phimosis, cancer, balanitis). FGM does not.

            • astreus@lemmy.ml · 3 hours ago

              …not everyone gets circumcised at birth. I got circumcised in my 30s due to phimosis. No one undergoes FGM at any age for any medical reason. Conflating the two is deeply unhelpful, both for the stigma around medical circumcision and for protecting people from the brutality of FGM. Not every country is America.