Schools and lawmakers are grappling with how to address a new form of peer-on-peer image-based sexual abuse that disproportionately targets girls.

    • argl@feddit.org
      ↑4 · 9 hours ago

      Can’t afford this much cheese today to find just the right slice for every bikini photo…

  • vane@lemmy.world
    ↑14 · 20 hours ago

    Maybe let’s assume all digital images are fake and go back to painting. Wait… what if children start painting deepfakes?

    • aceshigh@lemmy.world
      ↑14 ↓2 · 23 hours ago

      To add to that: I live in a red area, and since the election I’ve been catcalled much more. And it’s weird too, because I’m middle aged… I thought I’d finally disappear…

  • SabinStargem@lemmy.today
    ↑8 ↓4 · 18 hours ago

    Deepfakes might end up being the modern version of the bikini. In the olden days, people wore these to the beach; wearing less was scandalous, a sign of moral decay. Yet now we wear much less.

    Our grandchildren might simply not give a damn about their nudity, because it is assumed that everyone is deepfaking everyone.

    • atomicorange@lemmy.world
      ↑11 ↓1 · 11 hours ago

      These are all worn voluntarily. This issue isn’t about the equivalent of scandalously clad young girls; it’s as if girls were being involuntarily stripped of their clothing by their classmates. It’s not about modesty standards, it’s about sexual abuse.

      • Gsus4@mander.xyz
        ↑1 ↓1 · 2 hours ago

        Unless it is passed off as a real video and circulated for denigration or blackmail, it is not at all like assault. Also, deepfakes do not have the distinguishing features hidden under your clothes, so it is possible to debunk them if you really have to.

      • SabinStargem@lemmy.today
        ↑2 · 5 hours ago

        It can be both. The cornerstone of why nudity can be abused is that society makes it shameful to be bare. If, some generations from now, people can just shrug and not care, that is one less tool an abuser can use against them.

        In any case, I am of the mind that people of my generation might be doing their own version of the Satanic Panic, or the reaction against rap music. For better or worse, older people cannot relate to the younger.

    • youmaynotknow@lemmy.ml
      ↑2 ↓18 · 1 day ago

      In my case, other kids would not have survived trying to pull off shit like this. So yeah, I’m also glad I’m not a kid anymore.

  • some_guy@lemmy.sdf.org
    ↑72 ↓2 · 2 days ago

    For example, Louisiana mandates a minimum five-year jail sentence no matter the age of the perpetrator.

    That’s just stupid on its face. A thirteen year old boy is absolutely gonna wanna see girls in his age group naked. That’s not pedophilia. It’s wanting to see the girls he fantasizes about at school every day. Source: I was a thirteen year old boy.

    It shouldn’t be treated the same as when an adult man generates it; there should be nuance. I’m not saying it’s ok for a thirteen year old to generate said content: I’m saying tailor the punishment to fit the reality of the differences in motivations. Leave it to Louisiana to once again use a cudgel rather than sense.

    I’m so glad I went through puberty at a time when this kind of shit wasn’t available. The thirteen year old version of me would absolutely have gotten himself into a lot of trouble. And depending on what state I was in, seventeen year old me could have ended up listed as a sex predator for sending dick pics to my gf, because I produced child pornography. God, some states have stupid laws.

    • AA5B@lemmy.world
      ↑16 · 1 day ago

      In general, even up here in woke-ville, punishments have gotten a lot more strict for kids. There’s a lot more involvement of police, courts, jail. As a parent it causes me a lot of anxiety - whatever happened to school being a “sandbox” where a kid can make mistakes without adult consequences, without ruining their lives? Did that ever exist?

      • BlackPenguins@lemmy.world
        ↑2 · edited · 3 hours ago

        I can already picture that as an Onion headline:

        New York Renames State to ‘WokeVille’. NYC to follow.

      • jwmgregory@lemmy.dbzer0.com
        ↑8 · 20 hours ago

        it existed if society liked you enough.

        fascists just have a habit of tightening that belt smaller and smaller, is what’s going on.

    • Lka1988@lemmy.dbzer0.com
      ↑20 ↓2 · edited · 2 days ago

      As a father of teenage girls, I don’t necessarily disagree with this assessment, but I would personally see to it that anyone making sexual deepfakes of my daughters is equitably and thoroughly punished.

      • seralth@lemmy.world
        ↑21 ↓1 · 1 day ago

        There is a difference between ruining the life of a 13 year old boy for the rest of his life with no recourse and no expectations, versus scaring the shit out of him and making him work his ass off doing an assload of community service for a summer.

        • Lka1988@lemmy.dbzer0.com
          ↑18 ↓7 · 1 day ago

          ruining the life of a 13 year old boy for the rest of his life with no recourse

          And what about the life of the girl this boy would have ruined?

          This is not “boys will be boys” shit. Girls have killed themselves over this kind of thing (I have personal experience with suicidal teenage girls, both as a past friend and as a father).

          I don’t think it’s unreasonable to expect an equivalent punishment that has the potential to ruin his life.

          • DancingBear@midwest.social
            ↑22 ↓6 · 1 day ago

            Fake pictures do not ruin your life… sorry…

            Our puritanical / 100% sex culture is the problem, not fake pictures…

          • Vinstaal0@feddit.nl
            ↑15 ↓1 · 1 day ago

            It is not abnormal to see different punishments for people under the age of 18. What helps is good education about sex and about what sexual assault does to its victims (the same goes for guns, drugs including alcohol, etc.).

            You can still course-correct the behaviour of a 13 year old. There is also a difference between generating the porn and abusing it by sharing it, etc.

            The girls should be helped and the boys should be punished, but mainly their behaviour needs to be corrected.

          • youmaynotknow@lemmy.ml
            ↑7 ↓6 · edited · 1 day ago

            Parents are responsible for their kids. The punishment, with the full force of the law (and maybe something extra for good measure), should fall upon the parents, since they should have made sure their kids knew how despicable and illegal doing this is.

            Yeah, I agree, we shouldn’t ruin the boy’s life; we should ruin his whole family’s, to many times the extent that something like this ruins a teen girl’s life.

            • some_guy@lemmy.sdf.org
              ↑12 ↓2 · 24 hours ago

              Yeah, I agree, we shouldn’t ruin the boy’s life; we should ruin his whole family’s, to many times the extent that something like this ruins a teen girl’s life.

              You’re a fucking asshole. This isn’t like prosecuting parents who let a school shooter have access to guns. The internet is everywhere. Parents are responsible for bringing up their children to be socially responsible. A thirteen year old kid is anything but responsible (I mean their mentality / maturity, I’m not giving them a pass).

              Go hang out with conservatives who want more policing. Over here, we’ll talk about social programs you fucking prick.

              • youmaynotknow@lemmy.ml
                ↑1 ↓4 · 19 hours ago

                I am an asshole, that’s never been in question, and I fully own it. Having said that, no amount of “social programs” is going to have any effect if fucking parents don’t raise their kids right.

                I’m entirely against surveillance, except when it comes to parents and keeping a close eye on everything their kids watch, browse or otherwise access (evidently making it known to the kids that “I can see EVERYTHING you see and do”).

                So, yeah, hang the imbecile parents that should not have had kids in the first place because a fucking social program or school would raise them instead. Fuck off.

                • Lka1988@lemmy.dbzer0.com
                  ↑2 · 18 hours ago

                  social program

                  And thanks to the assholes in Congress who just passed the Big Betrayal Bill, those are all going away.

            • Lka1988@lemmy.dbzer0.com
              ↑1 ↓2 · 18 hours ago

              Teenagers are old enough to understand consequences.

              In fact, my neighborhood nearly burned down last week because a teenager, despite being told “no” and “stop” multiple times - including by neighbors - decided to light off fireworks on the mountainside right behind the neighborhood.

              Red arrow is my house. We were damn lucky the wind was blowing the right direction. If this had happened the day before, the neighborhood would be gone.

              • jsomae@lemmy.ml
                ↑3 · 8 hours ago

                some day I hope to be brave enough to post pictures of my house on the internet

      • some_guy@lemmy.sdf.org
        ↑14 · 1 day ago

        Yes, absolutely. But with recognition that a thirteen year old kid isn’t a predator but a horny little kid. I’ll let others determine what that punishment is, but I don’t believe it’s prison. Community service maybe. Written apology. Stuff like that. Second offense, ok, we’re ratcheting up the punishment, but still not adult prison.

        • tomenzgg@midwest.social
          ↑4 · 21 hours ago

          In a properly functioning world, this could easily be coupled with particular education on power dynamics and a lesson on consent, giving proper attention to why this might be more harmful to her than to him.

          Of course, – so long as we’re in this hypothetical world – you’d just have that kind of education be a part of sex ed. or the like for all students, to begin with, but, as we’re in this world and that’s Louisiana…

        • Lka1988@lemmy.dbzer0.com
          ↑6 ↓4 · 1 day ago

          I did say equitable punishment. Equivalent. Whatever.

          A written apology is a cop-out for the damage this behaviour leaves behind.

          Something tells me you don’t have teenage daughters.

          • some_guy@lemmy.sdf.org
            ↑3 · 1 day ago

            No kids. That’s why I say others should write the punishments. A written apology wasn’t meant as the only punishment. It was in addition to community service and other stipulations.

    • Agent641@lemmy.world
      ↑14 ↓1 · 2 days ago

      Punishment for an adult man doing this: Prison

      Punishment for a 13 year old boy doing this: Publish his browsing and search history in the school newsletter.

  • Daftydux@lemmy.dbzer0.com
    ↑15 · 1 day ago

    Welp, if I had kids they would have one of those scramble suits like in A Scanner Darkly.

    It would of course be their choice to wear them, but I’d definitely look for ways to limit their time in areas with cameras present.

    • Entertainmeonly@lemmy.blahaj.zone
      ↑14 · 1 day ago

      That’s just called the outside now. Assume you are on camera at all times the moment you step out the front door. To be safe in the surveillance state we live in today, best act as though you are being recorded in your own home as well.

      • Vanilla_PuddinFudge@infosec.pub
        ↑3 · edited · 18 hours ago

        best act as though you are being recorded in your own home as well.

        If you don’t know, don’t try? Seems a bit defeatist.

        There’s also the matter of “you” the NPC and well… “You”.

        You can rest easy knowing Trump knows you’re at work, but not the contents of the monologue you gave on Palestine on a political XMPP chatroom.

      • Daftydux@lemmy.dbzer0.com
        ↑3 · edited · 23 hours ago

        You can make areas safe from cameras. No, you can’t make everywhere camera-free, but you can minimize your time in those areas. I’m not saying it’s a good system; it would just be adjusting to the times.

        If the floor was lava and all that…

  • dinckel@lemmy.world
    ↑144 ↓1 · 2 days ago

    Lawmakers are grappling with how to address …

    Just a reminder that the government is actively voting against regulations on AI, because obviously a lot of these people are pocketing lobbyist money

  • Walk_blesseD@piefed.blahaj.zone
    ↑26 ↓12 · 2 days ago

    Jfc the replies here are fucking rancid. Lemmy is full of sweaty middle aged blokes in tech who hate it when anyone tells them that grown men who pursue teenage girls who have just reached an arbitrary age are fucking creeps, so of course they’re here encouraging the next generation of misogynist scum by defending this shit, too.
    And men (pretend to) wonder why we distrust them.

    Ngl, I’m only leaving reply notifs on for this one to work on my blocklist.

  • wewbull@feddit.uk
    ↑72 ↓20 · 2 days ago

    Honestly I think we need to understand that this is no different to sticking a photo of someone’s head on a porn magazine photo. It’s not real. It’s just less janky.

    I would categorise it as sexual harassment, not abuse. Still serious, but a different level.

    • lath@lemmy.world
      ↑50 ↓6 · 2 days ago

      A school setting generally means underage individuals are involved, which makes any such content CSAM. So in effect, the “AI” companies are generating a ton of CSAM and nobody is doing anything about it.

      • LostXOR@fedia.io
        ↑20 · 2 days ago

        Do deepfake explicit images created from a non-explicit image actually qualify as CSAM?

        • Lka1988@lemmy.dbzer0.com
          ↑5 ↓1 · edited · 2 days ago

          I would consider that as qualifying. Because it’s targeted harassment in a sexually-explicit manner. All the girl would have to do is claim it’s her.

          Source: I’m a father of teenage daughters. I would pursue the individual(s) who started it and make them regret their choices.

        • lath@lemmy.world
          ↑10 ↓3 · 2 days ago

          I don’t know personally. The admins of the fediverse likely do, considering it’s something they’ve had to deal with from the start. So, they can likely answer much better than I might be able to.

        • surewhynotlem@lemmy.world
          ↑6 ↓8 · 2 days ago

          Drawing a sexy cartoon that looks like an adult, with a caption that says “I’m 12”, counts. So yeah, probably.

          • cole@lemdro.id
            ↑1 · 7 hours ago

            This actually is quite fuzzy and depends on your country and even jurisdiction in your country

      • wewbull@feddit.uk
        ↑23 ↓17 · 2 days ago

        Disagree. Not CSAM when no abuse has taken place.

        That’s my point.

        • Lka1988@lemmy.dbzer0.com
          ↑7 ↓1 · edited · 2 days ago

          Except, you know, the harassment and abuse of said deepfaked individual. Which is sexual in nature. Sexual harassment and abuse of a child using materials generated based on the child’s identity.

          Maybe we could have a name for it. Something like Child-based sexual harassment and abuse material… CSHAM, or maybe just CSAM, you know, to remember it more easily.

        • Zak@lemmy.world
          ↑21 ↓2 · 2 days ago

          I think generating and sharing sexually explicit images of a person without their consent is abuse.

          That’s distinct from generating an image that looks like CSAM without the involvement of any real child. While I find that disturbing, I’m morally uncomfortable criminalizing an act that has no victim.

        • atomicorange@lemmy.world
          ↑15 ↓7 · 2 days ago

          If someone put a camera in the girls’ locker room and distributed photos from that, would you consider it CSAM? No contact would have taken place so the kids would be unaware when they were photographed, is it still abuse?

          If so, how is the psychological effect of a convincing deepfake any different?

          • General_Effort@lemmy.world
            ↑9 · 2 days ago

            If someone puts a camera in a locker room, that means that someone entered a space where you would usually feel safe. It implies the potential of a physical threat.

            It also means that someone observed you when you were doing “secret” things. One may feel vulnerable in such situations. Even a seasoned nude model might be embarrassed to be seen while changing, maybe in a dishevelled state.

            I would think it is very different. Unless you’re only thinking about the psychological effect on the viewer.

          • BombOmOm@lemmy.world
            ↑20 ↓13 · edited · 2 days ago

            Taking secret nude pictures of someone is quite a bit different than…not taking nude pictures of them.

            It’s not CSAM to put a picture of someone’s face on an adult model and show it to your friend. It’s certainly sexual harassment, but it isn’t CSAM.

            • atomicorange@lemmy.world
              ↑6 ↓4 · 2 days ago

              How is it different for the victim? What if they can’t tell if it’s a deepfake or a real photo of them?

              • BombOmOm@lemmy.world
                ↑12 ↓9 · edited · 2 days ago

                It’s absolutely sexual harassment.

                But, to your question: you can’t just say something has underage nudity when the nudity is of an adult model. It’s not CSAM.

                • atomicorange@lemmy.world
                  ↑15 ↓6 · 2 days ago

                  Yes, it’s sexual abuse of a child, the same way taking surreptitious locker room photos would be. There’s nothing magical about a photograph of real skin vs a fake. The impact to the victim is the same. The impact to the viewer of the image is the same. Arguing over the semantic definition of “abuse” is getting people tangled up here. If we used the older term, “child porn” people wouldn’t be so hesitant to call this what it is.

        • lath@lemmy.world
          ↑9 ↓5 · 2 days ago

          There’s a thing that was happening in the past. Not sure it’s still happening, due to lack of news about it. It was something called “glamour modeling” I think or an extension of it.

          Basically, official/legal photography studios took pictures of child models in swimsuits and revealing clothing, at times in suggestive positions and sold them to interested parties.

          Nothing untoward directly happened to the children. They weren’t physically abused. They were treated as regular fashion models. And yet, it’s still csam. Why? Because of the intention behind making those pictures.

          The intention to exploit.

    • LadyAutumn@lemmy.blahaj.zone
      ↑36 ↓12 · edited · 2 days ago

      Yes, finding out that your peers have been sharing deep fake pornography of you is absolutely fine and a normal thing for young girls to go through in school. No girls have ever killed themselves because of this exact sort of thing, surely. This definitely will not add in any way to the way women and girls are made to feel entirely disgustingly dehumanized by every man or boy in their lives. Groups of men and boys reducing them and their bodies down to vivid sexual fantasies that they can quickly generate photo realistic images of.

      If the person in the image is underage then it should be classified as child pornography. If the woman whose photo is being used hasn’t consented to this then it should be classified as sexual exploitation.

      Women and girls have faced degrees of this kind of sexual exploitation by men and boys since the latter half of the 20th century. But this is a severe escalation in that behavior. It should be illegal to do this and it should be prosecuted when and where it is found to occur.

      • FishFace@lemmy.world
        ↑18 ↓3 · 2 days ago

        It’s bullying with a sexual element. The fact that it uses AI or deepfakes is secondary, just as it was secondary when it was Photoshop, just as it was secondary when it was cutting out photos. It’s always about using it to bully someone.

        This is different because it’s easier. It’s not really different because it (can be) more realistic, because it was never about being realistic, otherwise blatantly unrealistic images wouldn’t have been used to do it. Indeed, the fact that it can be realistic will help blunt the impact of the leaking of real nudes.

        • LadyAutumn@lemmy.blahaj.zone
          ↑13 ↓12 · 2 days ago

          It’s sexually objectifying the bodies of girls and turning them into shared sexual fantasies their male peers are engaging in. It is ABSOLUTELY different because it is more realistic. We are talking about entire deepfake pornography production and distribution groups IN THEIR OWN SCHOOLS. Teenage boys cutting pictures out and photoshopping them was nowhere near as common as this is fast becoming, and it was NOT the same as seeing a naked body algorithmically derived to appear as realistic as possible.

          Can you stop trying to find a silver lining in the sexual exploitation of teenage girls? You clearly don’t understand the kinds of long-term psychological harm that are caused by being exploited in this way. It was also exploitative and also fucked up when it was done in Photoshop; this is many orders of magnitude more sophisticated and accessible.

          You’re also wrong that this is about bullying. It’s an introduction to girls being tools for male sexual gratification. It’s LITERALLY commodifying teenage girls as sexual experiences and then sharing them in groups together. It’s criminal. The consent of the individual has been entirely erased. Dehumanization in its most direct form. It should be against the law and it should be prosecuted very seriously wherever it is found to occur.

          • rottingleaf@lemmy.world
            ↑16 ↓11 · 2 days ago

            Can you stop trying to find a silver lining in the sexual exploitation of teenage girls?

            Can you please use words by their meaning?

            Also I’ll have to be blunt, but - every human has their own sexuality, with their own level of “drive”, so to say, and their dreams.

            And it’s absolutely normal to dream of other people. Including sexually. Including those who don’t like you. Not only men do that, too. There are no thought crimes.

            So by talking about it being easier or harder, you are not making any argument at all.

            However. As I said elsewhere, the actions that really harm people should be classified legally and addressed. Like sharing such stuff. But not as making child pornography because it’s not, and not like sexual exploitation because it’s not.

            It’s just that your few posts I’ve seen in this thread seem to say that certain kinds of thought should be illegal, and that’s absolute bullshit. And laws shouldn’t be made based on such emotions.

            • youmaynotknow@lemmy.ml
              ↑3 ↓1 · edited · 19 hours ago

              “thought crime”? And you have the balls to talk about using words “by their meaning”?

              This is a concrete action with a product to show for it, not a thought, and it impacts someone’s life negatively without their consent, with potentially devastating consequences for the victim. So, can you please use words by their meaning?

              Edit: I jumped the gun when I read “thought crime”, effectively disregarding the context. As such, I’m scratching the parts of my comment that don’t apply, and leaving the ones that do apply (not necessarily to the post I was replying to, but to the whole thread).

              • rottingleaf@lemmy.world
                ↑5 · edited · 1 day ago

                The author of those comments wrote a few times what in their opinion happens in the heads of others and how that should be prevented or something.

                Can you please stop interpreting my words exactly the way you like? That’s not worth a gram of horse shit.

            • atomicorange@lemmy.world
              ↑14 ↓4 · 2 days ago

              I don’t know where you’re getting this “thought crime” stuff. They’re talking about boys distributing deepfake nudes of their classmates. They’re not talking about individuals fantasizing in the privacy of their own homes. You have to read all of the words in the sentences, my friend.

          • FishFace@lemmy.world
            ↑7 ↓9 · 2 days ago

            If a boy fantasises sexually about a girl, is that harmful to her? If he tells his friends about it? No, this is not harmful - these actions do not affect her in any way. What affects the girl is how the boys might then treat her differently than they would do someone they don’t find sexually attractive.

            The solution, in both cases, has to be to address the harmful behaviour. The only arguments for criminalising deepfakes themselves are also arguments for criminalising sexual fantasies. That is why people are talking about thought crime: once you criminalise things that are harmless on their own, but which might down the line lead to directly harmful behaviour, there is no other distinction.

            The consent of the individual has been entirely erased. Dehumanization in its most direct form.

            Both of these, for example, apply just as readily to discussing a shared sexual fantasy about someone who didn’t agree to it.

            No distinction, that is, other than this is new and icky. I don’t want government policy to be dictated by fear of the new and by what people find icky, though. I do lots of stuff people find icky.

            • LadyAutumn@lemmy.blahaj.zone
              ↑15 ↓4 · edited · 2 days ago

              No, an image that is shared and distributed is not the same as a fantasy in someone’s head. That is deranged. Should CSAM also be legal, because making it illegal is like criminalizing the fantasies of pedophiles? Absolutely insane logical framework you have there.

              This isn’t fantasy. It is content. It is media. It is material. It is produced without the consent of the girls and women being sexualized, and it commodifies their existence, literally transforming the idea of them into sexual media consumed for the gratification of boys and men.

              It is genuinely incredible to me that you could be so unempathetic, so impassive, so detached from the real world and the consequences of this, that you could even make this comparison. You seemingly have no idea what you’re talking about if you believe that pornography is the same thing as mental fantasies.

              And even in the case of mental fantasies, are those all good? Is it really a good thing that boys see the mere existence of the girls around them as some kind of inherent sexual availability?

              • FishFace@lemmy.world
                ↑9 ↓6 · 2 days ago

                When someone makes child porn they put a child in a sexual situation - which is something that we have amassed a pile of evidence is extremely harmful to the child.

                For all you have said - “without the consent” - “being sexualised” - “commodifies their existence” - you haven’t told us what the harm is. If you think those things are in and of themselves harmful then I need to know more about what you mean because:

                1. if someone thinks of me sexually without my consent I am not harmed
                2. if someone sexualises me in their mind I am not harmed
                3. I don’t know what the “commodification of one’s existence” can actually mean - I can’t buy or sell “the existence of women” (does buying something’s existence mean the same as buying the thing, or something else?) the same way I can aluminium, and I don’t see how being able to (easily) make (realistic) nude images of someone changes this in any way

                It is genuinely incredible to me that you could be so unempathetic,

                I am not unempathetic, but I place the blame for what makes me feel bad about the situation on the fact that girls are being made to feel bad and ashamed, not on a particular technology now being used in one step of that.

                • LadyAutumn@lemmy.blahaj.zone
                  ↑7 ↓2 · edited · 1 day ago

                  I am just genuinely speechless that you seemingly do not understand how sickening and invasive it is for your peers to create and share sexual content of you without your consent. Yes, it’s extremely harmful. It’s not a matter of feeling ashamed; it’s a matter of literally feeling like your value to the world is dictated by your role in the sexualities of heterosexual boys and men. It is feeling like your own body doesn’t belong to you but can be freely claimed by others. It is losing trust in all your male friends and peers, because it feels like, without you knowing, they’ve already decided that you’re a sexual experience for them.

                  We do know the harm of this kind of sexualization. Women and girls have been talking about it for generations. This isn’t new, just a new, streamlined way to spread it. It should be illegal. It should be against the law to turn someone’s images into AI-generated pornography. It should also be illegal to share those images with others.

                • atomicorange@lemmy.world
                  ↑5 ↓1 · 2 days ago

                  Are you OK with sexually explicit photos of children taken without their knowledge? They’re not being actively put in a sexual situation if you’re snapping photos with a hidden camera in a locker room, for example. You ok with that?

                  The harm is:

                  • Those photos now exist in the world and can lead to direct harm to the victim by their exposure
                  • it normalizes pedophilia and creates a culture of trading images, leading to more abuse to meet demand for more images
                  • The people sharing those photos learn to treat people like objects for their sexual gratification, ignoring their consent and agency. They are more likely to mistreat people they have learned to objectify.
                  • your body should not be used for the profit or gratification of others without your consent. In my mind this includes taking or using your picture without your consent.
      • rottingleaf@lemmy.world
        ↑12 ↓5 · 2 days ago

        This definitely will not add in any way to the way women and girls are made to feel entirely disgustingly dehumanized by every man or boy in their lives. Groups of men and boys reducing them and their bodies down to vivid sexual fantasies that they can quickly generate photo realistic images of.

        Sexual attraction doesn’t necessarily involve dehumanization. Unlike most other kinds of interest in a human being, it doesn’t require interest in their personality, but these are logically not the same.

        In general you are using emotional arguments for things that work not through emotion, but through literal interpretation. That’s like using metric calculations for a system that expects imperial. Utterly useless.

        If the person in the image is underage then it should be classified as child pornography.

        No, it’s not. It’s literally a photorealistic drawing based on a photo (and a dataset to make the generative model). No children have been abused to produce it. Laws work literally.

        If the woman whose photo is being used hasn’t consented to this then it should be classified as sexual exploitation.

        No, because the woman is not being literally sexually exploited. Her photo being used without consent is, I think, subject of some laws. There are no new fundamental legal entities involved.

        Women and girls have faced degrees of this kind of sexual exploitation by men and boys since the latter half of the 20th century. But this is a severe escalation in that behavior. It should be illegal to do this and it should be prosecuted when and where it is found to occur.

        I think I agree. But it’s neither child pornography nor sexual exploitation and can’t be equated to them.

        There are already existing laws for such actions, similar to using a photo of the victim and a pornographic photo, paper, scissors, pencils and glue. Or, if you think the situation is radically different, there should be new punishable crimes introduced.

        Otherwise it’s like punishing everyone caught driving while drunk for non-premeditated murder. One is not the other.

              • rottingleaf@lemmy.world
                ↑4 ↓2 · 1 day ago

                Suppose I’m a teenager attracted to people my age. Or suppose I’m medically a pedophile, which is not a crime, and then I would need that.

                In any case, for legal and moral purposes “why would you want” should be answered only with “not your concern, go eat shit and die”.

                • Lv_InSaNe_vL@lemmy.world
                  ↑1 ↓1 · 21 hours ago

                  I feel like you didn’t read my comment thoroughly enough. I said it can constitute CSAM. There is a surprising amount of leeway for teenagers, of course.

                  But no, I’m not gonna let you get away that easily. I want to know the why you think it’s morally okay for an adult to draw sexually explicit images of children. Please, tell me how that’s okay?

      • General_Effort@lemmy.world
        ↑12 ↓6 · 2 days ago

        Historically, the respectability of a woman depended on her sexuality. In many conservative cultures and communities, that is still true. Spreading the message that deepfakes are some particular horrible form of harassment reinforces that view.

        If having your head put on the body of a nude model is a terrible crime, then what does that say about the nude model? What does it say about women who simply happen to develop a larger bosom or lips? What does it say about sex before marriage?

        The implicit message here is simply harmful to girls and women.

        That doesn’t mean that we should tolerate harassment. But it needs to be understood that we can do no more to stop this kind of harassment than we can do to stop any other kind.

        • LadyAutumn@lemmy.blahaj.zone
          ↑15 ↓6 · 2 days ago

          This is just apologia for the sexual commodification and exploitation of girls and women. There is literally no girl being sexually liberated here; she has literally had the choice taken from her. Sexual liberation does NOT mean “boys and men can turn all women into personal masturbation aids”. This ENFORCES patriarchy and the subjugation of women. It literally teaches girls that their bodies do not belong to them, that it’s totally understandable for boys to strip them of humanity itself and turn them into sex dolls.

          • General_Effort@lemmy.world
            ↑7 ↓3 · 2 days ago

            The most deepfaked women are certainly actresses or musicians; attractive people that appear on screens and are known by much of the population.

            In some countries, they do not allow people to appear on-screen exactly because of that. Or at least, that’s one justification. If the honor or humanity of a woman depends on sexual feelings that she might or might not arouse in men, then women cannot be free. And men probably can’t be free either.

            At no point have I claimed that anyone is being liberated here. I do not know what will happen. I’m just pointing out how your message is harmful.

      • atomicorange@lemmy.world
        ↑9 ↓6 · 2 days ago

        Thank you. Focusing on the harm to the victims is the right way to understand this issue. Too many people in here are hunting for a semantic loophole.

    • lurch (he/him)@sh.itjust.works
      ↑9 · edited · 2 days ago

      I hope it might lead to a situation where dirty pics/vids are no longer a problem for the people in them, since they could be deepfakes. There were cases where a surfacing dirty pic was used for blackmail, ruined someone’s career or got them kicked out of some committee, but since it could be a fabrication now, I hope this will be a thing of the past soon.

      • wewbull@feddit.uk
        ↑11 ↓2 · 2 days ago

        That could be a socially healthy place to end up at. I don’t see it anytime soon though. Just look at the other response I got.

        • Hemingways_Shotgun@lemmy.ca
          ↑7 ↓1 · 2 days ago

          Sure. That might end up being a socially healthy place for adults.

          But it will never work that way for young teens. Their brains aren’t done baking yet. They don’t have the emotional maturity to understand that enough to be “okay with it because it’s just a fake”.

          That’s why we protect kids rather than just telling them “hey it’s okay…it’s only a fake.”

        • BombOmOm@lemmy.world
          ↑4 ↓2 · 2 days ago

          Anyone with half a brain will certainly claim as much. Even if people don’t fully believe it, it will blunt the most serious of social consequences.

    • Hemingways_Shotgun@lemmy.ca
      ↑8 ↓5 · 2 days ago

      I’m not even going to begin describing all the ways that what you just said is fucked up.

      I’ll just point out that online deepfake technology is FAR more accessible to the average 13 year old to use on their peers than “porno mags” were in our day.

      You want to compare taking your 13 year old classmate’s photo off of Facebook, running it through an AI and in five seconds creating photo-realistic adult content featuring them, with getting your dad’s skin-mag from under his mattress when he’s not home, cutting your classmate’s face out of a yearbook, taping it on, then sneaking THAT into the computer lab at school so that you can photocopy it and pass it around in home room, and then putting the skin-mag BACK under the mattress before your dad finds out?

      Is that right… is THAT what you’re trying to say? Are those the two things that you’re trying to say are equivalent?

      • SheeEttin@lemmy.zip
        ↑14 ↓1 · 2 days ago

        Yes, we all know it’s fucked up. The point is that we don’t need a new class of laws just because it’s harassment and bullying ✨with AI✨.

    • SharkAttak@kbin.melroy.org
      ↑3 ↓1 · 2 days ago

      Furthermore, we generally assume malicious intent, but I wouldn’t be surprised if teenagers were using the app to ‘get’ big boobs etc.; we’ve all seen those shopped pictures with the deformed background 😁

  • RememberTheApollo_@lemmy.world
    ↑27 · edited · 2 days ago

    I’m sure the laws will focus on protecting IP - specifically that of AI companies or megacorps, the famous and powerful, but not the small creators of content or the rabble negatively affected by AI abuse.

    The rest of us will have to suffer through presenting whatever damaging and humiliating video to a court, if we can even afford a lawyer to do so, and then be offered a judgement that probably won’t be paid or won’t cover the damage done by an image that can never be erased from the internet. Those damages could include the suicide of young people bullied and humiliated by such deepfakes.

  • FriendFatale@leminal.space
    ↑1 ↓5 · 18 hours ago

    anyone using any kind of AI either doesn’t know how consent works-- or they don’t care about it.

    a horrifying development in the intersection of technofascism and rape culture

      • AstaKask@lemmy.cafe
        ↑2 ↓1 · 10 hours ago

        AI models (unless you’re training your own) are usually trained on data they do not have a licence to use. The companies training these models are also notorious for ignoring robots.txt and other measures websites use to stop bots from trawling their data.

        Like in crypto, most people in AI are not nerds, just criminal scum.
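
        For reference, a minimal robots.txt of the kind mentioned above might look like the sketch below. The user agents shown (OpenAI’s GPTBot and Common Crawl’s CCBot) are published crawler names; everything else is just an illustrative example, not any particular site’s file.

          # robots.txt: ask AI-training crawlers not to collect this site
          User-agent: GPTBot
          Disallow: /

          User-agent: CCBot
          Disallow: /

        Compliance is entirely voluntary, which is the point being made above: a crawler that chooses to ignore the file faces no technical barrier.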

          • AstaKask@lemmy.cafe
            ↑1 · 6 hours ago

            I am. And so is OC. Neural networks are a different beast, although neither is actual AI. Just a marketing term at this point.

  • electric_nan@lemmy.ml
    ↑17 ↓1 · 2 days ago

    My mama always told me, that if someone makes a deepfake of you, then you make a deepfake of them right back!

  • danciestlobster@lemmy.zip
    ↑11 ↓2 · 2 days ago

    I don’t fully understand how this technology works, but if people are using it to create sexual content of underage individuals, doesn’t that mean the LLM would need to have been trained on sexual content of underage individuals? Seems like going after the company and whatever its source material is would be the obvious choice here.

    • wewbull@feddit.uk
      ↑1 · 34 minutes ago

      You know how when you look at a picture of someone and you cover up the clothed bits, they look naked. Your brain fills in the gaps with what it knows of general human anatomy.

      It’s like that.

    • kayzeekayzee@lemmy.blahaj.zone
      ↑10 · 1 day ago

      I agree with the other comments, but wanted to add how deepfakes work to show how simple they are, and how much less information they need than LLMs.

      Step 1: Basically you take a bunch of photos and videos of a specific person, and blur their faces out.

      Step 2: This is the hardest step, but still totally feasible for a decent home computer. You train a neural network to un-blur all the faces for that person. Now you have a neural net that’s really good at turning blurry faces into that particular person’s face.

      Step 3: Blur the faces in photos/videos of other people and apply your special neural network. It will turn all the blurry faces into the only face it knows how, often with shockingly realistic results.

      • gkpy@feddit.org
        ↑4 · edited · 1 day ago

        Cheers for the explanation, had no idea that’s how it works.

        So it’s even worse than @danciestlobster@lemmy.zip thinks: the person creating the deepfake would have to have access to CP if they wanted to deepfake it!

        • some_guy@lemmy.sdf.org
          ↑7 · 24 hours ago

          There are adults with bodies that resemble underage people that could be used to train models. Kitty Yung has a body that would qualify. You don’t necessarily need to use illegal material to train to get illegal output.

        • swelter_spark@reddthat.com
          ↑4 · 21 hours ago

          AI can generate images of things that don’t even exist. If it knows what porn looks like and what a child looks like, it can combine those concepts.

        • Vinstaal0@feddit.nl
          ↑2 · 1 day ago

          You can probably do it with adult material and replace the faces. It will most likely work with models specifically trained on the person you selected.

          People have also put dots on people’s clothing to trick the brain into thinking they are naked; you could probably fill those dots in with the correct body parts if you have a good enough model.

    • lime!@feddit.nu
      ↑9 · edited · 2 days ago

      not necessarily. image generation models work on a more fine-grained scale than that. they can seamlessly combine related concepts, like “photograph”+“person”+“small”+“pose” and generate plausible material due to the fact that all of those concepts have features in common.

      you can also use small add-on models trained on very little data (tens to hundreds of images, as compared to millions to billions for a full model) to “steer” the output of a model towards a particular style.

      you can make even a fully legal model output illegal data.

      all that being said, the base dataset that most of the stable diffusion family of models started out with in 2021 is medical in nature so there could very well be bad shit in there. it’s like 12 billion images so it’s hard to check, and even back with stable diffusion 1.0 there was less than a single bit of data in the final model per image in the data.

    • General_Effort@lemmy.world
      ↑4 ↓1 · 2 days ago

      This is mostly about swapping faces. You take a video and a photo of someone’s face. Software can replace the face of someone in the video with that face. That’s been around for a decade or so. There are other ways of doing it.

      When the face belongs to an underage individual, and the video is pornographic…

      LLMs only do text.