THE SENATE UNANIMOUSLY passed a bipartisan bill to provide recourse to victims of porn deepfakes, that is, sexually explicit, non-consensual images created with artificial intelligence.

The legislation, called the Disrupt Explicit Forged Images and Non-Consensual Edits (DEFIANCE) Act, passed in Congress's upper chamber on Tuesday. The bill has been led by Sens. Dick Durbin (D-Ill.) and Lindsey Graham (R-S.C.), along with Rep. Alexandria Ocasio-Cortez (D-N.Y.) in the House.

The legislation would amend the Violence Against Women Act (VAWA) to allow people to sue those who produce, distribute, or receive deepfake pornography if they "knew or recklessly disregarded" the fact that the victim did not consent to the images.

  • randon31415@lemmy.world · 5 months ago

    There are billions of people. Find the right one, and a “reasonable person” could not tell the difference.

    Imagine a law that said you cannot give your baby a name that starts with the same letter as someone else's name. After 26 names, all future names would be illegal. The law would essentially make naming babies illegal.

    The "alphabet" in this case is the distinct visual depiction of all people. As long as the set of visually distinct people a "reasonable person" can enumerate is small enough, the law essentially makes any generation illegal. Conversely, if the "reasonable person" standard draws distinctions finely enough, avoiding prosecution becomes trivial: just add minor conflicting details.

    • Flying Squid@lemmy.world · 5 months ago

      "The right one" according to whom? There are two sides to a court case. The opposition can find all kinds of ways to show that person is not reasonable because they can't recognize a very good simulation of someone's face, just as they can show that someone who is shortsighted didn't see the car crash the way they said they did.