THE SENATE UNANIMOUSLY passed a bipartisan bill to provide recourse to victims of porn deepfakes, that is, sexually explicit, non-consensual images created with artificial intelligence.

The legislation, called the Disrupt Explicit Forged Images and Non-Consensual Edits (DEFIANCE) Act, passed in Congress' upper chamber on Tuesday. The bill is led by Sens. Dick Durbin (D-Ill.) and Lindsey Graham (R-S.C.), along with Rep. Alexandria Ocasio-Cortez (D-N.Y.) in the House.

The legislation would amend the Violence Against Women Act (VAWA) to allow people to sue those who produce, distribute, or receive deepfake pornography if they “knew or recklessly disregarded” the fact that the victim did not consent to those images.

  • Cosmos7349@lemmy.world · 5 months ago

    I haven’t dug deeply into the actual legislation, so someone correct me if I’m misinformed… but my understanding is that this isn’t necessarily trying to raise the bar for using the technology so much as trying to set clearer legal guidelines for victims to have legal recourse. If we were to relate it to other weapons, it’s like creating the law “it’s illegal to shoot someone with a gun.”

    • j4k3@lemmy.world · 5 months ago

      I have not dug deeply either, but I have noticed that Civitai has shifted its wording and hosting in ways that indicated a change was coming. In practice, the changes will come from the model-hosting sites for open-source tools limiting their liability by not hosting content related to real humans.

      My main concern is the stupid public reacting to some right-wing fake and lacking appropriate skepticism, like expecting detection tools to be magical rather than understanding the full spectrum of possibilities.