THE SENATE UNANIMOUSLY passed a bipartisan bill to provide recourse to victims of porn deepfakes, that is, sexually explicit, non-consensual images created with artificial intelligence.

The legislation, called the Disrupt Explicit Forged Images and Non-Consensual Edits (DEFIANCE) Act, passed in Congress's upper chamber on Tuesday. It has been led by Sens. Dick Durbin (D-Ill.) and Lindsey Graham (R-S.C.) in the Senate, and by Rep. Alexandria Ocasio-Cortez (D-N.Y.) in the House.

The legislation would amend the Violence Against Women Act (VAWA) to allow people to sue those who produce, distribute, or receive deepfake pornography if they “knew or recklessly disregarded” the fact that the victim did not consent to those images.

  • homura1650@lemm.ee · 5 months ago

    The bill: https://www.congress.gov/118/bills/s3696/BILLS-118s3696es.xml

    As always, I read the bill expecting to be deeply disappointed, but was pleasantly surprised with this one. It’s not going to solve the issue, but I don’t really know of anything they can do to solve it. My guess is this will mostly be effective at going after large-scale abuses (such as websites dedicated to deepfake porn, or general-purpose deepfake sites with no safeguards in place).

    My first impressions on specific parts of the bill:

    1. The bill is written as an amendment to the 2022 appropriations act. This isn’t that strange, but I haven’t actually cross-referenced that, so I might be misunderstanding some subtlety.

    2. The definition of digital forgery is broad in terms of the means. Basically anything done on a computer counts, not just AI. In contrast, it is narrow in the result, requiring that:

    > when viewed as a whole by a reasonable person, is indistinguishable from an authentic visual depiction of the individual.

    There is a lot of objectionable material that is not covered by this. Personally, I would like to see a broader test, but can’t think of one that I would be comfortable with.

    3. The depiction also needs to be relevant to interstate or foreign commerce. Their hands are tied by the Constitution on this one. Unless Wickard v. Filburn is overturned, though, me producing a deepfake for personal use reduces my interstate porn consumption, so it qualifies. By explicitly incorporating the constitutional test, the law will survive any change to what qualifies as interstate commerce.

    4. The mens rea required is a “person who knows or recklessly disregards that the identifiable individual has not consented to such disclosure”. No complaints on this standard.

    5. This is grounds for civil suits only; nothing criminal. Makes sense, as criminal law would normally be a state issue and, as mentioned earlier, this seems mostly targeted at large-scale operations, which can be deterred with enough civil litigation.

    6. Max damages are (see the sketch at the end of this comment):

      • $150k
      • Unless it can be linked to an actual or attempted sexual assault, stalking, or harassment, in which case it increases to $250k
      • Or you can sue for actual damages (including any profits made as a result of the deepfake)
    7. Plaintiffs can use a pseudonym, and all personally identifiable information is to be redacted or filed under seal. Intimate images turned over in discovery remain in the custody of the court.

    8. 10-year statute of limitations, starting when the plaintiff could reasonably have learned about the images, or when they turn 18.

    9. States remain free to create their own laws that are “at least as protective of the rights of a victim”.

    My guess is the “at least as protective” portion is there because a state suit would prevent a federal suit under this law, as there is an explicit bar on duplicative recovery, but I have not dug into the referenced law to see what that covers.
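
    Purely to make the remedy structure concrete, here’s a toy Python sketch (not legal advice) of how the damages tiers and the limitations window read to me. All function and variable names are hypothetical, and the bill’s actual accrual and election rules are more detailed than this:

    ```python
    from datetime import date

    # Toy sketch, not the bill's text: liquidated-damages tiers and the
    # 10-year limitations window, as summarized above. Names are made up.

    BASE_CAP = 150_000        # liquidated damages cap
    AGGRAVATED_CAP = 250_000  # linked to actual/attempted assault, stalking, or harassment

    def liquidated_cap(aggravated: bool) -> int:
        """Ceiling on liquidated damages for a single claim."""
        return AGGRAVATED_CAP if aggravated else BASE_CAP

    def best_recovery(actual_damages: int, defendant_profits: int, aggravated: bool) -> int:
        """A plaintiff elects liquidated OR actual damages (the latter
        including any profits made from the forgery); a rational plaintiff
        picks whichever is larger."""
        return max(liquidated_cap(aggravated), actual_damages + defendant_profits)

    def filing_deadline(discovered: date, turned_18: date) -> date:
        """10-year window running from the later of discovery or turning 18
        (ignoring leap-day edge cases)."""
        start = max(discovered, turned_18)
        return start.replace(year=start.year + 10)
    ```

    So, for example, best_recovery(40_000, 0, aggravated=True) would be the $250k liquidated cap, while a plaintiff facing a site that profited heavily would elect actual damages instead.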