THE SENATE UNANIMOUSLY passed a bipartisan bill to provide recourse to victims of porn deepfakes — sexually explicit, non-consensual images created with artificial intelligence.

The legislation, called the Disrupt Explicit Forged Images and Non-Consensual Edits (DEFIANCE) Act, passed Congress’ upper chamber on Tuesday. It is led by Sens. Dick Durbin (D-Ill.) and Lindsey Graham (R-S.C.) in the Senate and by Rep. Alexandria Ocasio-Cortez (D-N.Y.) in the House.

The legislation would amend the Violence Against Women Act (VAWA) to allow victims to sue those who produce, distribute, or receive deepfake pornography if they “knew or recklessly disregarded” the fact that the victim did not consent to the images.

  • ArbitraryValue@sh.itjust.works · 3 months ago

    distribute, or receive the deepfake pornography

    Does this make deepfake pornography more restricted than real sexual images either created or publicly released without consent?

    • drislands@lemmy.world · 3 months ago

      I think so. In a way, it makes sense – a lot of people are of the (shitty) opinion that if you take lewd pictures of yourself, it’s your fault if they’re disseminated. With lewd deepfakes, there’s less opportunity for victim blaming, and therefore wider support.

      • ArbitraryValue@sh.itjust.works · 3 months ago (edited)

        Maybe, but my understanding is that it’s legal to possess photos of someone actually being the victim of a sex crime, not just photos of consensual acts.