THE SENATE UNANIMOUSLY passed a bipartisan bill to provide recourse to victims of porn deepfakes, meaning sexually explicit, nonconsensual images created with artificial intelligence.

The legislation, called the Disrupt Explicit Forged Images and Non-Consensual Edits (DEFIANCE) Act, passed Congress's upper chamber on Tuesday. It has been led by Sens. Dick Durbin (D-Ill.) and Lindsey Graham (R-S.C.), along with Rep. Alexandria Ocasio-Cortez (D-N.Y.) in the House.

The legislation would amend the Violence Against Women Act (VAWA) to allow people to sue those who produce, distribute, or receive deepfake pornography if they "knew or recklessly disregarded" the fact that the victim did not consent to those images.

    • Todd Bonzalez@lemm.ee · 5 months ago

      Okay, and we already have laws against nonconsensual pornography. But here's the thing: those laws could easily be argued to apply only to actual photographs or videos. Many of them were written to tackle "revenge porn," not "fake porn," so someone making nonconsensual porn with AI could get charges dropped on the grounds that the porn wasn't genuine, or didn't meet the existing legal definition of pornography depicting the victim (because those laws were written before AI made this a mainstream problem).

      Making laws that explicitly cover things like Photoshop and AI face-swap or "roop" tools that can create artificial nonconsensual pornography lays the groundwork for consequences for people who sexually abuse others in this way.

      And let's be realistic: if you use AI to make nonconsensual porn for personal purposes, you may be a creep, but you also aren't going to get caught if you keep it to yourself. You also aren't going to harm the person you're making porn of if you don't distribute the content you're creating. You still shouldn't do it, but it's practically legal if you simply keep your weird shit to yourself and don't publicly victimize someone with it. The people who will get caught are the ones who do distribute it, and I have no problem seeing the book thrown at them.

      • postmateDumbass@lemmy.world · 5 months ago

        Just as you point out that revenge porn laws have a problem because they miss the mark on fake porn, outlawing the methods and mediums of art used today is also going to miss the mark. Aside from the oppression issues, new methods will come along and circumvent the targeted laws.

        We just need to write laws that target the crime itself. Here it would seem to be a combination of defamation, privacy, and assault at play: using sexuality or nudity to harm an individual and/or their reputation. Without harm, what crime is there?