The Senate passed a bill that would give individuals who've discovered their likeness deepfaked into sexually explicit images without their consent a new way to fight back.
The Disrupt Explicit Forged Images and Non-Consensual Edits Act (DEFIANCE Act) would let victims sue the people who created the images for civil damages. The bill passed with unanimous consent, meaning there was no roll-call vote and no senator objected to its passage on the floor Tuesday. It's meant to build on the work of the Take It Down Act, a law that criminalizes the distribution of nonconsensual intimate images (NCII) and requires social media platforms to promptly remove them.
Senate Democratic Whip Dick Durbin (D-IL), a lead sponsor of the bill, referenced Grok's nonconsensual undressing in remarks on the Senate floor. "Even after these horrible deepfake, harming images are known to Grok and to X, formerly Twitter, they don't respond. They don't take the images off of the internet. They don't come to the rescue of people who are victims," Durbin said. Though the Take It Down Act, whose takedown provision goes into full force later this year, may have implications for X, the DEFIANCE Act would affect individuals, like those Grok users creating deepfaked nonconsensual intimate imagery.
Governments around the world are creating new protections against AI-generated nonconsensual images, spurred in part by the recent Grok controversy. The UK, for example, recently pushed up a law that criminalizes the creation of nonconsensual intimate deepfakes.
The DEFIANCE Act similarly passed the Senate in 2024 following a different nonconsensual deepfake scandal on X. Early that year, sexually explicit AI-generated images of Taylor Swift circulated on the platform. Durbin, along with Sens. Lindsey Graham (R-SC), Amy Klobuchar (D-MN), and Josh Hawley (R-MO), introduced the bill to expand on a provision in the Violence Against Women Act Reauthorization Act of 2022, which gave people whose non-AI-generated intimate images were shared without consent a right to sue. Rep. Alexandria Ocasio-Cortez (D-NY), who has found her own image digitally altered in nonconsensual intimate deepfakes, sponsored the bill in the House. The bill stalled in the House without a vote during the last Congress, requiring the Senate to take it up again this year. Now the ball is again in House leadership's court; if they decide to bring the bill to the floor, it must pass in order to reach the president's desk.