Alexandra Reeve Givens, President & CEO, Center for Democracy & Technology

Lawmakers introduce bill targeting nonconsensual deepfake imagery


On June 18, 2024, Senators Ted Cruz and Amy Klobuchar introduced the Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks (TAKE IT DOWN) Act. This legislative proposal aims to support victims of nonconsensual distribution of intimate images (NDII), whether these images are real or generated through artificial intelligence.

The issue of NDII is not new. Organizations like the Cyber Civil Rights Initiative have long advocated for policies to combat image-based abuse. The SHIELD Act, initially introduced by then-Senator Kamala Harris in 2019, sought to criminalize such nonconsensual distribution. The emergence of generative AI technologies has heightened concerns regarding NDII.

Earlier this year, the Senate passed the DEFIANCE Act, which extends the federal civil cause of action for NDII to cover synthetic depictions. More recently, the Senate also passed the SHIELD Act, which criminalizes the nonconsensual distribution of intimate visual depictions, with penalties of up to three years' imprisonment for depictions involving children.

Senators Cruz and Klobuchar introduced the TAKE IT DOWN Act amid growing victimization through AI-generated NDII. By one estimate, "upwards of 90% of AI generated or manipulated videos online are sexually explicit," underscoring a pressing issue that affects both adults and minors. CDT research likewise indicates that many high school students are aware of AI-generated NDII incidents.

The TAKE IT DOWN Act builds on existing legislation by adding penalties for "deepfakes" that realistically depict individuals. It also proposes a notice-and-takedown system, similar to that of the Digital Millennium Copyright Act (DMCA) and enforced by the FTC, that allows victims to request removal of their images from platforms.

Currently, victims can use DMCA mechanisms if they own copyright over an image. However, this does not apply when images are AI-generated or lack clear ownership. The TAKE IT DOWN Act seeks to address this gap by enabling takedown requests regardless of copyright status.

Despite its intentions, critics argue that the bill has flaws that could threaten free expression and privacy. Unlike the DEFIANCE Act, which covers "digital forgeries," the TAKE IT DOWN Act is limited to realistic AI depictions, a narrower standard that could miss other harmful instances.

The bill also excludes certain self-curated sites from its provisions and raises questions about its interaction with Section 230, the law shielding online platforms from liability for user-generated content. In addition, it mandates rapid removal without court adjudication, a process that could conflict with First Amendment protections.

Privacy concerns arise as well: the act could inadvertently require providers to monitor encrypted services where users expect privacy. The legislation's scope needs clarification to avoid imposing obligations that would undermine encryption.

In conclusion, while the TAKE IT DOWN Act aims to empower victims of NDII, it requires revisions to be effective and constitutionally sound while protecting privacy rights.
