
Victims of deepfake videos are reclaiming control of their images by using copyright law, a recent investigation reveals.

An analysis conducted by WIRED highlighted numerous women, including streamers, gamers, and popular content creators, filing copyright complaints with Google to have nonconsensual digitally altered videos removed from various websites.

The investigation identified over 13,000 copyright claims, involving nearly 30,000 URLs, against multiple sites indexed on Google.


Individuals affected by nonconsensual deepfake content are leveraging the Digital Millennium Copyright Act (DMCA), commonly used to combat the unauthorized sharing of music, videos, and other media online. The DMCA has previously been instrumental in cases involving image-based sexual abuse or “revenge porn,” on the grounds that victims own the personal images being shared.

Because deepfake creators alter or fabricate the original images, victims face a significant challenge in proving ownership of the resulting content in order to assert their intellectual property rights.

Google has taken steps to address the issue of revenge porn and deepfakes by introducing new policies and procedures for reporting, including mechanisms to eliminate explicit personal images from search results and systems to identify original and replicated deepfake content. Google’s efforts have resulted in the removal of approximately 82% of reported URLs. WIRED noted, “For the largest deepfake video website, Google has received takedown requests for 12,600 URLs, and 88% have been removed.”

The substantial number of violations has raised concerns among online safety advocates and copyright experts about why such websites are allowed to persist despite multiple takedown requests. Dan Purcell, the founder and CEO of Ceartas, a privacy protection firm, questioned the rationale behind not completely removing websites with multiple infringement notices. He stated in an interview with WIRED, “If you remove 12,000 links for infringement, why are they not just completely removed? They should not be crawled. They’re of no public interest.”


While awaiting government legislation to criminalize the dissemination of “sexualized digital forgeries,” victims have found a legal recourse through the copyright strategy.

The proposed DEFIANCE (Disrupt Explicit Forged Images and Non-Consensual Edits) Act not only aims to criminalize the distribution of deepfake content but also provides victims with the ability to sue creators who use their likeness without permission.

Congresswoman Alexandria Ocasio-Cortez emphasized the urgency for federal legislation to hold perpetrators of nonconsensual deepfake content accountable, particularly as the prevalence of such content continues to rise.

The growing concern over deepfake technologies has led to a coalition of AI experts, researchers, artists, and politicians advocating for comprehensive legislation to address the spread of harmful deepfakes, including child pornography. The coalition’s efforts call for criminal penalties for individuals involved in creating or disseminating damaging deepfakes and imposing obligations on software developers and distributors.

The coalition highlighted the inadequacy of current laws in combating deepfakes and the alarming surge in the use of this technology. They noted, “Unprecedented AI progress is making deepfake creation fast, cheap, and easy, with deepfake numbers growing by 550% from 2019 to 2023.”

Although high-profile individuals like Taylor Swift and actor Jenna Ortega have been targets of explicit deepfake content, the issue extends to ordinary individuals, especially impacting young children and teenagers. The proliferation of deepfake images into the social sphere has prompted child safety experts to advocate for preventive measures and parental vigilance.

Instances where minors misuse deepfake technology to create and distribute inappropriate content have raised concerns about legislative gaps in addressing such actions.

Noting the traumatic repercussions of nonconsensual deepfake content, Omny Miranda Martone, founder of the Sexual Violence Prevention Association (SVPA), expressed support for the DEFIANCE Act, outlining the various risks faced by victims, including stalking, employment repercussions, and emotional trauma.

If you have been a victim of unauthorized image sharing, contact the Cyber Civil Rights Initiative’s 24/7 hotline at 844-878-2274 for confidential support. The CCRI website provides additional helpful information and lists international resources.


Chase DiBenedetto
Social Good Reporter

Chase joined Mashable’s Social Good team in 2020, covering online stories about digital activism, climate justice, accessibility, and media representation. Her work also touches on how these conversations manifest in politics, popular culture, and fandom. Sometimes she’s very funny.
