FILE – David Chiu, the San Francisco city attorney, at City Hall on Oct. 30, 2023. Chiu has filed a lawsuit seeking to permanently shutter 16 popular websites that turn images of real girls and women into pornography. (Gabriela Hasbun/The New York Times)

SAN FRANCISCO — Like many parents, Yvonne Meré was deeply disturbed when she read about a frightening new trend.

Boys were using “nudification” apps to turn photos of their female classmates into deepfake pornography, taking the girls’ faces from photos in which they were fully clothed and superimposing them onto images of naked bodies generated by artificial intelligence.

But unlike many parents who worry about the threats posed to their children in a world of ever-changing technology, Meré, the mother of a 16-year-old girl, had the power to do something about it. As the chief deputy city attorney in San Francisco, Meré rallied her co-workers to craft a lawsuit, filed in state court Wednesday night, that seeks to shut down the 16 most popular websites used to create these deepfakes.

The legal team said it appeared to be the first government lawsuit of its kind aimed at quashing the sites that promote the opportunity to digitally “undress” women and girls without their consent.

After reading a New York Times article about the tremendous damage done when such deepfake images are created and shared, Meré texted Sara Eisenberg, the mother of a 9-year-old girl and the head of the unit in the city attorney’s office that identifies major social problems and tries to solve them through legal action. The two of them then reached out to the office’s top lawyer, City Attorney David Chiu.

“The article is flying around our office, and we were like, ‘What can we do about this?’” Chiu recalled in an interview. “No one has tried to hold these companies accountable.”

Several states have enacted measures criminalizing AI-generated sexually explicit depictions of minors, but Chiu said that requires going after the people creating and distributing the images, one by one. The new lawsuit out of San Francisco asks a judge to order the sites used to create the content to shut down altogether.

Chiu acknowledged that this strategy could be viewed as a Whac-a-Mole approach, since more sites could crop up. But the suit proposes to add more sites as the office learns about them.

The 16 sites targeted in the lawsuit were visited a combined 200 million times in the first six months of this year, he said. The entities behind the sites include individuals and companies in California, New Mexico, the United Kingdom and Estonia. Representatives of the websites either could not be reached or did not respond to requests for comment.

One site promotes its services by asking, “Have someone to undress?” Another reads, “Imagine wasting time taking her out on dates,” when users can, it says, use the website “to get her nudes.” Some of the websites allow users to create images for free before charging for more images — usually using cryptocurrency, but sometimes credit cards.

The sites’ AI models have been trained using real pornography and images depicting child abuse to create the deepfakes, Chiu said. In mere seconds, the sites can make authentic-looking images of breasts and genitalia under real faces.

The technology has been used to create deepfake nudes of everyone from Taylor Swift to ordinary middle-school girls with few apparent repercussions. The images are sometimes used to extort victims for money or humiliate and harass them. Experts have warned that they can harm the victims’ mental health, reputations and physical safety, and damage their college and job prospects.

Yet it is not a problem that can be tackled simply by talking with teenagers about using technology responsibly, since any photo of them, including prom and sports photos, can be snatched and manipulated without their consent.

“You can be as internet-savvy and social media-savvy as you want, and you can teach your kids all the ways to protect themselves online, but none of that can protect them from somebody using these sites to do really awful, harmful things,” Eisenberg said.

Once the images are circulating, it is nearly impossible to determine which website created them, making it very difficult for the women to successfully sue the companies, Chiu said.

Instead, the lawsuit seeks to shutter the sites, permanently restrain those operating them from creating deepfake pornography in the future, and assess civil penalties and attorneys’ fees. The suit argues that the sites violate state and federal revenge-pornography laws; state and federal child-pornography laws; and the California Unfair Competition Law, which prohibits unlawful and unfair business practices.

San Francisco is a fitting venue, the lawyers argued, as it is ground zero for the growing artificial intelligence industry. Already, people in the city can order driverless vehicles from their phones to whisk them around town, and the industry’s leaders, including OpenAI and Anthropic, are based there.

Chiu says he thinks the industry has largely had a positive effect on society, but the issue of deepfake pornography has highlighted one of its “dark sides.”

Keeping pace with the rapidly changing industry as a government lawyer is daunting, he said. “But that doesn’t mean we shouldn’t try.”

This article originally appeared in The New York Times.
