San Francisco City Attorney sues websites creating AI-generated deepfake pornography
The San Francisco City Attorney's office is suing companies that create "deepfake nudes," images in which artificial intelligence is used to turn photos of adults and children into pornography.
On Thursday morning, City Attorney David Chiu announced a first-of-its-kind lawsuit against 16 of the most visited websites creating AI-generated nonconsensual explicit images, often of women and girls.
The websites let users upload clothed images of real people to create realistic-looking nude images, usually for a fee. While some of the sites allow uploads of images of adults only, Chiu said others allow users to create nonconsensual pornographic images of children.
According to the city attorney's office, the websites targeted in the lawsuit were visited more than 200 million times in the first six months of this year.
"We all need to do our part to crack down on bad actors using AI to exploit and abuse real people, including children," Chiu said at a press conference late Thursday morning.
The lawsuit comes amid a troubling trend in schools, where students have used the technology to superimpose their classmates' faces onto photos of nude bodies. One such incident involved students at a middle school in Southern California earlier this year.
Celebrities have also been victimized by AI-generated explicit images, including Taylor Swift.
Chiu's lawsuit alleges violations of state and federal laws prohibiting deepfake pornography, revenge pornography, and child pornography, as well as violations of the state's Unfair Competition Law. It seeks civil penalties and a court order to take down the sites.
Anyone who may have been the victim of nonconsensual deepfake pornography is urged to contact the San Francisco City Attorney's office through the agency's consumer complaint web portal or by calling 415-554-3977.