Crime & Safety

San Francisco City Attorney Sues Websites Creating AI-Generated Deepfake Pornography

The city attorney's office says the websites targeted in the lawsuit have been visited more than 200 million times in the last six months.

(CBS Bay Area)

August 16, 2024

The San Francisco City Attorney's office is suing companies that create "deepfake nudes," where artificial intelligence is used to turn photos of adults and children into pornography.


On Thursday morning, City Attorney David Chiu announced a first-of-its-kind lawsuit against 16 of the most visited websites creating AI-generated nonconsensual explicit images, often of women and girls.

The websites let users upload clothed images of real people to generate realistic-looking nude images, usually for a fee. While some of the websites only allow users to upload images of adults, Chiu said other sites allow users to create nonconsensual pornographic images of children.


Click here for the full story via CBS San Francisco

