San Francisco cracks down on websites that create AI deepfakes of nude images of women and girls

Nearly a year after AI-generated nude images of high school girls upended a community in southern Spain, a juvenile court this summer sentenced 15 of their classmates to a year of probation.

But the artificial intelligence tool used to create the malicious deepfakes remains easily accessible on the internet, promising to “undress any photo” uploaded to the site within seconds.

Now an effort to shut down these and similar apps is underway in California, where San Francisco this week filed a first-of-its-kind lawsuit that experts say could set a precedent but will also face many hurdles.

“The proliferation of these images has exploited a horrifying number of women and girls around the world,” said David Chiu, San Francisco’s elected city attorney, who filed suit against a group of highly trafficked websites based in Estonia, Serbia, Britain and elsewhere.

“These images are used to harass, humiliate and threaten women and girls,” he said in an interview with The Associated Press. “And the impact on victims is devastating: damage to their reputations and mental health, the loss of their autonomy, and in some cases these images have even triggered suicidal thoughts.”

The lawsuit, filed on behalf of the people of California, alleges that the services violated numerous state laws covering deceptive business practices, non-consensual pornography and child sexual abuse. But it can be difficult to find out who runs the apps, which aren’t available in phone app stores but are still easily found online.

One service contacted by the AP late last year claimed via email that its “CEO is based in the U.S. and travels all over the U.S.,” but declined to provide any evidence or answer other questions. The AP is not naming the specific apps being sued to avoid promoting them.

“There are a number of websites where we don’t know exactly who those operators are and where they’re operating from right now, but we have investigative tools and the authority to issue subpoenas to investigate,” Chiu said. “And we will certainly exercise our authority as this litigation progresses.”

Many of these tools are used to create realistic fakes that “nudify” photos of clothed adult women, including celebrities, without their consent. But they have also appeared in schools around the world, from Australia to Beverly Hills, California. Typically, boys create images of female classmates that are then widely shared on social media.

The case in the Spanish town of Almendralejo last September was among the first to attract widespread attention. A doctor whose daughter was among a group of girls victimized last year, and who helped bring the incidents to public attention, said she is satisfied with the severity of the sentence their classmates face following the court ruling earlier this summer.

But it is “not only the responsibility of society, the education system, parents and schools, but also the responsibility of the digital giants who profit from all this garbage,” said Dr. Miriam al Adib Mendiri in an interview on Friday.

She welcomed San Francisco’s action but said more efforts are needed, including from bigger companies such as California-based Meta Platforms and its subsidiary WhatsApp, which were used to spread the images in Spain.

While schools and law enforcement agencies try to punish those who create and share deepfakes, authorities are struggling to deal with the tools themselves.

In January, the European Union’s executive arm said in a letter to a Spanish member of the European Parliament that the app used in Almendralejo “does not appear” to fall under the bloc’s new online safety rules.

Organizations monitoring the rise of AI-generated child sexual abuse material will be closely following the San Francisco case.

The lawsuit “has the potential to set a precedent in this area,” said Emily Slifer, policy director at Thorn, an organization that campaigns against the sexual exploitation of children.

A Stanford University researcher said it will be difficult to bring the defendants to justice because so many of them are based outside the United States.

Chiu “faces an uphill battle in this case, but may be able to get some of the websites taken down if the defendants who operate them ignore the lawsuit,” said Riana Pfefferkorn of Stanford.

She said that could happen if the city wins by default judgment in the defendants’ absence and obtains injunctions affecting domain name registrars, web hosts and payment processors “that would effectively shut down those sites even if their owners never appear in the litigation.”
