Google has had a longstanding ban on sexually explicit ads — but until now, the company hasn’t banned advertisers from promoting services that people can use to make deepfake porn and other forms of generated nudes. That’s about to change.

Google currently prohibits advertisers from promoting “sexually explicit content,” which Google defines as “text, image, audio, or video of graphic sexual acts intended to arouse.” The new policy now bans the advertisement of services that help users create that type of content as well, whether by altering a person’s image or generating a new one.

The change, which will go into effect on May 30th, prohibits “promoting synthetic content that has been altered or generated to be sexually explicit or contain nudity,” such as websites and apps that instruct people on how to create deepfake porn.

“This update is to explicitly prohibit advertisements for services that offer to create deepfake pornography or synthetic nude content,” Google spokesperson Michael Aciman tells The Verge.

Aciman says any ads that violate Google's policies will be removed, adding that the company uses a combination of human reviews and automated systems to enforce those policies. In 2023, Google removed over 1.8 billion ads for violating its policies on sexual content, according to the company's annual Ads Safety Report.

The change was first reported by 404 Media. As 404 notes, while Google already prohibited advertisers from promoting sexually explicit content, some apps that facilitate the creation of deepfake pornography have gotten around this by advertising themselves as non-sexual on Google ads or in the Google Play store. For example, one face swapping app didn’t advertise itself as sexually explicit on the Google Play store but did so on porn sites. 

Nonconsensual deepfake pornography has become a persistent problem in recent years. Two Florida middle schoolers were arrested last December for allegedly creating AI-generated nude photos of their classmates. Just this week, a 57-year-old Pittsburgh man was sentenced to more than 14 years in prison for possessing deepfake child sexual abuse material. Last year, the FBI issued an advisory about an "uptick" in extortion schemes involving blackmailing people with AI-generated nudes. While many AI models make it difficult, if not impossible, for users to create AI-generated nudes, some services still let users generate sexual content.

There may soon be legislative action on deepfake porn. Last month, the House and Senate introduced the DEFIANCE Act, which would establish a process through which victims of “digital forgery” could sue people who make or distribute nonconsensual deepfakes of them.
