An app developer who created an algorithm that can digitally undress women in photos has pulled the plug on the software after high traffic and a viral backlash convinced him that "the world is not ready" for it "yet."

DeepNude used artificial intelligence to create the "deepfake" images, presenting realistic approximations of what a woman -- it was not designed to work on men -- might look like without her clothes. Deepfake photos and videos often appear credible to the average viewer, prompting concern among researchers and lawmakers about their potential to mislead the public, especially in the run-up to the 2020 election.

Last month, a doctored clip of House Speaker Nancy Pelosi, D-Calif., that had been altered to make her slur her words went viral, drawing attention to how even poorly made videos can be used to spread political disinformation at alarming speeds.

"Deep-fake technologies will enable the creation of highly realistic and difficult to debunk fake audio and video content," Danielle Citron, a law professor at the University of Maryland, testified before a House Committee on the dangers of deepfakes earlier this month. "Soon, it will be easy to depict someone doing or saying something that person never did or said. Soon, it will be hard to debunk digital impersonations in time to prevent significant damage."

Though much has been made of the technology's threat to national security, it has also been harnessed to make a torrent of fake porn, including widely circulated videos of celebrities like Gal Gadot and Scarlett Johansson. And though sites like Reddit, Twitter and Pornhub have tried to ban pornographic deepfakes, they've had limited success. The technology is cheap and easily accessible, and the potential for misuse is limitless.


The free version of DeepNude placed a large watermark on images it generated. The $50 version, however, placed only a small stamp reading "FAKE" in the upper-left corner of the pictures. As Motherboard noted, it could easily be cropped out.

Though pornographic deepfake images don't technically count as revenge porn, since they aren't actually images of the real women's bodies, they're still capable of causing psychological damage. California is considering a bill that would criminalize pornographic deepfakes.

This article was written by Taylor Telford, a reporter for The Washington Post.