After a social media uproar over its potential for abuse, the creators of an application allowing users to virtually “undress” women using artificial intelligence have shut it down.
The creators of “DeepNude” said the software was launched several months ago for “entertainment” and that they “greatly underestimated” demand for the app.
“We never thought it would be viral and (that) we would not be able to control the traffic,” the DeepNude creators, who listed their location as Estonia, said on Twitter.
“Despite the safety measures adopted (watermarks), if 500,000 people use it, the probability that people will misuse it is too high. We don’t want to make money this way.”
Articles in The Washington Post, Vice and other media showed how the app could take a photo of a clothed woman and transform it into a nude image, sparking outrage and renewed debate over nonconsensual pornography.
“This is a horrifically destructive invention and we hope to see you soon suffer consequences for your actions,” tweeted the Cyber Civil Rights Initiative, a group that seeks protection against nonconsensual and “revenge” porn.
Mary Anne Franks, a law professor and president of the CCRI, tweeted later, “It’s good that it’s been shut down, but this reasoning makes no sense. The app’s INTENDED USE was to indulge the predatory and grotesque sexual fantasies of pathetic men.”
DeepNude offered both free and paid versions of the application, and was the latest in a trend of “deepfake” technology that can be used to deceive or manipulate.
Although the app was shut down, critics expressed concern that copies of the software remained in circulation and would continue to be abused.