DeepNude AI Photo App Taken Down by Makers
“The world is not yet ready for DeepNude,” say the app’s creators after pulling the controversial tool offline.
DeepNude, an app that could automatically remove the clothes from women in photos, has been taken offline by its anonymous Estonia-based developers following a Vice article highlighting the tool.
The creators say the app was only meant for “users’ entertainment” and that they had not anticipated that it would go viral. Able to transform any photo of a woman into a realistic nude image, the app has stoked the fierce debate over non-consensual pornography and the objectification of women.
According to Vice, the app only focused on undressing women and would not unclothe men. Anti-revenge-porn campaigners the Cyber Civil Rights Initiative (CCRI) said of the app: “This is a horrifically destructive invention and we hope to see you soon suffer consequences for your actions.”
Initially, the app was taken offline when DeepNude’s servers were overwhelmed by users trying to download the $50 Windows and Linux desktop version. However, amid public outrage, the creators have opted not to restore the tool, saying: “Despite the safety measures adopted (watermarks), if 500,000 people use it, the probability that people will misuse it is too high. The world is not yet ready for DeepNude.”
Law professor and president of the CCRI, Mary Anne Franks, tweeted: “It’s good that it’s been shut down, but this reasoning makes no sense. The app’s INTENDED USE was to indulge the predatory and grotesque sexual fantasies of pathetic men.”
Danielle Citron, professor of law at the University of Maryland Carey School of Law, who recently testified to Congress about the deepfake threat, described the tool as “terrifying” and an “invasion of sexual privacy”.
Speaking to Motherboard she said, “Now anyone could find themselves a victim of revenge porn, without ever having taken a nude photo. This tech should not be available to the public.”
Unlike other deepfake videos and images, which can take an expert editor hours to render, DeepNude is a consumer-facing app able to digitally manipulate images of women within seconds, at the click of a button.
Both the free and premium versions of the app are no longer available for download, and those who have paid for the premium version but have yet to activate it will not be granted access and will be offered a refund.
Although the developers have taken measures to stop the proliferation of the tool and urged users not to share it, working copies of the software are still in circulation.
CCRI tweeted: “The #Deepnude app is out there now and will be used, despite the creator taking it off the market. If only there were a way to disable all the versions out there.”