Copies of AI deepfake app DeepNude are easily accessible online — and always will be

Once something has been shared online, it never truly goes away. This adage is particularly relevant for DeepNude, software that uses AI to create fake nude images of women.

The app came to public attention last week after a report from Motherboard highlighted its existence. Shortly afterward, the app's creator pulled it from the web, saying that the probability the software would be misused to harass and shame women was “too high.”

Of course, the app is still available, with numerous copies floating around forums and message boards. The Verge was able to find links that ostensibly offer downloads of DeepNude in a variety of places, including Telegram channels, message boards like 4chan, YouTube video descriptions, and even on the Microsoft-owned code repository GitHub.

The report from Motherboard found that the app was being sold on a Discord server (now removed) for $20. The anonymous sellers said they had improved the stability of the software, which was prone to crashing, and removed a feature that added watermarks to the fake images (supposedly to stop them from being used maliciously).

“We are happy to announce that we have the complete and clean version of DeepNude V2 and cracked the software and are making adjustments to improve the program,” wrote the sellers on Discord.

The individual who uploaded an open-source version of DeepNude to GitHub claimed they were annoyed that people were trying to “censor knowledge.” However, the uploader also included a screenshot of news coverage of the app from Vox and mocked concerns expressed in the article that the app could be harmful to women.

While The Verge was not able to test all of the links mentioned, we did verify that several copies of the software are being shared on forums, including a version that was tweaked to remove all watermarks. As with any modified free software, it is likely that some versions have been altered to include malware, so extreme caution is advised.

We noted in our original coverage of DeepNude that the nonconsensual nude images this software creates are often of dubious quality, and, indeed, many people sharing this software say they're disappointed by its output. But while these images are easy to identify as fake, that doesn't necessarily minimize their threat or the impact they could have on people's lives.

Since the term “deepfake” was coined, the technology has consistently been used to target women. People can use deepfakes to create pornographic and nude images of women without their consent.
