Newsletter

Apple Removes AI Apps from App Store for Creating Non-Consensual Nude Images

According to a recent report from 404 Media, Apple has removed several AI applications from the App Store that claimed to be able to create "non-consensual nude images." The apps were taken down after they were brought to Apple's attention.

Inappropriate ads and app removal


According to the report, published Monday by 404 Media, some companies had been advertising on Instagram, claiming their apps could "take off any girl's clothes for free." The ads linked directly to Apple's App Store, where one of the apps was described as an "art generator."

Apple did not immediately respond to 404 Media's initial request for comment. After the report was published, however, Apple proactively contacted 404 Media to ask for more details. Once 404 Media provided the specific advertising links and App Store pages, Apple promptly removed the applications from the App Store.

Processing results and future prospects


The report noted that Apple removed a total of three applications, but only after 404 Media provided direct links to them, indicating that Apple had been unable to discover the violating apps on its own. Emanuel Maiberg of 404 Media said a similar cat-and-mouse game could continue in the future, since Apple has difficulty finding such applications without outside tips.

This incident shows that app store operators are beginning to take stronger action against such apps in order to prevent the spread of inappropriate content and protect users' privacy and dignity.

NewMobileLife website: https://www.newmobilelife.com
Facebook: https://www.facebook.com/jetsoiphone
