Apple removes AI nude image generator apps from its App Store following privacy violation concerns.
The takedown follows a 404 Media report that found the apps being advertised on Instagram.
Apple has removed several apps from its App Store that used artificial intelligence (AI) to create nude images of people, according to a report by 404 Media. The apps raised concern because they could be used to generate nude photos of someone without that person's consent.
Here’s how it worked: ads for these apps appeared on Instagram, and some of them linked directly to listings on the App Store. The apps offered features such as swapping faces onto adult photos or using AI to digitally remove clothing from ordinary pictures.
The report raises concerns about Apple’s ability to detect such apps on its own. Apple reportedly took action only after 404 Media flagged the apps and their advertisements, which suggests the company may need better methods for finding apps that violate its App Store guidelines.
This isn’t the first time tech companies have faced issues with AI-powered apps that create fake or misleading content. In 2022, similar apps were found on both the Apple App Store and the Google Play Store. At the time, the companies asked developers to stop advertising these features on adult websites, but the apps themselves weren’t removed.
The spread of these “undressing apps” is particularly concerning because their use has been reported in schools and colleges around the world, where the images they produce amount to both a privacy violation and a form of harassment.
By taking down these apps, Apple is sending a message that it won’t tolerate apps that can be used to create non-consensual nude images. However, the report also highlights the need for Apple to improve its app review process to prevent such apps from appearing on the App Store in the first place.