These apps were described in the App Store as “art generators,” but what they actually did was create nonconsensual nude images. Such images do not show what these women actually look like without clothes; instead, they rely on AI to generate what appears to be the nude body under the clothing. Nonetheless, the pictures could still be used to embarrass, humiliate, or blackmail women.
404 Media said that Apple removed the apps, but only after the outlet gave the tech giant links to the apps and their ads. The report suggested that Apple was unable to find the offending apps on its own. While some of these apps offered the AI “undress” feature, others performed face swaps on adult images.
Apps removed from the App Store by Apple had advertised on Instagram and adult sites that they could create porn
Some of these apps started appearing as far back as 2022 and seemed innocent to both Apple and Google, as they were listed in the App Store and Play Store respectively. Apparently unbeknownst to the tech companies, the developers behind these apps were advertising their porn capabilities on adult sites. Instead of immediately taking down the apps, Apple and Google allowed them to remain in their storefronts as long as they stopped advertising on porn sites.
Despite Apple and Google’s demand, one of the apps kept advertising on adult sites until this year, when Google pulled it from the Play Store.