Apple Removes 3 AI Image Generators from App Store for Advertising Non-consensual Nudes
- Following an investigation by 404 Media, Apple has removed three apps that generated sexually explicit images of people without their consent.
- Worse still, these apps advertised their services on Meta platforms, yet no one noticed until a third party conducted an investigation.
- Neither Apple nor Meta has commented on the matter.
Following an investigation by 404 Media, Apple has removed at least three apps that were capable of creating non-consensual sexually explicit images using artificial intelligence.
The first report in the investigation, published on April 22, revealed that Meta and its social media app Instagram were running ads for apps that claimed to generate non-consensual nudes. Those ads brought the app makers a decent amount of business.
A follow-up report then confirmed that Apple had removed the apps in question from the App Store.
The problem, however, is that Apple only took action after 404 Media, a third party, investigated and supplied it with links to those apps. This suggests that Apple may not be capable of independently identifying and tracking down harmful apps.
Why Is This Such a Big Concern?
Any app that gets listed on Apple's App Store has to go through a screening process first. And the company has a robust set of rules for incoming apps, or so we thought.
The first of its guidelines, on 'Safety', states that an app should be safe to use and should not cause harm to others, so that Apple users can feel confident installing it. However, it's now clear that rules mean little when they're not backed by enforcement.
These apps were listed under 'art generation', which helped them bypass Apple's restrictions on adult apps and further suggests that its vetting process isn't very effective.
In Apple's defense, though, there are two plausible reasons why the apps might have slipped under its radar.
- First, without coming across the social media ads or actually using the apps, it's nearly impossible to know what they really offer; that's how well-disguised they are.
- Second, while Apple staff test some apps before listing them, it's impractical to test all of them given that the platform hosts close to 2 million apps.
With that said, I do feel it's important for Apple to come up with a better vetting process that's both practical and effective. This is particularly important given the advent of AI and how easy the technology is to misuse; we're already seeing how dangerous and unhinged deepfakes can be.
Read more: White House alarmed over Taylor Swift deepfakes, calls for new legislation
Understanding Meta's Fault and Recurring Mistakes
Needless to say, Apple isn't the only one to blame here. Meta deserves a sizable chunk of the blame, if not the majority of it. After all, it was the one that allowed those ads to run wild on the internet in the first place.
What's shocking is that the ads clearly showed the apps' pornographic capabilities. How Meta approved them and continued serving them to its users remains a big question.
Even worse, this isn't the first time Meta has been caught serving sexually explicit content. In one instance, a sexually explicit image of a famous Indian woman was published on an Instagram account that allegedly 'specialized' in posting AI nudes of Indian women.
A report against the post was automatically dismissed twice. Only when the user escalated the incident to the Oversight Board was the post taken down for breaching the Bullying and Harassment Community Standards.
In another incident, an AI app called Perky AI ran advertisements using pornographic images of the actress Jenna Ortega made when she was 16; you might remember her from the widely popular Netflix series 'Wednesday'. Furthermore, the same app was also listed on the Apple App Store.
Addressing the above-mentioned incident, Meta released a statement that said:
"Meta strictly prohibits child nudity, content that sexualizes children, and services offering AI-generated non-consensual nude images."
The Bottom Line
Even though Meta has taken quite a few steps toward user safety of late, including reinforcing safety measures for teens in January and labeling AI-generated images on Facebook and Instagram in February, it seems unwilling to go further if doing so would hamper its advertising business.
Important Note: Meta's advertising revenue was a whopping $132 billion in 2023, according to Statista.
As for the issue at hand, we're eagerly waiting for a detailed explanation from both Meta and Apple. Stay tuned for more.