Apple is facing challenges with “dual use” apps that appear harmless but are used to create deepfake porn, often at a high cost.
While Apple enforces stringent App Store rules that bar pornographic apps, its control has limits. Some apps offer features that can easily be exploited, and Apple may not even be aware of the abuse.
According to a report by 404 Media, Apple is struggling with a “dual use” problem in apps that offer face swapping features. While these features are innocent on the surface, users are exploiting them to swap faces, including minors’ faces, onto pornographic content.
A reporter discovered a paid advertisement on Reddit for a face swap app whose business model evidently relied on paid ad placement. The app let users swap faces onto video from sites like Pornhub. Although Apple prohibits porn-related apps, some evade this restriction by hosting user-generated content that includes inappropriate images and videos.
Once informed of the advertised app’s dual-use problem, Apple removed it. However, Apple appeared to be unaware of the issue until it was alerted.
This is not the first instance of seemingly innocuous apps passing App Review and then offering services that violate Apple’s guidelines. While it may not be as obvious as converting a children’s app into a casino, the potential to create nonconsensual intimate imagery was evidently not on Apple’s radar.
Apps with artificial intelligence features can create highly realistic deepfakes, which makes it important for companies like Apple to address these issues proactively. While Apple cannot eliminate such misuse entirely, clearer policies during app review could help, such as specific guidelines and rules around pornographic image generation. Apple has already blocked deepfake AI websites from using Sign in with Apple.
For instance, apps should not be allowed to source video content from platforms like Pornhub. Apple could also set strict rules for potential dual-use apps, including zero-tolerance bans for any that facilitate the creation of inappropriate content.
Apple has taken steps to ensure that Apple Intelligence does not generate nude images, but that oversight should not stop at its own tools. As the self-declared final authority on the App Store, Apple needs to tackle problems like apps advertising nonconsensual intimate imagery.
Face-swapping apps are not the only area of concern. Apps promoting infidelity, intimate video chats, adult chats, or similar content also manage to pass app review processes.
Reports have indicated flaws in the app review system, and regulators are becoming impatient with empty assurances. Apple must regain control of the App Store or risk losing its authority.