Apple Pulls 3 Generative AI Apps Being Used to Make Deepfake Nudes


Apple has removed several generative AI apps from the App Store after a 404 Media investigation discovered they could be used to create nonconsensual nude images.

Apple removed the three apps the site identified, but it reportedly took some time for the company to act. 404 Media had to provide Apple with direct App Store links, “indicating the company was not able to find the apps that violated its policy itself,” it said.

Developers are reportedly promoting these apps via Instagram ads with taglines like “undress any girl for free” and “any clothing delete.”

Deepfake porn images and videos have proliferated on social media, ensnaring unsuspecting teens and public figures, including Taylor Swift and Rep. Alexandria Ocasio-Cortez. In the House, Rep. Ocasio-Cortez introduced the Disrupt Explicit Forged Images and Non-Consensual Edits (DEFIANCE) Act, which would create a “federal civil remedy” for deepfake victims. A companion bill was also introduced in the Senate, but it has thus far not seen any movement.

The App Store isn’t the only platform grappling with deepfake porn. The Taylor Swift images were distributed on X, and Meta’s Oversight Board has taken the company to task for having an inconsistent enforcement policy regarding AI-generated deepfakes of real women.

The UK’s Ministry of Justice recently announced plans to make it illegal in the country to create sexually explicit deepfakes, like those the removed apps could produce.
