Apple's iOS 17 to warn users of unsolicited nudes with new Sensitive Content Warning feature

The new feature scans images and videos for nudity on-device and blurs them, warning users before they view the content.
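Developers can tap the same on-device detector in their own apps through iOS 17’s new SensitiveContentAnalysis framework. Here is a minimal sketch of how that screening might look, assuming a hypothetical blur helper and an app carrying the com.apple.developer.sensitivecontentanalysis.client entitlement:

```swift
import Foundation
import SensitiveContentAnalysis

/// Hypothetical UI hook: a real app would blur the media and show a
/// "Show anyway?" prompt instead of printing.
func blurUntilConfirmed(_ url: URL) {
    print("Sensitive content detected; blurring \(url.lastPathComponent)")
}

/// Screens a received image before displaying it.
func screenIncomingImage(at url: URL) async throws {
    let analyzer = SCSensitivityAnalyzer()

    // The framework stays disabled until the user turns on Sensitive
    // Content Warning (or Communication Safety) in Settings.
    guard analyzer.analysisPolicy != .disabled else { return }

    // The check runs entirely on-device; nothing leaves the phone.
    let result = try await analyzer.analyzeImage(at: url)
    if result.isSensitive {
        blurUntilConfirmed(url)
    }
}

/// Videos go through a two-step handler rather than a single call.
func incomingVideoIsSensitive(at url: URL) async throws -> Bool {
    let handler = SCSensitivityAnalyzer().videoAnalysis(forFileAt: url)
    return try await handler.hasSensitiveContent().isSensitive
}
```

Because the model and the decision both live on the device, an app never learns anything about the media beyond the single isSensitive verdict.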

Apple first unveiled its plans to address unsolicited nudes in 2021, alongside a proposal to flag photos uploaded to iCloud that matched known child sexual abuse material (CSAM). The company abandoned that plan by the end of 2022 over concerns that governments could pressure it to scan for other types of images, and over the risk of false positives. Communication Safety and Sensitive Content Warning avoid those pitfalls: detection runs entirely on-device, nothing is reported to Apple or anyone else, and the sole purpose is to keep people from being traumatized by imagery they never asked to see.

Legislators have sought to criminalize the sending of unsolicited nudes, and various services have built their own nudity-detection tools. Apple’s effort fills gaps in that existing patchwork of deterrents, making it less likely that bad actors succeed in bombarding iPhone users with explicit images through Messages, AirDrop, Contact Posters and FaceTime video messages.

With the Sensitive Content Warning feature, iOS 17 gives users more control over what lands on their screens and shields them from unwanted, potentially distressing material. The choice to decline or view flagged content, paired with links to assistance resources, lets users make informed decisions and seek help when they need it.
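For apps adopting the framework, which intervention to present tracks the user’s choice in Settings, surfaced as an analysis policy. A sketch of that branching, with the mapping as Apple describes it and the UI choices reduced to placeholder labels:

```swift
import SensitiveContentAnalysis

/// Picks a warning style based on which feature the user enabled.
func interventionStyle() -> String {
    switch SCSensitivityAnalyzer().analysisPolicy {
    case .disabled:
        return "none"                // neither feature is on
    case .simpleInterventions:
        return "blur-and-confirm"    // Sensitive Content Warning (opt-in)
    case .descriptiveInterventions:
        return "blur-plus-resources" // Communication Safety: add help links
    @unknown default:
        return "none"
    }
}
```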