In iOS 18.2, the feature will be entirely optional
According to a new report, the updated feature will not only blur out any nude photos and videos in messaging chats but also give kids the power to flag those messages straight to Apple.

Right now, Apple’s safety features on iPhones automatically spot any nude images and videos that kids might send or receive through iMessage, AirDrop, FaceTime, and Photos. This detection happens right on the device to keep things private.
With the current settings, if nudity is detected, kids see a couple of intervention popups explaining how to reach out to authorities and advising them to tell a parent or guardian. After the update, when the system identifies nudity, a new popup will appear, giving users the option to report the images and videos directly to Apple. From there, Apple could pass that info on to the authorities.
A new popup will appear, allowing you to report the images and videos straight to Apple. | Image credit – Apple
The device will generate a report that includes the images or videos in question, along with the messages sent right before and after the nudity was detected. It’ll also gather contact information from both accounts, and users will have the option to fill out a form detailing what went down.
Once Apple gets the report, it’ll review the content. Based on that assessment, the company can take action on the account, which might involve disabling the user’s ability to send messages via iMessage and reporting the incident to law enforcement.
This feature is now being rolled out in Australia as part of the iOS 18.2 beta, with plans to go global later on. Plus, it’s expected to be optional for users.
Apple had previously raised concerns that Australia’s draft online safety code wouldn’t safeguard end-to-end encryption, putting everyone’s messages at risk of mass surveillance. In the end, the Australian eSafety commissioner softened the law, allowing companies to propose alternative solutions for addressing child abuse and terror content without sacrificing encryption.
Apple has drawn heavy criticism from regulators and law enforcement worldwide for its refusal to weaken end-to-end encryption in iMessage for law enforcement access. However, when it comes to child safety, I believe tougher measures are sometimes necessary, and laws like these aim to address that very issue.