iOS 18.2 beta allows kids in Australia to report inappropriate content directly to Apple

When iOS 17 dropped, Apple introduced default safety features that automatically flagged images and videos with nudity across iMessage, AirDrop, FaceTime, and Photos. Now, as part of the iOS 18.2 beta, there’s a new addition rolling out that lets kids in Australia report inappropriate content directly to Apple.

In iOS 18.2, the feature will be entirely optional

According to a new report, the updated feature will not only blur out nude photos and videos in messaging chats but also give kids the power to flag those messages straight to Apple.

Right now, Apple’s safety features on iPhones automatically spot nude images and videos that kids might send or receive through iMessage, AirDrop, FaceTime, and Photos. This detection happens entirely on the device to keep things private.
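
Apple doesn’t document how Messages performs this check internally, but the same on-device nudity detection is exposed to third-party developers through the public SensitiveContentAnalysis framework introduced with iOS 17. A minimal sketch of that public API, assuming an app that holds the Sensitive Content Analysis entitlement and has a local image file to check (the shouldBlur(imageAt:) helper name is mine, not Apple’s):

```swift
import SensitiveContentAnalysis

// Sketch: check whether the user has Sensitive Content Warning / Communication
// Safety enabled, then run Apple's on-device nudity check on a local image.
// Requires the com.apple.developer.sensitivecontentanalysis.client entitlement.
func shouldBlur(imageAt imageURL: URL) async -> Bool {
    let analyzer = SCSensitivityAnalyzer()

    // If the user (or a parent, via Screen Time) hasn't enabled the feature,
    // the policy is .disabled and no analysis should be attempted.
    guard analyzer.analysisPolicy != .disabled else { return false }

    do {
        // Analysis runs entirely on device; the image never leaves the phone.
        let result = try await analyzer.analyzeImage(at: imageURL)
        return result.isSensitive
    } catch {
        // Treat analysis failures as "not flagged" in this sketch.
        return false
    }
}
```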

With the current settings, if nudity is detected, kids see a couple of intervention pop-ups explaining how to reach out to the authorities and advising them to tell a parent or guardian. After the update, when the system identifies nudity, a new prompt will also appear, letting users report the images and videos directly to Apple. From there, Apple could pass that information on to the authorities.

The device will generate a report that includes the images or videos in question, along with the messages sent right before and after the nudity was detected. It’ll also gather contact information from both accounts, and users will have the option to fill out a form detailing what went down.
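
Apple hasn’t published a schema for these reports, so purely as an illustration of the fields described above, a hypothetical Swift model might look like the following; none of these type or property names come from Apple’s SDK:

```swift
import Foundation

// Hypothetical sketch of the data the report reportedly bundles together.
// These types do not exist in Apple's frameworks; they only mirror the description.
struct ContentReport {
    let flaggedMedia: [URL]            // the images/videos that triggered the warning
    let surroundingMessages: [String]  // messages sent just before and after detection
    let reporterContact: String        // contact info for the reporting account
    let reportedContact: String        // contact info for the other account
    let userStatement: String?         // optional form describing what happened
    let createdAt: Date = Date()
}
```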


Once Apple gets the report, it’ll review the content. Based on that assessment, the company can take action on the account, which might involve disabling the user’s ability to send messages via iMessage and reporting the incident to law enforcement.

This feature is now being rolled out in Australia as part of the iOS 18.2 beta, with plans to go global later on. Plus, it’s expected to be optional for users.

The timing of the announcement, and the fact that Australia is first to get the feature, lines up with new regulations set to take effect there: by the end of 2024, tech companies will be required to police child abuse and terror-related content on cloud and messaging services operating in Australia.

Apple had previously raised concerns that the draft code wouldn’t safeguard end-to-end encryption, putting everyone’s messages at risk of mass surveillance. In the end, the Australian eSafety Commissioner softened the requirements, allowing companies to propose alternative solutions for addressing child abuse and terror content without sacrificing encryption.

Apple has drawn heavy criticism from regulators and law enforcement worldwide for its reluctance to weaken end-to-end encryption in iMessage for law-enforcement access. However, when it comes to child safety, I believe tougher measures are sometimes necessary, and laws like these aim to address that very issue.


