Photo courtesy of MacRumors
Apple is rolling out globally a feature that prevents children from viewing photos containing nudity in its Messages app. The "communication safety in Messages" feature blurs such images and displays notifications guiding children on what to do next.
The opt-in feature, part of the Family Sharing system, scans images on-device, so it does not break the end-to-end encryption of Messages. When an image is flagged as inappropriate for a child, the screen shows the message "You're not alone, and can always get help from someone you trust or with trained professionals." Users can also block the sender directly from there.
Besides that, Apple has two other child safety features in the works. One would point users towards safety resources if they search for topics related to child sexual abuse, while the other prevents users from uploading child sexual abuse material (CSAM) to their iCloud accounts. The latter has sparked controversy among privacy advocates, who say it could introduce a backdoor that would undermine the security of Apple's users.
What do you think about these features? Leave a comment to let us know your thoughts, and stay tuned to TechNave.com for the latest tech reports.