Apple Regrets Confusion Over Its Automated ‘iPhone Scanning’ Tool
Apple recently revealed new image-detection software intended to prevent child sexual abuse on the iPhone and iPad. The automated tool can alert Apple if illegal images are uploaded to its iCloud storage.

Privacy groups are now criticizing Apple, arguing that it has created a security backdoor in its software. In response, the company says it regrets the confusion over the scanning process, acknowledging that the announcement got “jumbled pretty badly” and had been widely “misunderstood”.

Apple software chief Craig Federighi said, “We wish that this had come out a little more clearly for everyone.” He added that introducing two features at the same time was “a recipe for this kind of confusion”.

Apple announced two new tools intended to help protect children from sexual abuse; both will be deployed in the US first. The first is an image-detection tool that can identify known child sexual abuse material (CSAM). Many cloud service providers, including Facebook, Google, and Microsoft, already use this kind of matching to make sure people are not sharing CSAM.
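To illustrate the general idea, here is a minimal sketch of checking an upload against a list of known image hashes. It is not Apple’s implementation: Apple has described a perceptual “NeuralHash” combined with cryptographic matching, whereas this sketch uses an ordinary SHA-256 digest, and `knownHashes` is only a placeholder for a database of hashes of known material.

```swift
import CryptoKit
import Foundation

// Placeholder standing in for a database of hashes of known abusive images.
let knownHashes: Set<String> = [
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae"
]

// Hex-encode the SHA-256 digest of the raw image bytes.
func hexDigest(of data: Data) -> String {
    SHA256.hash(data: data).map { String(format: "%02x", $0) }.joined()
}

// Before an upload, check whether the image's hash appears in the known list;
// a match would be flagged for review rather than acted on automatically.
func shouldFlagForReview(_ imageData: Data) -> Bool {
    knownHashes.contains(hexDigest(of: imageData))
}
```

The key property of this approach is that the system only recognizes images that already appear in the reference database; it does not judge the content of new photos.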
The other is message filtering. Through this tool, Apple aims to give parents more control over their children’s accounts: a machine-learning system evaluates each photo and, if it finds nudity, warns the child.
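The decision flow, as described, might look roughly like the sketch below. Apple has not published the classifier behind this feature, so `nudityScore(for:)` is purely a hypothetical stub standing in for whatever on-device model is actually used.

```swift
import Foundation

// Hypothetical stub: a real implementation would run an on-device
// machine-learning image classifier and return its confidence score.
func nudityScore(for imageData: Data) -> Double {
    return 0.0
}

// If parental controls are on and the classifier is confident the photo
// contains nudity, blur it and warn the child; otherwise show it normally.
func handleIncomingPhoto(_ imageData: Data, parentalControlsEnabled: Bool) -> String {
    guard parentalControlsEnabled else { return "show" }
    return nudityScore(for: imageData) > 0.9 ? "blur and warn child" : "show"
}
```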

However, privacy groups remain concerned about these technologies. They argue that the tools could limit individual privacy and could be used by government authorities to spy on citizens.