Apple’s child abuse scanning feature delayed over privacy concerns

Apple recently announced a child abuse scanning feature intended to protect children’s privacy and safety. But alongside the feature, people raised concerns about other privacy and security risks, which led to a delay. Apple has now postponed the feature to make changes that address those various privacy concerns.

What is Apple’s child abuse scanning feature, and how does it protect children’s privacy?

Apple’s new child abuse scanning feature can scan images on an iPhone, iPad, or Mac. Each image is hashed and compared against a database of hashes of known child exploitation images to detect any matches. Once a certain number of matches is detected for the same user, an alert is triggered. The case can then be investigated further and may be reported to law enforcement authorities.
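
To make the mechanism concrete, here is a minimal sketch of threshold-based hash matching in Python. This is not Apple’s actual implementation: the hash values, the MATCH_THRESHOLD constant, and the function names are all illustrative assumptions.

```python
# Minimal sketch of threshold-based hash matching (illustrative only;
# not Apple's implementation). Hashes are represented as hex strings.

# Hypothetical database of hashes of known child exploitation images
KNOWN_HASHES = {
    "a3f1c9d2e8b74460",
    "0f4e9b7c21d8a5e3",
    "7b2d6f0a9c43e18f",
}

# Hypothetical number of matches for one account that triggers review
MATCH_THRESHOLD = 3

def count_matches(image_hashes):
    """Count how many of a user's image hashes appear in the database."""
    return sum(1 for h in image_hashes if h in KNOWN_HASHES)

def should_trigger_alert(image_hashes):
    """An alert is raised only once enough matches accumulate."""
    return count_matches(image_hashes) >= MATCH_THRESHOLD

# Example: two matches stay below the threshold, so no alert is raised
user_hashes = ["a3f1c9d2e8b74460", "0f4e9b7c21d8a5e3", "deadbeef00000000"]
print(should_trigger_alert(user_hashes))  # False
```

The detail the threshold captures is that a single match is never enough on its own; review begins only after several matches accumulate for the same account.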

This new child abuse scanning feature was supposed to arrive with iOS 15, but Apple postponed the plan after criticism. Many privacy experts raised concerns that it could open the door to other privacy and security breaches, and predicted that such a technology could be turned against citizens as a tool for spying.

Apple’s child abuse scanning feature delayed over child safety and privacy concerns

“Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of child sexual abuse material”, according to a statement from Apple.

Furthermore, an Apple spokesperson said, “Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features”.

What other measures is Apple taking to protect children’s privacy?

Beyond that, Apple would like to introduce a similar scanning feature for iMessage and SMS. The image-matching technology, known as NeuralHash, is how Apple aims to protect children from this problem. The exact accuracy of the scanning and matching hasn’t been determined yet, but it is expected to be high, since the National Center for Missing and Exploited Children manages the database against which images are matched.
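
NeuralHash itself is a neural-network-based perceptual hash whose details Apple has not fully published, but the general idea of perceptual hashing can be shown with a much simpler classical scheme, the average hash. The Python sketch below is an illustration under that assumption, not NeuralHash: it hashes a small grayscale pixel grid one bit per pixel and compares two hashes by Hamming distance, so visually similar images produce nearby hashes.

```python
# Illustrative average-hash (aHash) sketch of perceptual hashing.
# This is NOT NeuralHash; it only shows the key property that similar
# images yield similar hashes, unlike a cryptographic hash.

def average_hash(pixels):
    """Hash a grayscale pixel grid: 1 bit per pixel, above/below the mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming_distance(hash_a, hash_b):
    """Number of differing bits; a small distance means similar images."""
    return sum(a != b for a, b in zip(hash_a, hash_b))

# Two tiny 4x4 "images" that differ by a single pixel
image_a = [
    [200, 210, 30, 25],
    [190, 205, 35, 20],
    [180, 195, 40, 30],
    [170, 185, 45, 35],
]
image_b = [row[:] for row in image_a]
image_b[0][3] = 220  # one changed pixel

print(hamming_distance(average_hash(image_a), average_hash(image_b)))  # 1
```

A match against the database would then be declared when the distance falls below some small threshold, which is why re-encoded or slightly edited copies of a known image can still be detected.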

Indeed, if matches to child abuse material are found, they are reviewed manually by humans, and further action is taken if required. In other words, images pass through the automated matching technology first, and a human sees anything only after multiple matches, which limits unnecessary privacy intrusions.

However, Apple has to look at the other side of the coin and ensure the data is protected. Otherwise, the system could harm ordinary citizens if misused by people with the wrong intentions. After all, everyone wants their data to be private, secure, and hacker-proof in this advanced era of technology.

Final Thoughts:

Apple has been focusing more on user privacy lately and has introduced many privacy features in recent releases. The iOS 14 privacy updates were among the most significant Apple has shipped. iPhone users are already familiar with the microphone and camera usage indicators, data-sharing controls, and other privacy features.

Using such an advanced scanning feature can help reduce the spread of child abuse material, and in turn reduce cases of child abuse in society. Currently, the scanning feature is set to be activated in the United States only. But once it has been tested successfully, you may see it in other countries too.

Overall, it will help society if used correctly; otherwise, the privacy experts’ concerns become all the more valid. Apple is working to make private data more secure and to prevent its misuse. Hopefully, we will soon see how Apple helps children with this image-scanning technology.