This is why Apple is delaying its child abuse photo scanning software

Apple has built new child abuse photo scanning software. But despite appreciation from child protection agencies, industry peers and digital privacy advocates have raised red flags. Read on to learn why Apple is delaying the software's release.

What is Apple’s child abuse photo scanning software?

Last month, the tech giant announced a two-pronged tool that scans for Child Sexual Abuse Material (CSAM). Apple's neuralMatch tool reviews photos before they are uploaded to iCloud. It also analyzes the contents of iMessages and warns users. "The Messages app will use on-device machine learning to warn about sensitive content while keeping private communications unreadable by Apple," stated Apple.

What is neuralMatch?

Apple's tool compares pictures on your device with a database of known child abuse images. When a photo is flagged, Apple's staff manually reviews it. If the review confirms child abuse, the US National Center for Missing and Exploited Children (NCMEC) is notified. If a photo was flagged by mistake, users have the option to appeal.
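The general technique behind this kind of comparison is perceptual hashing: each image is reduced to a compact fingerprint, and a photo is flagged when its fingerprint is within a small bit-distance of one in the known database. The sketch below illustrates that matching step only; the function names are illustrative and the integer hashes stand in for real perceptual hashes, since Apple's actual NeuralHash algorithm is not public.

```python
# Minimal sketch of perceptual-hash matching (illustrative, not Apple's API).
# Integers stand in for perceptual hashes of images.

def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def is_flagged(photo_hash: int, known_hashes: set, threshold: int = 4) -> bool:
    """Flag a photo if its hash is within `threshold` bits of any known hash."""
    return any(hamming_distance(photo_hash, h) <= threshold for h in known_hashes)

# Hypothetical database of known-image hashes.
database = {0b1011_0110_1100_0011, 0b0000_1111_0000_1111}

print(is_flagged(0b1011_0110_1100_0111, database))  # near-duplicate -> True
print(is_flagged(0b0101_0101_0101_0101, database))  # unrelated image -> False
```

Unlike cryptographic hashing, a perceptual hash changes only slightly when an image is resized or re-compressed, which is why matching uses a distance threshold rather than exact equality, and why a human review step exists to catch false positives.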

What are the concerns against the software?

Industry experts and digital privacy advocates believe that it is not possible to build a client-side scanning system that tracks sexually explicit images sent or received by children without the system being repurposed for other uses. The announcement also emboldens law enforcement and government authorities seeking a backdoor to encryption.

“This is an Apple-built and operated surveillance system that can very easily be used to scan private content for anything they or a government decides it wants to control. Countries, where iPhones are sold, will have different definitions on what is acceptable,” said Will Cathcart, head of WhatsApp, the end-to-end encrypted messaging service.

Why is Apple backtracking on the release of the software?

Apple is now promising to take more time to collect feedback from users and improve its safety features. The firm is on the defensive about the software's release. “Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features,” stated Apple. It has also published several explanations and documents arguing that the risk of false detection is very low.
