Apple has announced new software updates aimed at protecting children from sexual abuse. The tech giant said the ‘new updates will help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material (CSAM).’

The feature, which uses new technology to limit CSAM, is expected to go live in the United States first. It will also provide on-device protection to stop children from sending or receiving sensitive content, with mechanisms to alert parents if the user is below the age of 13.
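Conceptually, that on-device flow might look something like the Swift sketch below. The type names, the classifier stand-in and the way the under-13 rule is encoded are illustrative assumptions for this article, not Apple's actual Communication Safety implementation.

```swift
import Foundation

// Hypothetical sketch of the on-device check described above.
struct IncomingImage {
    let data: Data
    let isSensitive: Bool   // stand-in for an on-device classifier's verdict
}

struct ChildAccount {
    let age: Int
    let parentNotificationsEnabled: Bool
}

enum SafetyAction {
    case deliverNormally
    case blurAndWarn
    case blurWarnAndNotifyParent
}

func handleIncoming(_ image: IncomingImage, for account: ChildAccount) -> SafetyAction {
    guard image.isSensitive else { return .deliverNormally }
    // Per the announcement, users below 13 additionally trigger a parental alert.
    if account.age < 13 && account.parentNotificationsEnabled {
        return .blurWarnAndNotifyParent
    }
    return .blurAndWarn
}
```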


What technology is Apple using to prevent CSAM?

Apple explained in a blog post that it will use new applications of cryptography in iOS and iPadOS to match known CSAM images stored in iCloud Photos.

“The technology will match images available on a user’s iCloud with known images provided by child safety organisations. To prioritise privacy, this operation is performed without actually seeing the image in question, only by looking for what is like a fingerprint match.”

Apple’s technology will search for matches of known child sexual abuse material (CSAM) before an image is stored in iCloud, according to a report by the BBC.
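To illustrate what a ‘fingerprint match’ means, here is a minimal Swift sketch. Apple's published design uses a perceptual hash called NeuralHash; the SHA-256 digest below is a simplification standing in for that fingerprint (it would only match byte-identical images), and the function names are invented for this example.

```swift
import Foundation
import CryptoKit

// Compute a "fingerprint" of an image. SHA-256 is a stand-in here for
// Apple's perceptual NeuralHash, purely for illustration.
func fingerprint(of imageData: Data) -> String {
    SHA256.hash(data: imageData).map { String(format: "%02x", $0) }.joined()
}

// Compare against a database of known fingerprints supplied by child
// safety organisations. The image content itself is never inspected;
// only its fingerprint is checked against the known set.
func matchesKnownDatabase(_ imageData: Data, knownFingerprints: Set<String>) -> Bool {
    knownFingerprints.contains(fingerprint(of: imageData))
}
```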


The device will create a cryptographic safety voucher containing the match result and additional encrypted data, and save it to iCloud along with the image, keeping the user’s privacy in mind.
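A rough sketch of what such a voucher could look like, assuming a simple symmetric encryption scheme for illustration only. Apple's published design is more elaborate (it uses threshold secret sharing so individual vouchers stay unreadable); the structure and names below are hypothetical.

```swift
import Foundation
import CryptoKit

// Hypothetical safety voucher: the match result is sealed on-device so
// the server storing the voucher cannot read it for any single image.
struct SafetyVoucher {
    let encryptedPayload: Data
}

func makeVoucher(matchResult: Bool, deviceKey: SymmetricKey) throws -> SafetyVoucher {
    let sealed = try ChaChaPoly.seal(Data("matched=\(matchResult)".utf8), using: deviceKey)
    return SafetyVoucher(encryptedPayload: sealed.combined)
}

// Usage: a fresh key per device, voucher uploaded alongside the photo.
let key = SymmetricKey(size: .bits256)
let voucher = try makeVoucher(matchResult: false, deviceKey: key)
```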

What happens in case of violation?

If the matches cross a certain threshold, Apple will ‘report these instances to the National Center for Missing and Exploited Children (NCMEC)’ and an investigation will follow.
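In code terms, the threshold step might be as simple as the following sketch. The threshold value and the names are placeholders for illustration, not Apple's actual parameters.

```swift
import Foundation

// Assumed threshold, for illustration only.
let reportingThreshold = 30

func shouldEscalate(matchedVoucherCount: Int) -> Bool {
    matchedVoucherCount >= reportingThreshold
}

func process(matchedVoucherCount: Int) {
    if shouldEscalate(matchedVoucherCount: matchedVoucherCount) {
        // In the described flow, accounts above the threshold are reviewed
        // and reported to NCMEC; printing stands in for that here.
        print("Escalate account for human review and NCMEC report")
    }
}
```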

Why is Apple introducing this feature?

The move comes as major tech companies have often been called out for not doing enough to prevent child abuse online and to stop the spread of Child Sexual Abuse Material (CSAM). This is Apple’s way of addressing that concern.

Why is this feature being criticised?

Ever since the announcement, many have criticised Apple’s new feature as an erosion of privacy, arguing that it is exactly the kind of surveillance technology many governments would want and would love to misuse.

What aggravates the criticism is that it comes from Apple, a company that sells itself on privacy and promises never to compromise on it.

Cryptography experts such as Matthew Green of Johns Hopkins University have expressed fears that the system could be used to frame innocent people by sending them images designed to trigger matches for CSAM.