Apple’s new technology to detect child pornography on users’ phones has raised concerns about surveillance and the potential for its abuse by authoritarian governments.

Apple’s technology will scan images for matches against known child sexual abuse material (CSAM) before they are stored in iCloud, according to a report by the BBC.

So, what is Apple’s new technology to stop child pornography, and what are the concerns surrounding it?

According to Apple, new versions of iOS and iPadOS, to be released later this year, will have “new applications of cryptography to help limit the spread of CSAM online, while designing for user privacy”.

But WhatsApp chief
Will Cathcart
has termed Apple’s move “very concerning”.

The technology

Apple says its CSAM detection technology will use cryptography to check images against a database of known material; when the system reports a match, it is reviewed by a human. If child sexual abuse material is confirmed on a user’s device, Apple can disable the user’s account and inform law enforcement.
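
At a high level, the approach resembles checking image fingerprints against a list of fingerprints of known material, and escalating for human review only past a threshold of matches. The Swift sketch below is purely illustrative and is not Apple’s implementation: Apple has described a perceptual “NeuralHash” plus cryptographic matching, whereas this sketch substitutes an ordinary SHA-256 hash, and the database, threshold, and function names are all hypothetical.

```swift
import Foundation
import CryptoKit

// Illustrative sketch only, not Apple's implementation. A plain SHA-256
// lookup stands in for Apple's perceptual "NeuralHash" and cryptographic
// matching. All names below are hypothetical.

// Fingerprints of known CSAM, as supplied by child-safety organisations
// (empty placeholder here).
let knownFingerprints: Set<String> = []

// Assumed number of matches required before anything is flagged for
// human review.
let reviewThreshold = 30

// Compute a fingerprint for an image. A real perceptual hash would also
// match resized or re-encoded copies; SHA-256 matches exact bytes only.
func fingerprint(of imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

// Count how many of a user's images match known fingerprints, and
// escalate for human review only once the threshold is crossed.
func shouldEscalate(_ images: [Data]) -> Bool {
    let matches = images.filter { knownFingerprints.contains(fingerprint(of: $0)) }
    return matches.count >= reviewThreshold
}
```

In Apple’s announced design the matching itself is done with cryptographic techniques, so neither the device nor Apple learns anything about non-matching photos; the sketch above captures only the match-count-and-review flow.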

Concerns

WhatsApp head Will
Cathcart said that Apple’s new system “could very easily be used to scan
private content for anything they or a government decides it wants to control.
Countries where iPhones are sold will have different definitions on what is
acceptable”. 

Cathcart noted
that WhatsApp’s system to tackle child sexual abuse material has yielded
results without breaking encryption and that more than 400,000 cases have been
reported to the US National Center for Missing and Exploited Children.

Digital rights group the Electronic Frontier Foundation has also criticised Apple’s new system, labelling it “a fully-built system just waiting for external pressure to make the slightest change”.

Apple’s position

Apple maintains that its technology offers “significant” privacy benefits, as it learns about users’ photos only if they have a collection of known child sexual abuse material in their iCloud account.

Political response

Apple’s technology has largely found favour with politicians. UK Health Secretary Sajid Javid welcomed the move and said it is time for others, especially Facebook, to follow suit.

US Senator Richard
Blumenthal said it was a “welcome, innovative and bold step”. “This shows that
we can protect children and fundamental rights,” he added.