10 Aug, 2021 14:21

Apple’s new anti-child abuse tech is a real threat to privacy and the authority of human judgment

In the name of combatting child abuse, Apple is not only on a slippery slope towards undermining the privacy of millions of users; it is also building an avalanche that threatens to displace human judgment with machine-learning algorithms.

The road to hell has always been paved with good intentions.

And so it is with Apple’s new detection tool, NeuralHash, which can identify images of child abuse. The impending change to its operating systems amounts to a backdoor into both its data storage and its messaging systems.

Apple’s compromise on end-to-end encryption is a shocking about-face for users who have relied on the company’s leadership in privacy and security.


In 2019, the company ran an aggressive, combative ad campaign promising its users that “what happens on your iPhone stays on your iPhone.”

But when Apple announced new technologies last week to detect child sexual abuse material (CSAM) directly on iPhones, it became clear that what’s on your iPhone no longer necessarily stays there.

There are two main features that the company plans to build into every Apple device. The first will scan all photos uploaded to iCloud Photos and check whether they match known images in the database of child sexual abuse material maintained by the National Center for Missing & Exploited Children (NCMEC).
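To make the mechanism concrete, here is a deliberately simplified sketch of what such client-side hash matching could look like. It is not Apple’s implementation: NeuralHash is a perceptual hash designed to survive resizing and re-encoding, whereas this toy uses an exact SHA-256 digest, and the loadKnownHashDatabase function and empty hash set are placeholders for the NCMEC database shipped to the device.

```swift
import Foundation
import CryptoKit

// Toy sketch only: an exact SHA-256 digest stands in for Apple's perceptual
// NeuralHash, and an empty set stands in for the on-device NCMEC database.
func loadKnownHashDatabase() -> Set<String> {
    // Placeholder for the set of known-CSAM image hashes.
    return []
}

let knownHashes = loadKnownHashDatabase()

func matchesKnownDatabase(_ imageData: Data) -> Bool {
    // Hash the photo's bytes and check membership in the known-hash set.
    let digest = SHA256.hash(data: imageData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownHashes.contains(hex)
}

// Per Apple's description, only photos queued for upload to iCloud Photos
// are checked, and matches are escalated for review once a threshold is hit.
func countMatches(in uploadQueue: [Data]) -> Int {
    uploadQueue.filter { matchesKnownDatabase($0) }.count
}
```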

The second feature scans all iMessage images sent or received by child accounts – that is, accounts designated as belonging to a minor – for sexually explicit material. If the child is young enough, Apple will notify the parent when such images are sent or received. Parents can turn this feature on or off. Initially it will be rolled out in the US only, but there are plans to extend it elsewhere.
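Unlike the first feature, this one does not match against a database of known images; an on-device classifier judges whether a new image is sexually explicit. The sketch below illustrates the described flow under stated assumptions only – the classifier stub, the 0.9 threshold, and the warnChild and notifyParent hooks are hypothetical, not Apple’s code.

```swift
import Foundation

// Hypothetical illustration of the iMessage flow described above.
struct ChildAccount {
    let isUnder13: Bool                    // younger children trigger parental alerts
    let parentalNotificationsEnabled: Bool // parents can switch the feature on or off
}

// Stand-in for the opaque machine-learning classifier discussed later in this
// piece; it would return a confidence that the image is sexually explicit.
func explicitConfidence(for imageData: Data) -> Double {
    return 0.0
}

func handleImage(_ imageData: Data, for account: ChildAccount) {
    guard account.parentalNotificationsEnabled else { return }

    // Assumed threshold; the real cut-off is opaque to users.
    guard explicitConfidence(for: imageData) > 0.9 else { return }

    // The child is warned before sending or viewing the image.
    warnChild()

    // Parents are alerted only for younger children who choose to go ahead.
    if account.isUnder13 {
        notifyParent()
    }
}

func warnChild() { /* hypothetical warning dialog */ }
func notifyParent() { /* hypothetical alert to the parent's device */ }
```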

Apple’s departure from end-to-end encryption has deeply concerning implications, and, despite being made in the name of fighting child abuse, it ought to be closely interrogated.

In the first instance, the threat to privacy is a real and present danger. The Electronic Frontier Foundation (EFF) has rightly pointed out that it is impossible to build a client-side scanning system that can only be used for sexually explicit images sent or received by children. The problem is that the narrow backdoor Apple is building could quickly become a chasm. As the EFF puts it, all it would take is an expansion of “the machine learning parameters to look for additional types of content,” or “a tweak of the configuration flags to scan, not just children’s, but anyone’s accounts.”


This is not a slippery slope; it’s an avalanche, a fully built system just waiting for external pressure to make the slightest change.

As surely as night follows day, pressure will mount for Apple to train its classifiers to restrict any content a government deems inappropriate – from LGBTQ+ material to satirical images or protest flyers. The potential impact on free speech is enormous.

This is not a hysterical overreaction or a mythical ‘what if’ scenario. It has already happened: one of the technologies originally built to scan and hash child sexual abuse imagery has been repurposed to create a database of “terrorist” content that companies can contribute to and draw on in order to ban such content.

But how long will it be before Apple makes concessions to autocratic governments and extends this capability to cover other material?

Commercial considerations have forced Apple to make considerable concessions to governments in the past to continue operating in their countries. It sells iPhones without FaceTime in countries that don’t allow encrypted phone calls, and in China, it removed thousands of apps from its App Store and moved to store user data on the servers of a state-run telecom.

There is a second area of concern that is hardly being discussed: the use of machine-learning classifiers – algorithms that determine what constitutes a sexually explicit image.

These filters are blunt instruments that have proven notoriously unreliable in the past. They are black boxes beyond real accountability. Apple implicitly recognises this: it has been quick to point out that suspected images will be reviewed by humans once multiple matches are detected.

What is not being discussed is how, through this capability, Apple and its algorithms will insert themselves into the private lives of millions of users – and into the trust relationships between parents and their children.

If an explicit picture is detected, children under 13 will be asked whether they really wish to send it and warned that their parents will be alerted if they choose to go ahead. The same system will alert parents to their children’s online behaviour.

This explicitly alters the balance between human judgment and machine, or algorithmic, determination. Parental judgment and the foundations of trusting relationships are now in danger of being outsourced to machines operated by a commercial juggernaut accountable only to its shareholders, not to you or me.

We don’t need Apple or its technology to flag the abuse this represents to the future of privacy and freedom.


The statements, views and opinions expressed in this column are solely those of the author and do not necessarily represent those of RT.
