Apple's Proposed Phone-Scanning Child Safety Features 'Invasive, Ineffective, and Dangerous,' Say Cybersecurity Researchers in New Study

More than a dozen prominent cybersecurity experts hit out at Apple on Thursday for relying on "dangerous technology" in its controversial plan to detect child sexual abuse images on iPhones (via The New York Times).

The damning criticism came in a new 46-page study in which researchers examined plans by Apple and the European Union to monitor people's phones for illicit material, calling the efforts ineffective and dangerous strategies that would embolden government surveillance.

Announced in August, the planned features include client-side (i.e. on-device) scanning of users' iCloud Photos libraries for Child Sexual Abuse Material (CSAM), Communication Safety to warn children and their parents when receiving or sending sexually explicit photos, and expanded CSAM guidance in Siri and Search.
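The client-side CSAM detection described here is generally characterized as comparing an image fingerprint (a perceptual hash) against a database of hashes of known material before upload. The sketch below is a minimal illustration of that general idea using a simple average hash; it is not Apple's actual NeuralHash algorithm, and the filenames, hash values, and match threshold are hypothetical assumptions for illustration only.

```python
# Simplified, hypothetical sketch of perceptual-hash matching -- not Apple's
# actual NeuralHash system. Filenames, hash values, and the distance threshold
# below are illustrative assumptions.
from PIL import Image  # pip install Pillow

def average_hash(path: str, size: int = 8) -> int:
    """Downscale to an 8x8 grayscale image and build a 64-bit fingerprint
    by comparing each pixel to the mean brightness."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Number of bits that differ between two fingerprints."""
    return bin(a ^ b).count("1")

# In a client-side scheme, the device would compare each photo's fingerprint
# against a database of hashes of known CSAM before upload.
known_hashes = {0x8F3C5A1E2B4D6C7A}      # hypothetical database entries
photo_hash = average_hash("photo.jpg")   # hypothetical local photo
if any(hamming_distance(photo_hash, h) <= 4 for h in known_hashes):
    print("Possible match flagged for further review")
```

The same structure also illustrates the weakness the researchers cite later in this article: because the fingerprint is derived from pixel values, modest edits to an image (crops, filters, re-encoding) can flip enough bits to fall outside the match threshold.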

According to the researchers, documents released by the European Union suggest that the bloc's governing body is seeking a similar program that would scan encrypted phones for child sexual abuse material as well as signs of organized crime and terrorist-related imagery.

"It should be a national-security priority to resist attempts to spy on and influence law-abiding citizens," said the researchers, who added they were publishing their findings now to inform the European Union of the dangers of its plan.

"The expansion of the surveillance powers of the state really is passing a red line," said Ross Anderson, a professor of security engineering at the University of Cambridge and a member of the group.

Aside from surveillance concerns, the researchers said, their findings indicated that the technology was not effective at identifying images of child sexual abuse. Within days of Apple's announcement, they said, people had pointed out ways to avoid detection by editing the images slightly.

"It's allowing scanning of a personal private device without any probable cause for anything illegitimate being done," added another member of the group, Susan Landau, a professor of cybersecurity and policy at Tufts University. "It's extraordinarily dangerous. It's dangerous for business, national security, for public safety and for privacy."

The cybersecurity researchers said they had begun their study before Apple announced its own scanning plans.

Apple has faced significant criticism from privacy advocates, security researchers, cryptography experts, academics, politicians, and even employees within the company for its decision to deploy the technology in a future update to iOS 15 and iPadOS 15.

Apple initially endeavored to dispel misunderstandings and reassure users by releasing detailed information, sharing FAQs and other new documents, and offering interviews with company executives.

However, when it became clear that this wasn't having the intended effect, Apple acknowledged the negative feedback and in September announced a delay to the rollout of the features to give the company time to make "improvements" to the CSAM system, although it has not said what those improvements would involve or how they would address concerns.

Apple has also said it would refuse demands by authoritarian governments to expand the image-detection system beyond pictures of children flagged by recognized databases of child sex abuse material, although it has not said that it would pull out of a market rather than obeying a court order.

Top Rated Comments

Wanted797
41 months ago
Good.

If Apple want to promote themselves as Privacy focused. They deserve every bit of criticism for this ridiculous idea.

They brought it on themselves.
Score: 171 Votes
_Spinn_
41 months ago

"Apple has also said it would refuse demands by authoritarian governments to expand the image-detection system beyond pictures of children flagged by recognized databases of child sex abuse material, although it has not said that it would pull out of a market rather than obeying a court order."
I just don’t see how Apple thinks this is even feasible. How do they expect to ignore the laws of a local government?

The whole idea of scanning content locally on someone’s phone is a terrible idea that will eventually be abused.
Score: 88 Votes
MathersMahmood
41 months ago

Good.

If Apple want to promote themselves as Privacy focused. They deserve every bit of criticism for this ridiculous idea.

They brought it on themselves
This. 100% this.
Score: 63 Votes
LV426
41 months ago
“Apple has also said it would refuse demands by authoritarian governments to expand the image-detection system”

Apple cannot refuse such demands if they are written into a nation’s law, so this is a worthless promise. The UK government has the power (since 2016) to compel Apple – amongst others – to provide technical means of obtaining the information they want. But, worse than that, Apple are not permitted to divulge the fact that any such compulsion order has been made. They must, by law, keep those measures secret. It’s all very very Big Brother.
Score: 58 Votes
iGobbleoff
41 months ago

"lot of confusion/scare mongering here, they are not scanning phones. They are looking at photos that are on Apple servers. How can anyone object to this unless you have something to hide, why would you object when it helps children?"
Because it’s the beginning of a slippery slope of scanning your device for anything else that some group can think of. Phones are now nothing but trackers for big tech and governments to abuse.
Score: 54 Votes
H2SO4
41 months ago

"lot of confusion/scare mongering here, they are not scanning phones. They are looking at photos that are on Apple servers. How can anyone object to this unless you have something to hide, why would you object when it helps children?"
Are you sure?
"Announced in August, the planned features include client-side (i.e. on-device) scanning of users' iCloud Photos libraries for Child Sexual Abuse Material (CSAM), Communication Safety to warn children and their parents when receiving or sending sexually explicit photos, and expanded CSAM guidance in Siri and Search."
Score: 45 Votes