Apple's Proposed Phone-Scanning Child Safety Features 'Invasive, Ineffective, and Dangerous,' Say Cybersecurity Researchers in New Study

More than a dozen prominent cybersecurity experts hit out at Apple on Thursday for relying on "dangerous technology" in its controversial plan to detect child sexual abuse images on iPhones (via The New York Times).

The damning criticism came in a new 46-page study in which researchers examined plans by Apple and the European Union to monitor people's phones for illicit material, calling the efforts ineffective and dangerous strategies that would embolden government surveillance.

Announced in August, the planned features include client-side (i.e. on-device) scanning of users' iCloud Photos libraries for Child Sexual Abuse Material (CSAM), Communication Safety to warn children and their parents when receiving or sending sexually explicit photos, and expanded CSAM guidance in Siri and Search.
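For readers unfamiliar with what "client-side scanning" means in practice, the following is a deliberately simplified, hypothetical sketch of the general idea: the device derives a hash of each photo queued for iCloud upload and checks it against a database of hashes of known CSAM, flagging an account for human review only after a threshold of matches. This is not Apple's implementation, which layers a perceptual hash (NeuralHash), blinded matching, and cryptographic threshold mechanisms on top of this idea; the names and threshold value below are illustrative assumptions.

```python
# A minimal conceptual sketch, NOT Apple's actual protocol. "Client-side scanning"
# broadly means the device hashes photos queued for iCloud upload and checks the
# hashes against a database of hashes of known CSAM. All names and the threshold
# value are illustrative assumptions.
from typing import Iterable, Set

MATCH_THRESHOLD = 30  # assumed review threshold, for illustration only

def count_matches(photo_hashes: Iterable[bytes], known_csam_hashes: Set[bytes]) -> int:
    """Count how many on-device photo hashes appear in the known-hash database."""
    return sum(1 for h in photo_hashes if h in known_csam_hashes)

def should_flag_for_review(photo_hashes: Iterable[bytes], known_csam_hashes: Set[bytes]) -> bool:
    """Flag an account for human review only once matches cross the threshold."""
    return count_matches(photo_hashes, known_csam_hashes) >= MATCH_THRESHOLD
```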

According to the researchers, documents released by the European Union suggest that the bloc's governing body is seeking a similar program that would scan encrypted phones for child sexual abuse material as well as signs of organized crime and terrorism-related imagery.

"It should be a national-security priority to resist attempts to spy on and influence law-abiding citizens," said the researchers, who added they were publishing their findings now to inform the European Union of the dangers of its plan.

"The expansion of the surveillance powers of the state really is passing a red line," said Ross Anderson, a professor of security engineering at the University of Cambridge and a member of the group.

Aside from surveillance concerns, the researchers said, their findings indicated that the technology was not effective at identifying images of child sexual abuse. Within days of Apple's announcement, they said, people had pointed out ways to avoid detection by editing the images slightly.
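To see why slight edits are a problem, consider a toy perceptual hash. The sketch below uses a simple average hash, not Apple's NeuralHash, to show how a minor change such as a small crop or brightness tweak can flip hash bits, so a matcher that expects identical or near-identical hashes can miss a near-duplicate image. It is purely illustrative, and the filename and edits are hypothetical.

```python
# Illustrative only: a simple "average hash" (aHash), not Apple's NeuralHash.
# Perceptual hashes aim to give visually similar images similar hashes; the
# researchers' point is that small edits can still change enough bits that a
# near-duplicate no longer matches a database entry.
from PIL import Image, ImageEnhance  # pip install Pillow

def average_hash(image: Image.Image, hash_size: int = 8) -> int:
    """Grayscale, downscale to hash_size x hash_size, threshold each pixel on the mean."""
    small = image.convert("L").resize((hash_size, hash_size), Image.LANCZOS)
    pixels = list(small.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for pixel in pixels:
        bits = (bits << 1) | (1 if pixel > mean else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two hashes (0 means identical)."""
    return bin(a ^ b).count("1")

if __name__ == "__main__":
    original = Image.open("photo.jpg")  # hypothetical local file
    # Two "slight edits": trim a few pixels and nudge the brightness.
    edited = ImageEnhance.Brightness(
        original.crop((3, 3, original.width, original.height))
    ).enhance(1.1)

    distance = hamming_distance(average_hash(original), average_hash(edited))
    # A matcher that requires (near-)identical hashes misses the edited copy
    # once enough bits differ.
    print(f"Hash bits changed by the slight edit: {distance} of 64")
```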

"It's allowing scanning of a personal private device without any probable cause for anything illegitimate being done," added another member of the group, Susan Landau, a professor of cybersecurity and policy at Tufts University. "It's extraordinarily dangerous. It's dangerous for business, national security, for public safety and for privacy."

The cybersecurity researchers said they had begun their study before Apple announced its plans.

Apple has faced significant criticism from privacy advocates, security researchers, cryptography experts, academics, politicians, and even employees within the company for its decision to deploy the technology in a future update to iOS 15 and iPadOS 15.

Apple initially endeavored to dispel misunderstandings and allay concerns by releasing detailed information, FAQs, additional documents, and interviews with company executives.

However, when it became clear this wasn't having the intended effect, Apple acknowledged the negative feedback and in September announced a delay to the features' rollout to give the company time to make "improvements" to the CSAM system, although it remains unclear what those improvements would involve or how they would address the concerns.

Apple has also said it would refuse demands by authoritarian governments to expand the image-detection system beyond pictures of children flagged by recognized databases of child sex abuse material, although it has not said that it would pull out of a market rather than obeying a court order.


Top Rated Comments

Wanted797
43 months ago
Good.

If Apple want to promote themselves as privacy focused, they deserve every bit of criticism for this ridiculous idea.

They brought it on themselves.
Score: 171 Votes
_Spinn_
43 months ago

“Apple has also said it would refuse demands by authoritarian governments to expand the image-detection system beyond pictures of children flagged by recognized databases of child sex abuse material, although it has not said that it would pull out of a market rather than obeying a court order.”

I just don’t see how Apple thinks this is even feasible. How do they expect to ignore the laws of a local government?

The whole idea of scanning content locally on someone’s phone is a terrible idea that will eventually be abused.
Score: 88 Votes
MathersMahmood
43 months ago

“Good.

If Apple want to promote themselves as privacy focused, they deserve every bit of criticism for this ridiculous idea.

They brought it on themselves.”

This. 100% this.
Score: 63 Votes
LV426
43 months ago

“Apple has also said it would refuse demands by authoritarian governments to expand the image-detection system”

Apple cannot refuse such demands if they are written into a nation’s law, so this is a worthless promise. The UK government has the power (since 2016) to compel Apple – amongst others – to provide technical means of obtaining the information they want. But, worse than that, Apple are not permitted to divulge the fact that any such compulsion order has been made. They must, by law, keep those measures secret. It’s all very, very Big Brother.
Score: 58 Votes
iGobbleoff
43 months ago

“lot of confusion/scare mongering here, they are not scanning phones. They are looking at photos that are on Apple servers. How can anyone object to this unless you have something to hide, why would you object when it helps children?”

Because it’s the beginning of a slippery slope of scanning your device for anything else that some group can think of. Phones are now nothing but trackers for big tech and governments to abuse.
Score: 54 Votes
H2SO4
43 months ago

“lot of confusion/scare mongering here, they are not scanning phones. They are looking at photos that are on Apple servers. How can anyone object to this unless you have something to hide, why would you object when it helps children?”

Are you sure?

“Announced in August, the planned features include client-side (i.e. on-device) scanning of users' iCloud Photos libraries for Child Sexual Abuse Material (CSAM), Communication Safety to warn children and their parents when receiving or sending sexually explicit photos, and expanded CSAM guidance in Siri and Search.”
Score: 45 Votes