Apple's Proposed Phone-Scanning Child Safety Features 'Invasive, Ineffective, and Dangerous,' Say Cybersecurity Researchers in New Study

More than a dozen prominent cybersecurity experts hit out at Apple on Thursday for relying on "dangerous technology" in its controversial plan to detect child sexual abuse images on iPhones (via The New York Times).

The damning criticism came in a new 46-page study by researchers that looked at plans by Apple and the European Union to monitor people's phones for illicit material, and called the efforts ineffective and dangerous strategies that would embolden government surveillance.

Announced in August, the planned features include client-side (i.e. on-device) scanning of users' iCloud Photos libraries for Child Sexual Abuse Material (CSAM), Communication Safety to warn children and their parents when receiving or sending sexually explicit photos, and expanded CSAM guidance in Siri and Search.

According to the researchers, documents released by the European Union suggest that the bloc's governing body is seeking a similar program that would scan encrypted phones for child sexual abuse material as well as signs of organized crime and terrorism-related imagery.

"It should be a national-security priority to resist attempts to spy on and influence law-abiding citizens," the researchers said.

"The expansion of the surveillance powers of the state really is passing a red line," said Ross Anderson, a professor of security engineering at the University of Cambridge and a member of the group.

Aside from surveillance concerns, the researchers said, their findings indicated that the technology was not effective at identifying images of child sexual abuse. Within days of Apple's announcement, they said, people had pointed out ways to avoid detection by editing the images slightly.
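The evasion problem the researchers describe stems from how perceptual image hashes work. The toy sketch below uses a simple "difference hash" (dHash), not Apple's actual NeuralHash algorithm, purely to illustrate the fragility: each hash bit encodes a brightness comparison between neighboring pixels, so a small edit that flips even one comparison changes the hash and can defeat a naive match.

```python
# Illustrative sketch only: a toy difference hash (dHash), NOT Apple's
# NeuralHash. It shows why perceptual hashes can be evaded: a small pixel
# edit flips hash bits, so the altered image no longer matches a database.

def dhash(pixels):
    """Compute a row-wise difference hash from a 2D grayscale grid.

    Each bit records whether a pixel is brighter than its right neighbor.
    """
    bits = []
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits.append(1 if left > right else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return sum(x != y for x, y in zip(a, b))

# A tiny 4x4 "image" (grayscale values 0-255).
original = [
    [200, 180, 160, 140],
    [190, 170, 150, 130],
    [180, 160, 140, 120],
    [170, 150, 130, 110],
]

# Slightly edited copy: one pixel nudged just enough to flip a comparison.
edited = [row[:] for row in original]
edited[0][1] = 201  # was 180; now brighter than its left neighbor

h1, h2 = dhash(original), dhash(edited)
print(hamming(h1, h2))  # prints 1: a single-pixel edit already changes the hash
```

Real systems tolerate some Hamming distance between hashes to absorb such edits, but that widens the window for false positives, which is the trade-off the researchers' critique turns on.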

"It's allowing scanning of a personal private device without any probable cause for anything illegitimate being done," added another member of the group, Susan Landau, a professor of cybersecurity and policy at Tufts University. "It's extraordinarily dangerous. It's dangerous for business, national security, for public safety and for privacy."

The cybersecurity researchers said they had begun their study before Apple's announcement, and were publishing their findings now to inform the European Union of the dangers of its own similar plans.

Apple has faced significant criticism from privacy advocates, security researchers, cryptography experts, academics, politicians, and even employees within the company for its decision to deploy the technology in a future update to iOS 15 and iPadOS 15.

Apple initially endeavored to dispel misunderstandings and reassure users, releasing a detailed FAQ, additional supporting documents, and interviews with company executives to allay concerns.

However, when it became clear that this wasn't having the intended effect, Apple acknowledged the negative feedback and in September announced a delay to the rollout of the features to give the company time to make "improvements" to the CSAM system. It has not said what those improvements would involve or how they would address concerns.

Apple has also said it would refuse demands by authoritarian governments to expand the image-detection system beyond pictures of children flagged by recognized databases of child sex abuse material, although it has not said that it would pull out of a market rather than obey a court order.


Top Rated Comments

Wanted797
44 months ago
Good.

If Apple want to promote themselves as privacy focused, they deserve every bit of criticism for this ridiculous idea.

They brought it on themselves.
Score: 171 Votes
_Spinn_
44 months ago
“Apple has also said it would refuse (https://www.macrumors.com/2021/08/09/apple-faq-csam-detection-messages-scanning/) demands by authoritarian governments to expand the image-detection system beyond pictures of children flagged by recognized databases of child sex abuse material, although it has not said that it would pull out of a market rather than obeying a court order.”

I just don’t see how Apple thinks this is even feasible. How do they expect to ignore the laws of a local government?

The whole idea of scanning content locally on someone’s phone is a terrible idea that will eventually be abused.
Score: 88 Votes
MathersMahmood
44 months ago
“Good.

If Apple want to promote themselves as privacy focused, they deserve every bit of criticism for this ridiculous idea.

They brought it on themselves.”

This. 100% this.
Score: 63 Votes
LV426
44 months ago
“Apple has also said it would refuse (https://www.macrumors.com/2021/08/09/apple-faq-csam-detection-messages-scanning/) demands by authoritarian governments to expand the image-detection system”

Apple cannot refuse such demands if they are written into a nation’s law, so this is a worthless promise. The UK government has the power (since 2016) to compel Apple – amongst others – to provide technical means of obtaining the information they want. But, worse than that, Apple are not permitted to divulge the fact that any such compulsion order has been made. They must, by law, keep those measures secret. It’s all very very Big Brother.
Score: 58 Votes
iGobbleoff
44 months ago
“lot of confusion/scare mongering here, they are not scanning phones. They are looking at photos that are on Apple servers. How can anyone object to this unless you have something to hide, why would you object when it helps children?”

Because it’s the beginning of a slippery slope of scanning your device for anything else that some group can think of. Phones are now nothing but trackers for big tech and governments to abuse.
Score: 54 Votes
H2SO4
44 months ago
“lot of confusion/scare mongering here, they are not scanning phones. They are looking at photos that are on Apple servers. How can anyone object to this unless you have something to hide, why would you object when it helps children?”

Are you sure?

“Announced in August (https://www.macrumors.com/2021/08/05/apple-new-child-safety-features/), the planned features include client-side (i.e. on-device) scanning of users' iCloud Photos libraries for Child Sexual Abuse Material (CSAM), Communication Safety to warn children and their parents when receiving or sending sexually explicit photos, and expanded CSAM guidance in Siri and Search.”
Score: 45 Votes