Apple Remains Silent About Plans to Detect Known CSAM Stored in iCloud Photos

It has now been over a year since Apple announced plans for three new child safety features, including a system to detect known Child Sexual Abuse Material (CSAM) images stored in iCloud Photos, an option to blur sexually explicit photos in the Messages app, and child exploitation resources for Siri. The latter two features are now available, but Apple remains silent about its plans for the CSAM detection feature.

Apple initially said CSAM detection would be implemented in an update to iOS 15 and iPadOS 15 by the end of 2021, but the company ultimately postponed the feature based on "feedback from customers, advocacy groups, researchers, and others."

In September 2021, Apple posted the following update to its Child Safety page:

Previously we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them and to help limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.

In December 2021, Apple removed the above update and all references to its CSAM detection plans from its Child Safety page, but an Apple spokesperson informed The Verge that Apple's plans for the feature had not changed. To the best of our knowledge, however, Apple has not publicly commented on the plans since that time.

We've reached out to Apple to ask if the feature is still planned. Apple did not immediately respond to a request for comment.

Apple did move forward with implementing its child safety features for the Messages app and Siri with the release of iOS 15.2 and other software updates in December 2021, and it expanded the Messages app feature to Australia, Canada, New Zealand, and the UK with iOS 15.5 and other software releases in May 2022.

Apple said its CSAM detection system was "designed with user privacy in mind." The system would perform "on-device matching using a database of known CSAM image hashes" from child safety organizations, which Apple would transform into an "unreadable set of hashes that is securely stored on users' devices."
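As a rough illustration only, the on-device matching step might look something like the Swift sketch below. This is not Apple's implementation: Apple's design relied on a perceptual "NeuralHash" and a blinded, cryptographically protected database queried via private set intersection, whereas this sketch substitutes a plain cryptographic hash and an ordinary in-memory set, and every function and variable name in it is hypothetical.

```swift
import CryptoKit
import Foundation

// Simplified sketch of on-device hash matching -- not Apple's actual system.
// Apple's design used a perceptual "NeuralHash", a blinded on-device hash
// database, and private set intersection so neither side learns non-matches;
// this illustration substitutes a plain cryptographic hash and a Set.

/// Hypothetical stand-in for the on-device copy of the known-hash database.
/// In Apple's design the stored set is "unreadable", so the device cannot
/// enumerate or inspect the hashes it holds.
func loadKnownHashDatabase() -> Set<String> {
    // Placeholder: the real database would ship with the operating system.
    return []
}

/// Hypothetical perceptual-hash stand-in that hashes the raw image bytes.
/// A real perceptual hash would also match resized or re-encoded copies.
func imageHash(_ imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

/// Checks one photo against the known-hash set before it is uploaded to iCloud.
func matchesKnownImage(_ imageData: Data, against knownHashes: Set<String>) -> Bool {
    knownHashes.contains(imageHash(imageData))
}
```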

Apple planned to report iCloud accounts with known CSAM image hashes to the National Center for Missing and Exploited Children (NCMEC), a non-profit organization that works in collaboration with U.S. law enforcement agencies. Apple said there would be a "threshold" ensuring "less than a one in one trillion chance per year" of an account being incorrectly flagged by the system, and flagged accounts would undergo manual human review before any report was made.
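Purely as an illustration of that flag-then-review flow, the threshold logic could be sketched roughly as follows. Apple's actual proposal used threshold secret sharing, meaning its servers could not decrypt any match information until an account crossed the threshold; the counter, the threshold value, and the names below are assumptions made for this sketch.

```swift
// Simplified sketch of the reporting threshold -- not Apple's actual protocol.
// In Apple's proposal, "safety vouchers" were protected with threshold secret
// sharing, so the server learned nothing until the threshold was exceeded.

struct AccountMatchState {
    var matchedVoucherCount = 0
}

/// Hypothetical threshold value chosen only for illustration.
let reviewThreshold = 30

/// Records one new match for an account and returns whether the account
/// should now be escalated to manual human review; only a confirmed review
/// would lead to a report to NCMEC.
func recordMatch(for state: inout AccountMatchState) -> Bool {
    state.matchedVoucherCount += 1
    return state.matchedVoucherCount >= reviewThreshold
}

// Example: the 30th matched voucher crosses the threshold.
var state = AccountMatchState()
for _ in 1...30 {
    _ = recordMatch(for: &state)
}
print(state.matchedVoucherCount >= reviewThreshold)  // true
```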

Apple's plans were criticized by a wide range of individuals and organizations, including security researchers, the Electronic Frontier Foundation (EFF), politicians, policy groups, university researchers, and even some Apple employees.

Some critics argued that Apple's child safety features could create a "backdoor" into devices, which governments or law enforcement agencies could use to surveil users. Another concern was false positives, including the possibility of someone intentionally adding CSAM imagery to another person's iCloud account to get their account flagged.

Note: Due to the political or social nature of the discussion regarding this topic, the discussion thread is located in our Political News forum. All forum members and site visitors are welcome to read and follow the thread, but posting is limited to forum members with at least 100 posts.


Top Rated Comments

xxray
30 months ago

Let’s hope this gets introduced. Harmful material and the individuals who share it could be held to account.

Those who think Apple will be spying on their photos need to learn how hashing works.
I guess you must be smarter than "security researchers (https://www.macrumors.com/2021/08/05/security-researchers-alarmed-apple-csam-plans/), the Electronic Frontier Foundation (EFF) (https://www.macrumors.com/2021/08/06/snowden-eff-slam-plan-to-scan-messages-images/), politicians (https://www.macrumors.com/2021/08/18/german-politician-letter-tim-cook-csam-scanning/), policy groups (https://www.macrumors.com/2021/08/19/policy-groups-urge-apple-abandon-csam-scanning/), university researchers (https://www.macrumors.com/2021/08/20/university-researchers-csam-dangerous/), and even some Apple employees (https://www.macrumors.com/2021/08/13/apple-employees-concerns-over-csam/)."

You also must have missed this part of the article:


Some critics argued that Apple's child safety features could create a "backdoor" into devices, which governments or law enforcement agencies could use to surveil users. Another concern was false positives, including the possibility of someone intentionally adding CSAM imagery to another person's iCloud account to get their account flagged.
I’m all for protecting children, and anyone in general, from abuse, but invading the privacy of the entire rest of the population to do it isn’t the way to go. You don’t let someone into your house to check for illegal substances or content just because you might have them.
Score: 104 Votes
baryon
30 months ago
Imagine if the Chinese government could one day use this system to check who has the Tank Man image in their cloud storage and deport that person to a murder camp the next day without question.

Apple can only enforce the local law. If the law is different in another country, will it enforce that law for its citizens? Everyone agrees that child abuse is bad. But what if, in Russia, where homosexuality is pretty much a crime, anything labeled "LGBT propaganda aimed at minors", such as an informative book about an LGBT subject, were deemed "child abuse" for political reasons and thus illegal? Would Apple play international judge and pick and choose what it considers right and wrong based on its own morals, or would it strictly abide by the respective laws of each country, even if they go against Apple's initial "good intentions"? What happens when a government puts pressure on Apple to hand over control of this system to them "or else"? Will they do the right thing, or will there come a point where money matters more? (Hint: money eventually always takes priority over morals.)

It sounds good but it gets messy the more questions you ask, which is not a good omen.
Score: 61 Votes
Count Blah
30 months ago

Let’s hope this gets introduced. Harmful material and the individuals who share it could be held to account.

Those who think Apple will be spying on their photos need to learn how hashing works.
CSAM, Tank Man, Winnie the Pooh, pro/anti-Trump (whichever side you find yourself on), etc…

It’s not the ACTUAL subject, it’s the fact that they CAN and are eager to do it. I’m less inclined to be pissed when it’s iCloud, since it’s their storage. But Apple wanted to search our PERSONAL devices. You know that if they are scanning our devices, any despot can knock on Apple’s local office door with armed thugs and order the scanning of anything the despot desires. Apple has already proven it will bend over backwards for the CCP, so it would only be a matter of time.

Screw that and anyone who supports on-device scanning.
Score: 46 Votes
antiprotest
30 months ago

Well, this article is pretty much asking for trouble. It was the topic that almost broke MacRumors the first time round.
Traffic on a website is not trouble but $$$.
Score: 36 Votes
Apple Knowledge Navigator
30 months ago
Well, this article is pretty much asking for trouble. It was the topic that almost broke MacRumors the first time round.
Score: 35 Votes
I7guy
30 months ago
Being silent probably means something is coming down the line.
Score: 34 Votes