Apple Remains Silent About Plans to Detect Known CSAM Stored in iCloud Photos

It has now been over a year since Apple announced plans for three new child safety features, including a system to detect known Child Sexual Abuse Material (CSAM) images stored in iCloud Photos, an option to blur sexually explicit photos in the Messages app, and child exploitation resources for Siri. The latter two features are now available, but Apple remains silent about its plans for the CSAM detection feature.

Apple initially said CSAM detection would be implemented in an update to iOS 15 and iPadOS 15 by the end of 2021, but the company ultimately postponed the feature based on "feedback from customers, advocacy groups, researchers, and others."

In September 2021, Apple posted the following update to its Child Safety page:

Previously we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them and to help limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.

In December 2021, Apple removed the above update and all references to its CSAM detection plans from its Child Safety page, but an Apple spokesperson informed The Verge that Apple's plans for the feature had not changed. To the best of our knowledge, however, Apple has not publicly commented on the plans since that time.

We've reached out to Apple to ask if the feature is still planned. Apple did not immediately respond to a request for comment.

Apple did move forward with implementing its child safety features for the Messages app and Siri with the release of iOS 15.2 and other software updates in December 2021, and it expanded the Messages app feature to Australia, Canada, New Zealand, and the UK with iOS 15.5 and other software releases in May 2022.

Apple said its CSAM detection system was "designed with user privacy in mind." The system would perform "on-device matching using a database of known CSAM image hashes" from child safety organizations, which Apple would transform into an "unreadable set of hashes that is securely stored on users' devices."

Apple planned to report iCloud accounts with known CSAM image hashes to the National Center for Missing and Exploited Children (NCMEC), a non-profit organization that works in collaboration with U.S. law enforcement agencies. Apple said there would be a "threshold" that would ensure "less than a one in one trillion chance per year" of an account being incorrectly flagged by the system, plus a manual review of flagged accounts by a human.
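The on-device matching and threshold scheme described above can be sketched, very loosely, in a few lines. The sketch below is illustrative only: `image_hash`, `KNOWN_HASHES`, `THRESHOLD`, and the helper functions are invented names, and it substitutes an exact cryptographic hash (SHA-256) for the perceptual NeuralHash Apple described, so it would match only byte-identical files rather than visually similar images.

```python
import hashlib

# Hypothetical stand-in for Apple's NeuralHash. A real system would use a
# *perceptual* hash so near-identical images still match; SHA-256 is used
# here only to keep the sketch self-contained.
def image_hash(image_bytes: bytes) -> str:
    return hashlib.sha256(image_bytes).hexdigest()

# Assumed database of known-image hashes (illustrative values only). Apple
# said the real database would be transformed into an unreadable hash set
# stored on the device.
KNOWN_HASHES = {image_hash(b"known-image-1"), image_hash(b"known-image-2")}

# Illustrative match threshold; only accounts at or above it are flagged.
THRESHOLD = 30

def count_matches(photos: list[bytes]) -> int:
    """On-device matching: count photos whose hash is in the database."""
    return sum(1 for p in photos if image_hash(p) in KNOWN_HASHES)

def should_flag(photos: list[bytes]) -> bool:
    # Below the threshold nothing is reported; in Apple's actual design,
    # threshold secret sharing prevented the server from learning about
    # individual matches, which this sketch does not attempt to model.
    return count_matches(photos) >= THRESHOLD
```

The threshold is what Apple said would keep the false-flag rate below one in one trillion accounts per year: a single spurious hash collision never surfaces, only an accumulation of matches does, and flagged accounts would still go through human review.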

Apple's plans were criticized by a wide range of individuals and organizations, including security researchers, the Electronic Frontier Foundation (EFF), politicians, policy groups, university researchers, and even some Apple employees.

Some critics argued that Apple's child safety features could create a "backdoor" into devices, which governments or law enforcement agencies could use to surveil users. Another concern was false positives, including the possibility of someone intentionally adding CSAM imagery to another person's iCloud account to get their account flagged.

Note: Due to the political or social nature of the discussion regarding this topic, the discussion thread is located in our Political News forum. All forum members and site visitors are welcome to read and follow the thread, but posting is limited to forum members with at least 100 posts.


Top Rated Comments

xxray Avatar
32 months ago

Let’s hope this gets introduced. Harmful material and the individuals who share it could be held to account.

Those who think Apple will be spying on their photos need to learn how hashing works.
I guess you must be smarter than “security researchers (https://www.macrumors.com/2021/08/05/security-researchers-alarmed-apple-csam-plans/), the Electronic Frontier Foundation (EFF) (https://www.macrumors.com/2021/08/06/snowden-eff-slam-plan-to-scan-messages-images/), politicians (https://www.macrumors.com/2021/08/18/german-politician-letter-tim-cook-csam-scanning/), policy groups (https://www.macrumors.com/2021/08/19/policy-groups-urge-apple-abandon-csam-scanning/), university researchers (https://www.macrumors.com/2021/08/20/university-researchers-csam-dangerous/), and even some Apple employees (https://www.macrumors.com/2021/08/13/apple-employees-concerns-over-csam/).”

You also must have missed this part of the article:


Some critics argued that Apple's child safety features could create a "backdoor" into devices, which governments or law enforcement agencies could use to surveil users. Another concern was false positives, including the possibility of someone intentionally adding CSAM imagery to another person's iCloud account to get their account flagged.
I’m all for protecting children and anyone in general from abuse, but invading the privacy of the entire rest of the population to do it isn’t the way to go. You don’t let someone into your house to check for illegal substances or content just because you might have them.
Score: 104 Votes (Like | Disagree)
baryon Avatar
32 months ago
Imagine if the Chinese government could one day use this system to check who has the Tank Man image in their cloud storage and deport that person to a murder camp the next day without question.

Apple can only enforce the local law. If the law is different in a different country, will it enforce that for its citizens? Say, everyone agrees that child abuse is bad. But what if in Russia, where homosexuality is pretty much a crime, anything labeled "LGBT propaganda aimed at minors" such as an informative book about an LGBT subject would be called "child abuse" for political reasons, and thus be illegal. Would Apple play international judge and pick and choose what it considers right and wrong based on its own morals, or would it strictly abide by the respective laws of each country, even if they go against Apple's initial "good intentions"? What happens when a government puts pressure on Apple to hand over control of this system to them "or else"? Will they do the right thing or will there come a point where money will matter more? (Hint: money eventually always takes priority over morals).

It sounds good but it gets messy the more questions you ask, which is not a good omen.
Score: 61 Votes (Like | Disagree)
Count Blah Avatar
32 months ago

Let’s hope this gets introduced. Harmful material and the individuals who share it could be held to account.

Those who think Apple will be spying on their photos need to learn how hashing works.
CSAM, Tank Man, Winnie the Pooh, pro/anti-Trump (whichever side you find yourself on), etc…

It’s not the ACTUAL subject, it’s the fact that they CAN and are eager to do it. I’m less inclined to be pissed when it’s iCloud, since it’s their storage. But Apple wanted to search our PERSONAL device. You know if they are scanning our devices, any despot can knock on Apple’s local office door, with many armed thugs and order the scanning of anything the despot desires. Apple has proven to bend over backwards to the CCP already, so it would only be a matter of time.

Screw that and anyone who supports on-device scanning.
Score: 46 Votes (Like | Disagree)
antiprotest Avatar
32 months ago

Well, this article is pretty much asking for trouble. It was the topic that almost broke Macrumors first time round.
Traffic on a web site is not trouble but $$$.
Score: 36 Votes (Like | Disagree)
Apple Knowledge Navigator Avatar
32 months ago
Well, this article is pretty much asking for trouble. It was the topic that almost broke Macrumors first time round.
Score: 35 Votes (Like | Disagree)
I7guy Avatar
32 months ago
Being silent probably means something is coming down the line.
Score: 34 Votes (Like | Disagree)