Craig Federighi Acknowledges Confusion Around Apple Child Safety Features and Explains New Details About Safeguards

Apple's senior vice president of software engineering, Craig Federighi, has today defended the company's controversial planned child safety features in a significant interview with The Wall Street Journal, revealing a number of new details about the safeguards built into Apple's system for scanning users' photo libraries for Child Sexual Abuse Material (CSAM).

Federighi admitted that Apple had poorly handled last week's announcement of the two new features, which are designed to detect sexually explicit content in Messages for children and CSAM content stored in iCloud Photos libraries, and acknowledged the widespread confusion around the tools:

It's really clear a lot of messages got jumbled pretty badly in terms of how things were understood. We wish that this would've come out a little more clearly for everyone because we feel very positive and strongly about what we're doing.

[...]

In hindsight, introducing these two features at the same time was a recipe for this kind of confusion. By releasing them at the same time, people technically connected them and got very scared: what's happening with my messages? The answer is...nothing is happening with your messages.

The Communication Safety feature means that if children send or receive explicit images via iMessage, they will be warned before viewing them, the images will be blurred, and there will be an option for their parents to be alerted. CSAM scanning, on the other hand, attempts to match users' photos against hashes of known CSAM images before they are uploaded to iCloud Photos. Accounts that have had CSAM detected will then be subject to a manual review by Apple and may be reported to the National Center for Missing and Exploited Children (NCMEC).
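To make that flow concrete, here is a minimal Swift sketch of the kind of on-device matching described above, checking a photo against a set of known hashes before upload. The knownHashes set and the use of SHA-256 are illustrative assumptions; Apple's actual system relies on a perceptual NeuralHash and encrypted safety vouchers rather than a plain hash comparison.

```swift
import Foundation
import CryptoKit

// Minimal sketch of on-device matching before upload, assuming a hypothetical
// `knownHashes` set distributed with the OS. Apple's real system uses a
// perceptual NeuralHash and encrypted safety vouchers, not plain SHA-256.
struct UploadScreener {
    let knownHashes: Set<Data>  // hashes of known CSAM images (hypothetical format)

    /// Returns true if the photo's hash matches an entry in the known-hash set.
    func matchesKnownImage(_ photoData: Data) -> Bool {
        let digest = SHA256.hash(data: photoData)
        return knownHashes.contains(Data(digest))
    }
}
```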

The new features have drawn a large amount of criticism from users, security researchers, the Electronic Frontier Foundation (EFF), Edward Snowden, and Facebook's former security chief, as well as from Apple's own employees.

Amid these criticisms, Federighi addressed one of the main areas of concern, emphasizing that Apple's system will be protected against exploitation by governments or other third parties with "multiple levels of auditability."


Federighi also revealed a number of new details about the system's safeguards, such as the fact that a user will need to accumulate around 30 matches for known CSAM content in their Photos library before Apple is alerted, at which point it will confirm whether those images are genuine instances of CSAM.

If and only if you meet a threshold of something on the order of 30 known child pornographic images matching, only then does Apple know anything about your account and know anything about those images, and at that point, only knows about those images, not about any of your other images. This isn't doing some analysis for did you have a picture of your child in the bathtub? Or, for that matter, did you have a picture of some pornography of any other sort? This is literally only matching on the exact fingerprints of specific known child pornographic images.
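In other words, individual matches reveal nothing on their own; only crossing the threshold makes an account visible to Apple for review. The Swift sketch below shows that gate conceptually, using a hypothetical per-account matchCount; in Apple's actual design the threshold is enforced cryptographically through threshold secret sharing, so nothing can be decrypted below it.

```swift
// Simplified sketch of the threshold gate Federighi describes. The plain
// `matchCount` counter is hypothetical; Apple enforces the threshold
// cryptographically rather than with a server-side tally.
struct MatchThresholdGate {
    let threshold = 30  // "on the order of 30" known-CSAM matches, per the interview

    /// Apple learns nothing about an account, and no human review occurs,
    /// until the match count crosses the threshold.
    func accountIsReviewable(matchCount: Int) -> Bool {
        matchCount >= threshold
    }
}
```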

He also pointed out the security advantage of having the matching process run directly on the iPhone, rather than on iCloud's servers.

Because it's on the [phone], security researchers are constantly able to introspect what’s happening in Apple’s [phone] software. So if any changes were made that were to expand the scope of this in some way —in a way that we had committed to not doing—there's verifiability, they can spot that that's happening.

When asked if the database of images used to match CSAM content on users' devices could be compromised by having other materials inserted, such as political content in certain regions, Federighi explained that the database is constructed from known CSAM images from multiple child safety organizations, with at least two being "in distinct jurisdictions," to protect against abuse of the system.

These child protection organizations, as well as an independent auditor, will be able to verify that the database of images only consists of content from those entities, according to Federighi.
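Conceptually, that safeguard means a hash only ships to devices if at least two independent organizations in different jurisdictions supplied it. The rough Swift sketch below illustrates such an intersection rule; the inputs and types are hypothetical, as Apple has not published an API for how the database is assembled.

```swift
import Foundation

// Rough sketch of the multi-jurisdiction safeguard: a hash is included in the
// on-device database only if at least two independent child safety
// organizations contributed it. Inputs and types here are hypothetical.
func buildOnDeviceHashDatabase(providerDatabases: [Set<Data>]) -> Set<Data> {
    var counts: [Data: Int] = [:]
    for database in providerDatabases {
        for hash in database {
            counts[hash, default: 0] += 1
        }
    }
    // Keep only hashes contributed by two or more providers.
    return Set(counts.filter { $0.value >= 2 }.keys)
}
```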

Federighi's interview is among the biggest PR pushbacks from Apple so far following the mixed public response to the announcement of the child safety features, but the company has also repeatedly attempted to quell users' concerns, publishing an FAQ and answering questions in interviews with the media.

Top Rated Comments

thadoggfather
46 months ago
It is confusing... but they are gaslighting us into thinking it is universal confusion when there is a large subset of people with clear understanding coupled with dissent
Score: 127 Votes
AndiG
46 months ago
Federighi just doesn't understand a simple fact. If you don't need a local scanning system - don't build one.
Score: 112 Votes
xxray
46 months ago

Because it's on the [phone], security researchers are constantly able to introspect what’s happening in Apple’s [phone] software. So if any changes were made that were to expand the scope of this in some way —in a way that we had committed to not doing—there's verifiability, they can spot that that's happening.
How does this change the fact at all that there’s now essentially a new backdoor to be abused that’s installed in iOS 15?

Stop defending and get rid of this BS, Apple.
Score: 107 Votes
6787872
46 months ago
"but think of the children" has been used for decades now to erode privacy.

people seem to think if you are against it, you support it. they know exactly what they are doing.
Score: 97 Votes
JPSaltzman
46 months ago
The more you have to "explain" over and over some new controversial "feature", the more it begins to stink.

(And I don't even use iCloud to store anything important -- especially photos!)
Score: 88 Votes
Mac Fly (film)
46 months ago
So someone who looks at child porn photos stops using iCloud Photos. What about the rest of us who want privacy? What about future governmental interference?
Score: 81 Votes