Apple Privacy Chief Explains 'Built-in' Privacy Protections in Child Safety Features Amid User Concerns

In an interview with TechCrunch, Apple's Head of Privacy, Erik Neuenschwander, has responded to some of the concerns users have raised about the company's plans for new child safety features that will scan messages and Photos libraries.

When asked why Apple is only choosing to implement child safety features that scan for Child Sexual Abuse Material (CSAM), Neuenschwander explained that Apple has "now got the technology that can balance strong child safety and user privacy," giving the company "a new ability to identify accounts which are starting collections of known CSAM."

Neuenschwander was asked if, in retrospect, announcing the Communication Safety features in Messages and the CSAM detection system in iCloud Photos together was the right decision, to which he responded:

Well, while they are [two] systems they are also of a piece along with our increased interventions that will be coming in Siri and search. As important as it is to identify collections of known CSAM where they are stored in Apple's iCloud Photos service, it's also important to try to get upstream of that already horrible situation.

When asked if Apple was trying to demonstrate to governments and agencies around the world that it is possible to scan for illicit content while preserving user privacy, Neuenschwander explained:

Now, why to do it is because, as you said, this is something that will provide that detection capability while preserving user privacy. We're motivated by the need to do more for child safety across the digital ecosystem, and all three of our features, I think, take very positive steps in that direction. At the same time we're going to leave privacy undisturbed for everyone not engaged in the illegal activity.

He was also asked whether Apple had created a framework that could be used by law enforcement to scan for other kinds of content in users' libraries, and whether it undermines Apple's commitment to end-to-end encryption, to which he responded:

It doesn't change that one iota. The device is still encrypted, we still don't hold the key, and the system is designed to function on on-device data... The alternative of just processing by going through and trying to evaluate users' data on a server is actually more amenable to changes [without user knowledge], and less protective of user privacy... It's those sorts of systems that I think are more troubling when it comes to the privacy properties — or how they could be changed without any user insight or knowledge to do things other than what they were designed to do.

Neuenschwander was then asked whether Apple could be forced to comply with laws outside the United States that might require it to add non-CSAM material to the database and check for it on-device, to which he explained that there are a "number of protections built-in" to the service.

The hash list is built into the operating system, we have one global operating system and don't have the ability to target updates to individual users and so hash lists will be shared by all users when the system is enabled.

And secondly, the system requires the threshold of images to be exceeded so trying to seek out even a single image from a person's device or set of people's devices won't work because the system simply does not provide any knowledge to Apple for single photos stored in our service.

And then, thirdly, the system has built into it a stage of manual review where, if an account is flagged with a collection of illegal CSAM material, an Apple team will review that to make sure that it is a correct match of illegal CSAM material prior to making any referral to any external entity.

And so the hypothetical requires jumping over a lot of hoops, including having Apple change its internal process to refer material that is not illegal, like known CSAM and that we don't believe that there's a basis on which people will be able to make that request in the U.S. And the last point that I would just add is that it does still preserve user choice, if a user does not like this kind of functionality, they can choose not to use iCloud Photos and if iCloud Photos is not enabled, no part of the system is functional.

Neuenschwander continued that for users who are "not into this illegal behavior," Apple "gain[s] no additional knowledge about any user's cloud library" and "it leaves privacy completely undisturbed."
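
To make the threshold behavior he describes more concrete, here is a minimal, hypothetical Swift sketch of that logic. It is only an illustration, not Apple's implementation: the real system uses NeuralHash perceptual hashes and cryptographic threshold secret sharing so that nothing is learned about accounts below the threshold, whereas this toy version simply counts exact matches against a fixed list. The SafetyVoucherScanner type, the hash strings, and the threshold value are all invented for this example.

import Foundation

// Toy model of the protections described above: a hash list baked into the OS,
// a match threshold, and no output at all for accounts below that threshold.
struct SafetyVoucherScanner {
    // Hash list shipped with the (single, global) operating system image.
    let knownHashes: Set<String>
    // Number of matches that must be exceeded before anything is surfaced for review.
    let threshold: Int

    // Returns the matching hashes only if the account crosses the threshold;
    // otherwise returns nil, modeling "no knowledge for single photos."
    func accountFlaggedForReview(uploadedPhotoHashes: [String]) -> [String]? {
        let matches = uploadedPhotoHashes.filter { knownHashes.contains($0) }
        return matches.count > threshold ? matches : nil
    }
}

// Hypothetical usage: hash strings and threshold are made up for illustration.
let scanner = SafetyVoucherScanner(knownHashes: ["h1", "h2", "h3"], threshold: 2)
if let matches = scanner.accountFlaggedForReview(uploadedPhotoHashes: ["h1", "h9"]) {
    print("Escalate to human review:", matches) // not reached in this example
} else {
    print("Below threshold: nothing is revealed about this library")
}

The design point mirrored here is that a below-threshold account produces no output at all, rather than a partial match count.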

See TechCrunch's full interview with Neuenschwander for more information.


Top Rated Comments

jimbobb24
47 months ago
In short: "If you're not breaking the law, you have nothing to fear."

I sure am glad governments never change laws, have poorly defined laws, arbitrary enforcement, and executive orders/mandates, etc., that might change my status as a law-abiding citizen at any moment.

Obviously this power could never be abused. Thank goodness. Go get those bad people with the pictures while the rest of us rest easy knowing they are not after us.
Score: 75 Votes
LeeW
47 months ago

"If you're not breaking the law, you have nothing to fear"
The worst argument ever when it comes to privacy.
Score: 62 Votes
Mebsat
47 months ago
As he states, it is clear that Apple will tolerate a single CSAM file within an iCloud Photos account. They designed it to do so. So what is the point of this? That fact alone gives law enforcement a battering ram to demand access to iCloud Photos. This feature does not preclude that there is CSAM stored in iCloud Photos. All Apple can claim is there is less CSAM in iCloud Photos.

If PR approved this disaster, firings must commence.
Score: 30 Votes
TheYayAreaLiving
47 months ago
Bottom line: they are scanning your iPhone, whatever you store in iCloud.
Score: 25 Votes
Cosmosent
47 months ago
Regardless of how Apple tries to spin it, the chances of an iOS 15 boycott are now real!

I expect the iPhone 13 family to come pre-loaded with iOS 14.8.
Score: 24 Votes
Jonas07
47 months ago
A bunch of ******** to hide the fact that they will scan your photos and messages. You have to be stupid to believe it will only be for children between 0 and 12 years old.
Score: 24 Votes