Security Researchers Express Alarm Over Apple's Plans to Scan iCloud Images, But Practice Already Widespread

Apple today announced that with the launch of iOS 15 and iPadOS 15, it will begin scanning iCloud Photos in the U.S. to look for known Child Sexual Abuse Material (CSAM), with plans to report the findings to the National Center for Missing and Exploited Children (NCMEC).

News of the CSAM initiative leaked before Apple detailed its plans, and security researchers have already begun expressing concerns about how Apple's new image-scanning protocol could be used in the future, as noted by the Financial Times.

Apple is using a "NeuralHash" system to compare hashes of photos on a user's iPhone against hashes of known CSAM images before the photos are uploaded to iCloud. If there is a match, the photograph is uploaded with a cryptographic safety voucher, and once a certain threshold of matches is reached, a review is triggered to check whether the person has CSAM on their devices.
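NeuralHash itself is proprietary and Apple has not published its internals, so the sketch below illustrates only the general matching flow described above. The hash function (SHA-256 standing in for a perceptual hash), the threshold value, and the voucher format are all placeholders, not Apple's actual implementation:

```python
import hashlib

# Placeholder: NeuralHash is a proprietary perceptual hash. SHA-256 is
# used here purely to stand in for "some hash of the image bytes" so the
# matching flow can be shown; it is NOT how NeuralHash works.
def image_hash(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

# Placeholder threshold: Apple has not said how many matches trigger a review.
REVIEW_THRESHOLD = 3

def scan_before_upload(images, known_hashes, threshold=REVIEW_THRESHOLD):
    """Hash each image, attach a (toy) safety voucher recording whether it
    matched the known-hash set, and report whether the number of matches
    reaches the review threshold."""
    vouchers = []
    matches = 0
    for data in images:
        h = image_hash(data)
        matched = h in known_hashes
        matches += matched
        vouchers.append({"hash": h, "matched": matched})
    return vouchers, matches >= threshold
```

The key property being illustrated is that only hashes, not image contents, are compared, and that no single match triggers anything on its own: the review fires only once the match count crosses the threshold.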

For now, Apple is using its image scanning and matching technology only to look for CSAM, but researchers worry that in the future it could be adapted to scan for other kinds of imagery, such as anti-government signs at protests.

In a series of tweets, Johns Hopkins cryptography researcher Matthew Green said that CSAM scanning is a "really bad idea" because in the future, it could expand to scanning end-to-end encrypted photos rather than just content that's uploaded to ‌iCloud‌. For children, Apple is implementing a separate scanning feature that looks for sexually explicit content directly in iMessages, which are end-to-end encrypted.

Green also raised concerns about the hashes that Apple plans to use, because of the potential for "collisions," where someone sends a harmless file that shares a hash with a CSAM image, resulting in a false flag.
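To make the collision concern concrete: any hash that maps a large input space to a fixed-size output must, by the pigeonhole principle, map some distinct inputs to the same value. The toy example below (my own illustration, unrelated to NeuralHash) truncates SHA-256 to 12 bits so a collision can be found by brute force almost instantly:

```python
import hashlib
from itertools import count

def tiny_hash(data: bytes, bits: int = 12) -> int:
    # Deliberately truncate SHA-256 to a tiny number of bits so that
    # collisions are easy to find; real hashes use far more bits.
    digest = hashlib.sha256(data).digest()
    return int.from_bytes(digest, "big") >> (256 - bits)

def find_collision(bits: int = 12):
    """Return two distinct byte strings with the same truncated hash."""
    seen = {}
    for i in count():
        data = str(i).encode()
        h = tiny_hash(data, bits)
        if h in seen:
            return seen[h], data
        seen[h] = data
```

Real hashes make accidental collisions astronomically unlikely, but perceptual hashes like NeuralHash are deliberately tolerant of small image changes so that resized or recompressed copies still match, which is why researchers consider engineered collisions a more realistic risk for them than for cryptographic hashes.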

Apple, for its part, says that its scanning technology has an "extremely high level of accuracy" to make sure accounts are not incorrectly flagged, and that reports are manually reviewed before a person's ‌iCloud‌ account is disabled and a report is sent to NCMEC.

Green believes that Apple's implementation will push other tech companies to adopt similar techniques. "This will break the dam," he wrote. "Governments will demand it from everyone." He compared the technology to "tools that repressive regimes have deployed."


Security researcher Alec Muffett, who formerly worked at Facebook, said that Apple's decision to implement this kind of image scanning was a "huge and regressive step for individual privacy." "Apple are walking back privacy to enable 1984," he said.

Ross Anderson, professor of security engineering at the University of Cambridge, called it an "absolutely appalling idea" that could lead to "distributed bulk surveillance" of devices.

As many have pointed out on Twitter, multiple tech companies already do image scanning for CSAM. Google, Twitter, Microsoft, Facebook, and others use image hashing methods to look for and report known images of child abuse.


It's also worth noting that Apple was already scanning some content for child abuse images prior to the rollout of the new CSAM initiative. In 2020, Apple chief privacy officer Jane Horvath said that Apple uses screening technology to look for illegal images and disables accounts if evidence of CSAM is detected.

Apple in 2019 updated its privacy policies to note that it would scan uploaded content for "potentially illegal content, including child sexual exploitation material," so today's announcements are not entirely new.

Note: Due to the political or social nature of the discussion regarding this topic, the discussion thread is located in our Political News forum. All forum members and site visitors are welcome to read and follow the thread, but posting is limited to forum members with at least 100 posts.


Top Rated Comments

macrumorsuser10 · 46 months ago
Apple should add scanning for:

1. Photos of the confederate flag.
2. Photos of people not wearing Covid masks.
3. Photos of Chinese people disrespecting the Chinese government.
4. Photos of Middle eastern women not wearing burkas.
5. Photos of a group of people with too many whites, not enough blacks.
Score: 74 Votes
Bandaman · 46 months ago

> If you're not doing anything wrong, then you have nothing to worry about.

This is always the de facto standard for terrible replies to privacy.
Score: 74 Votes
cloudyo · 46 months ago

> If you're not doing anything wrong, then you have nothing to worry about.

You should let law enforcement install cameras in your home then, so they can make sure you are not doing anything illegal while you take a shower, for example. After all, you have nothing to hide, do you?
Score: 57 Votes
Bawstun · 46 months ago

> If you're not doing anything wrong, then you have nothing to worry about.

This simply isn't true. As the article notes, the technology can easily be changed to other things in the future — what if they scanned for BLM supporter images or anti-government images? What if they wanted to scan and track certain political parties?

It's not about child sex material, everyone agrees that that is wrong, it's about passing over more and more of our rights to Big Tech. Give them an inch and they'll take a foot.
Score: 51 Votes
contacos · 46 months ago

> If you're not doing anything wrong, then you have nothing to worry about.

Depends on the definition of "wrong". Sometimes it is up to self-serving definitions of dictators.
Score: 50 Votes
jarman92 · 46 months ago
"Other companies already participate in this outrageous invasion of privacy" is not nearly the defense of Apple these people seem to think it is.
Score: 48 Votes