Apple Says NeuralHash Tech Impacted by 'Hash Collisions' Is Not the Version Used for CSAM Detection

Developer Asuhariet Yvgar this morning said that he had reverse-engineered the NeuralHash algorithm that Apple is using to detect Child Sexual Abuse Material (CSAM) in iCloud Photos, posting evidence on GitHub and details on Reddit.

Yvgar said that he reverse-engineered the NeuralHash algorithm from iOS 14.3, where the code was hidden, and he rebuilt it in Python. After he uploaded his findings, another user was able to create a collision, an issue where two non-matching images share the same hash. Security researchers have warned about this possibility because the potential for collisions could allow the CSAM system to be exploited.
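
To make the idea concrete, here is a minimal Python sketch of a generic perceptual hash (a simple "average hash"), offered as an illustration only; it is not Apple's NeuralHash or Yvgar's reconstruction of it. A collision in the sense described above would be two visually different images that nonetheless produce the same hash value:

    # Toy perceptual hash for illustration; NOT Apple's NeuralHash.
    from PIL import Image

    def toy_average_hash(path, size=8):
        # Downscale to a size x size grayscale thumbnail and threshold each
        # pixel against the mean brightness, producing a 64-bit integer hash.
        img = Image.open(path).convert("L").resize((size, size))
        pixels = list(img.getdata())
        mean = sum(pixels) / len(pixels)
        bits = "".join("1" if p > mean else "0" for p in pixels)
        return int(bits, 2)

    # Hypothetical usage: two different files that compare equal would be a collision.
    # if toy_average_hash("image_a.png") == toy_average_hash("image_b.png"):
    #     print("collision: different images, same hash")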

In a statement to Motherboard, Apple said that the version of NeuralHash that Yvgar reverse-engineered is a generic version, and not the final implementation that will be used for iCloud Photos CSAM detection. Apple also said that it made the algorithm publicly available for security researchers to verify, and that a second, private server-side algorithm verifies a CSAM match after the threshold is exceeded, along with human verification.

"The NeuralHash algorithm [... is] included as part of the code of the signed operating system [and] security researchers can verify that it behaves as described," one of Apple's pieces of documentation reads. Apple also said that after a user passes the 30 match threshold, a second non-public algorithm that runs on Apple's servers will check the results.

Matthew Green, who teaches cryptography at Johns Hopkins University and who has been a vocal critic of Apple's CSAM system, told Motherboard that if collisions "exist for this function," then he expects "they'll exist in the system Apple eventually activates."

"Of course, it's possible that they will re-spin the hash function before they deploy," he said. "But as a proof of concept, this is definitely valid," he said of the information shared on GitHub.

Because of the human review element, though, another researcher, Nicholas Weaver, told Motherboard that the most people could achieve by manipulating non-CSAM images into matching hashes is to "annoy Apple's response team with garbage images until they implement a filter" to get rid of false positives. Actually fooling Apple's system would also require access to the hashes provided by NCMEC and the production of over 30 colliding images, and even then the result would not get past the human oversight.

Apple is using its NeuralHash system to match images on user devices against a database of image hashes provided by organizations like the National Center for Missing & Exploited Children (NCMEC) in order to search for CSAM. The system is designed to produce exact matches, and Apple says there is a one in a trillion chance that an iCloud account can be accidentally flagged.
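
Conceptually, the matching step amounts to checking hashes of on-device photos against the database of known hashes. The sketch below uses a plain set-membership check with made-up hash values; the actual system wraps this comparison in a private set intersection protocol so that non-matching results are not revealed:

    # Conceptual sketch only: plain set membership with made-up hash values.
    known_csam_hashes = {"9f2c41aa", "d41d8cd9", "7b1a33f0"}    # hypothetical database entries
    device_photo_hashes = ["0a0a0a0a", "d41d8cd9", "c0ffee00"]  # hypothetical device-side hashes

    matches = [h for h in device_photo_hashes if h in known_csam_hashes]
    print(f"{len(matches)} match(es) against the database")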

Apple is planning to implement the NeuralHash CSAM system in iOS and iPadOS 15 as part of a suite of child safety features. The decision has been hugely controversial, with Apple receiving criticism from customers and privacy advocates. Apple has been attempting to reassure customers and security researchers about the implementation of the system with additional documentation and executive interviews.

Top Rated Comments

locovaca
43 months ago
Make it all public. It’s for the children, right? What do you have to hide, Apple?
Score: 63 Votes
miniyou64
43 months ago
People giving Apple the benefit of the doubt here are making a tremendous amount of assumptions. This kind of tech never remains only for its intended use. No matter which way you spin it (for the children!) this is invasive. Someone on Twitter mentioned what happens if someone airdrops you a bunch of illicit photos and they sync to iCloud in a matter of seconds? Boom, you’re flagged. There’s 1,000,000 ways for this system to go wrong or be exploited or worse ruin innocent people’s lives. And if you do end up being one of those people, you will have exactly zero recourse to prove your innocence. It’s over for you. This entire thing is very stupid on Apple’s part.
Score: 58 Votes
zakarhino
43 months ago
*amateur devs exploit the system within a few hours of discovery*

Apple: "Uhhh guys, this is totally not the finished algorithm! Believe us!"
Score: 47 Votes
dguisinger
43 months ago
As Rene Ritchie says on MacBreak Weekly, Apple keeps talking down to us as if we don't understand, and our response is "You don't understand, we understand and do not like this."
Score: 42 Votes
nawk
43 months ago
Wait - I’m confused by this. Does this mean that everyone’s iCloud is going to be scanned without users’ authorization in the name of child welfare?

While I am sure people may agree with this, it seems like one step away from doctors/dentists submitting DNA samples of *every* patient because it is in the public interest.

This program seems just a small moral slip away from being an invasion of privacy on a monumental scale. Given what Snowden revealed, the US government has a huge thirst for data collection like this. It’s a short hop to scan for compromising photos of your political rivals, yes?
Score: 42 Votes
LDN
43 months ago

Make it all public. It’s for the children, right? What do you have to hide, Apple?
Yeah, this is increasingly sounding like it’s not meant for children. Not forgetting the fact that actual pedos, who are likely very small in number, will either just turn off iCloud Photos or get Android phones. Normal people are left stuck with the bill. Apple are going to have to pull this whole thing.
Score: 39 Votes