Apple's Machine Learning Has Cut Siri's Error Rate by a Factor of Two

Steven Levy has published an in-depth article about Apple's artificial intelligence and machine learning efforts, after meeting with senior executives Craig Federighi, Eddy Cue, Phil Schiller, and two Siri scientists at the company's headquarters.

Apple provided Levy with a closer look at how machine learning is deeply integrated into Apple software and services, led by Siri, which the article reveals has been powered by a neural-net-based system since 2014. Apple said the backend change greatly improved the personal assistant's accuracy.

"This was one of those things where the jump was so significant that you do the test again to make sure that somebody didn’t drop a decimal place," says Eddy Cue, Apple’s senior vice president of internet software and services.

Alex Acero, who leads the Siri speech team at Apple, said Siri's error rate has been lowered by more than a factor of two in many cases.

“The error rate has been cut by a factor of two in all the languages, more than a factor of two in many cases,” says Acero. “That’s mostly due to deep learning and the way we have optimized it — not just the algorithm itself but in the context of the whole end-to-end product.”
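The article doesn't say which metric Acero is referring to, but speech recognizers are conventionally scored by word error rate (WER): the word-level edit distance between what was said and what was transcribed, divided by the length of the reference. A halved error rate means, for example, going from one mistake in ten words to one in twenty. A minimal sketch of the metric:

```python
def word_error_rate(reference: str, hypothesis: str) -> float:
    """WER = word-level edit distance / number of reference words."""
    ref, hyp = reference.split(), hypothesis.split()
    # Standard dynamic-programming edit distance over words.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,         # deletion
                          d[i][j - 1] + 1,         # insertion
                          d[i - 1][j - 1] + cost)  # substitution
    return d[len(ref)][len(hyp)] / len(ref)

# One substitution ("timer" -> "time") over a six-word reference.
print(word_error_rate("set a timer for ten minutes",
                      "set a time for ten minutes"))
```

The example phrases and the choice of WER are illustrative assumptions, not details from the article.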

Acero told Levy he was able to work directly with Apple's silicon design team and the engineers who write the firmware for iOS devices to maximize performance of the neural network, and Federighi added that Apple building both hardware and software gives it an "incredible advantage" in the space.

"It's not just the silicon," adds Federighi. "It's how many microphones we put on the device, where we place the microphones. How we tune the hardware and those mics and the software stack that does the audio processing. It's all of those pieces in concert. It's an incredible advantage versus those who have to build some software and then just see what happens."

Apple's machine learning efforts extend far beyond Siri, as evidenced by several examples shared by Levy:

You see it when the phone identifies a caller who isn’t in your contact list (but did email you recently). Or when you swipe on your screen to get a shortlist of the apps that you are most likely to open next. Or when you get a reminder of an appointment that you never got around to putting into your calendar. Or when a map location pops up for the hotel you’ve reserved, before you type it in. Or when the phone points you to where you parked your car, even though you never asked it to. These are all techniques either made possible or greatly enhanced by Apple’s adoption of deep learning and neural nets.

Another feature made possible by machine learning is the Apple Pencil's palm rejection, which lets the screen distinguish between a swipe, a touch, and pencil input:

In order for Apple to include its version of a high-tech stylus, it had to deal with the fact that when people wrote on the device, the bottom of their hand would invariably brush the touch screen, causing all sorts of digital havoc. Using a machine learning model for “palm rejection” enabled the screen sensor to detect the difference between a swipe, a touch, and a pencil input with a very high degree of accuracy. “If this doesn’t work rock solid, this is not a good piece of paper for me to write on anymore — and Pencil is not a good product,” says Federighi. If you love your Pencil, thank machine learning.
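Apple's palm rejection is a trained machine learning model, but the underlying idea can be pictured with a hand-rolled classifier over per-contact features. The feature names and thresholds below are invented for illustration only:

```python
# Hypothetical sketch of the idea behind palm rejection: label each touch
# contact from simple geometric features, and ignore contacts labeled "palm".
# Apple's real system is a learned model, not fixed thresholds like these.

def classify_contact(contact_area_mm2: float, velocity_mm_s: float) -> str:
    """Label a touch contact as 'palm', 'pencil', 'swipe', or 'touch'."""
    if contact_area_mm2 > 150.0:   # a resting hand covers a large area
        return "palm"              # -> the screen discards this contact
    if contact_area_mm2 < 5.0:     # a Pencil tip is far smaller than a finger
        return "pencil"
    if velocity_mm_s > 50.0:       # fast-moving finger-sized contact
        return "swipe"
    return "touch"                 # slow, finger-sized contact

print(classify_contact(200.0, 2.0))   # palm
print(classify_contact(2.0, 10.0))    # pencil
```

In practice a learned model replaces the thresholds, which is what lets the classification reach the "very high degree of accuracy" Federighi describes.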

On the iPhone, machine learning is enabled by a localized dynamic cache or "knowledge base" that Apple says is around 200MB in size, depending on how much personal information is stored.

This includes information about app usage, interactions with other people, neural net processing, a speech modeler, and "natural language event modeling." It also has data used for the neural nets that power object recognition, face recognition, and scene classification.

"It's a compact, but quite thorough knowledge base, with hundreds of thousands of locations and entities. We localize it because we know where you are," says Federighi. This knowledge base is tapped by all of Apple's apps, including the Spotlight search app, Maps, and Safari. It helps on auto-correct. "And it's working continuously in the background," he says.
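The shape of such a local store can be sketched as a small on-device entity index that apps like Spotlight query. The schema and API below are invented for illustration; Apple's actual cache is opaque:

```python
# Minimal sketch of an on-device "knowledge base": a local store of named
# entities that system apps can query without any network round trip.
# Entity names, fields, and the prefix-match API are illustrative assumptions.

class LocalKnowledgeBase:
    def __init__(self):
        self._entities = {}  # name -> metadata dict, kept entirely on device

    def add(self, name: str, **metadata) -> None:
        self._entities[name] = metadata

    def lookup(self, query: str):
        """Case-insensitive prefix match, the kind of query Spotlight or
        auto-correct might issue."""
        q = query.lower()
        return [n for n in self._entities if n.lower().startswith(q)]

kb = LocalKnowledgeBase()
kb.add("Caffe Macs", kind="restaurant", city="Cupertino")
kb.add("Calendar", kind="app")
print(kb.lookup("ca"))  # ['Caffe Macs', 'Calendar']
```

Because everything lives in one local structure, a lookup never has to leave the phone, which is the privacy point Federighi returns to below.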

Machine learning also extends to the standard QuickType keyboard, where a neural network-trained system watches the words iPhone users type:

Other information Apple stores on devices includes probably the most personal data that Apple captures: the words people type using the standard iPhone QuickType keyboard. By using a neural network-trained system that watches while you type, Apple can detect key events and items like flight information, contacts, and appointments — but the information itself stays on your phone.
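Apple's detector is described as neural-net based; as a stand-in, the same kind of on-device extraction can be sketched with regular expressions that pull items like flight numbers and times out of typed text. The patterns and event names here are illustrative assumptions, not Apple's implementation:

```python
# Hypothetical on-device data detector: scan typed text for flight numbers
# and times without sending anything to a server. Patterns are simplified
# for illustration and would miss many real-world formats.
import re

FLIGHT = re.compile(r"\b[A-Z]{2}\s?\d{2,4}\b")  # e.g. "AA 212", "UA1234"
TIME = re.compile(r"\b\d{1,2}:\d{2}\s?(?:AM|PM|am|pm)?\b")

def detect_events(text: str) -> dict:
    """Return detected flight numbers and times, all processed locally."""
    return {
        "flights": FLIGHT.findall(text),
        "times": TIME.findall(text),
    }

print(detect_events("My flight AA 212 lands at 6:45 pm"))
# {'flights': ['AA 212'], 'times': ['6:45 pm']}
```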

Apple insists that much of the machine learning occurs entirely local to the device, without personal information being sent back to its servers.

"Some people perceive that we can't do these things with AI because we don't have the data," says Cue. "But we have found ways to get that data we need while still maintaining privacy. That's the bottom line."

"We keep some of the most sensitive things where the ML is occurring entirely local to the device," Federighi says. As an example, he cites app suggestions, the icons that appear when you swipe right.

The full-length article on Backchannel provides several more details about how machine learning and artificial intelligence work at Apple.

Top Rated Comments

Santabean2000
121 months ago
Siri still sucks for me... still doesn't work well enough to be useful.
Score: 32 Votes
GfPQqmcRKUvP
121 months ago
Not in my experience. Siri is absolute trash compared to Google Now and Cortana. Apple should be embarrassed by how stagnant it has been and the repeated, correctable errors it makes. No amount of public relations whitewashing can fix it.
Score: 29 Votes
Oblivious.Robot
121 months ago
Yeah, no.
Siri is still pretty much useless for me, although better than before, but still meh with her current capabilities.

Edit - For the screen brightness part, after some retries I found out that it does work, but you have to say it in one sentence as Siri forgets about it in the next. :confused:
Score: 17 Votes
keysofanxiety
121 months ago
Apple insists that much of the machine learning occurs entirely local to the device, without personal information being sent back to its servers.
And yet it still has no offline functionality, even if the learning is localised; well from what I can see, anyway. An iPhone 3GS managed offline voice functions like "call Tom". Why the heck can't Siri? Phone, text, open an app -- these are not commands that need to be bounced off a server.

It's fine if you're at Apple HQ, testing Siri on a 10Gb/s Internet connection. But when you're on the road with flaky 3G signal, it's the last thing you need.
Score: 11 Votes
jsclem02
121 months ago
The struggle with Siri is still very real.
Score: 11 Votes
thisisnotmyname
121 months ago
I like Siri on the iPhone when it works, but at times it totally gets things very wrong that it previously got right.

What really gets me ticked off is how Siri on ATV is worse than useless almost every time I try to make use of it.
I agree. On the phone she works well for me. On the AppleTV it seems like she goes to sleep, and the first time I try to interact with her, such as "what did he just say?", it lags so long that the ten-second rewind is well past whatever I wanted to hear. And then there's the lack of Siri search through home sharing content...
Score: 7 Votes