Shortly after Apple's iPhone X event this week, the company's silicon chief Johny Srouji and marketing chief Phil Schiller sat down with Mashable's editor-at-large Lance Ulanoff for an interview about the new A11 Bionic chip.
One interesting tidbit from the interview: Apple began exploring and developing the core technologies in the A11 chip at least three years ago, when the iPhone 6 and iPhone 6 Plus launched with the A8 chip.
Srouji told Ulanoff that when Apple architects silicon, it starts by looking three years out, which means the A11 Bionic was under development while Apple was shipping the iPhone 6 and its A8 chip. At the time, AI and machine learning were barely part of the conversation at a mobile level, and yet, Srouji said, "The neural engine embed, it's a bet we made three years ahead."
Apple's three-year roadmap can change if new features are planned, like the Super Retina HD Display in iPhone X.
"The process is flexible to changes," said Srouji, who’s been with Apple since the first iPhone. If a team comes in with a request that wasn't part of the original plan, "We need to make that happen. We don't say, 'No, let me get back to my road map and, five years later, I'll give you something."
In fact, Schiller praised Srouji's team for its ability to "move heaven and earth" when the roadmap suddenly changes.
"There have been some critical things in the past few years, where we've asked Johny's team to do something on a different schedule, on a different plan than they had in place for years, and they moved heaven and earth and done it, and it's remarkable to see."
The A11 Bionic is a six-core chip with two performance cores that are 25 percent faster, and four high-efficiency cores that are 70 percent faster, than their counterparts in the A10 chip found in the iPhone 7 and iPhone 7 Plus. Early benchmarks suggest the A11 Bionic is even on par with the performance of Apple's latest 13-inch MacBook Pro models.
The A11 chip is also more efficient at multi-threaded tasks thanks to a second-generation performance controller that can harness all six cores simultaneously when a particular task demands it.
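Apps never address individual cores directly; the performance controller and the iOS scheduler decide where threads run. As a rough illustration only, here is a minimal Swift sketch of the kind of chunked, multi-threaded work the system could spread across all six cores, using Grand Central Dispatch (the histogram workload and chunk count are our own examples, not anything Apple described):

```swift
import Foundation

// A sketch of a parallel workload using Grand Central Dispatch. The app
// only expresses independent chunks of work; the performance controller
// decides which cores (performance or efficiency) actually run them.
let pixels = [UInt8](repeating: 128, count: 1_000_000)  // dummy image data
var histogram = [Int](repeating: 0, count: 256)
let lock = NSLock()
let chunkCount = 6  // illustrative; happens to match the A11's core count

DispatchQueue.concurrentPerform(iterations: chunkCount) { chunk in
    let size = pixels.count / chunkCount
    let start = chunk * size
    let end = (chunk == chunkCount - 1) ? pixels.count : start + size
    // Tally into a private histogram to avoid contention on the shared one.
    var local = [Int](repeating: 0, count: 256)
    for value in pixels[start..<end] { local[Int(value)] += 1 }
    lock.lock()
    for i in 0..<256 { histogram[i] += local[i] }
    lock.unlock()
}
```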
Gaming might use more cores, said Srouji, but something as simple as predictive texting, where the system suggests the next word to type, can tap into the high-performance cores as well.
The A11 chip also has an Apple-designed neural engine that handles facial recognition for Face ID, Animoji, and other machine learning tasks. The dual-core engine recognizes people, places, and objects, and processes machine learning workloads at up to 600 billion operations per second, according to Apple.
“When you look at applications and software, there are certain algorithms that are better off using a functional programming model,” said Srouji.
This includes the iPhone X's new face tracking and Face ID, as well as object detection for augmented reality. All of these rely on neural networks and machine learning, including deep learning, a subset of machine learning. This kind of neural processing could run on a CPU or, preferably, a GPU. "But for these neural networking kinds of programming models, implementing custom silicon that's targeted for that application, that will perform the exact same tasks, is much more energy efficient than a graphics engine," said Srouji.
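Developers don't program the neural engine directly; in iOS 11, this kind of on-device inference is reached through the Vision and Core ML frameworks, and the system decides where the work actually runs. Below is a minimal sketch of face detection with Vision, which assumes nothing about how Apple's own Face ID pipeline works (the detectFaces function is our own illustration):

```swift
import UIKit
import Vision

// A minimal sketch of on-device face detection using the Vision framework
// introduced in iOS 11. There is no public API for targeting the neural
// engine itself; the system decides how requests like this are executed.
func detectFaces(in image: UIImage) {
    guard let cgImage = image.cgImage else { return }

    let request = VNDetectFaceRectanglesRequest { request, error in
        guard let faces = request.results as? [VNFaceObservation] else { return }
        for face in faces {
            // boundingBox uses normalized coordinates (0.0 to 1.0).
            print("Face found at \(face.boundingBox)")
        }
    }

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```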
Apple's new iPhone 8, iPhone 8 Plus, and iPhone X are all equipped with the A11 Bionic chip.
In related news, Carnegie Mellon University's School of Computer Science has announced that Srouji will deliver a distinguished industry lecture on Monday, September 18, from 5:00 p.m. to 6:30 p.m. local time.
Full Interview: The Inside Story of the iPhone X 'Brain,' the A11 Bionic Chip