Apple is putting machine learning into everything it does
Apple’s SVP of Machine Learning and AI Strategy, John Giannandrea, sat down with Ars Technica to discuss the company’s efforts in Apple Silicon and AI.
You should read the full report (recommended), but here’s a top-line summary for the record, along with some thoughts on Apple Silicon.
On Apple Silicon and on-device AI
In time, what Giannandrea said about Apple Silicon may prove to be the most profound articulation of what the company hopes to do with AI and machine learning on its platforms.
As we all know, Apple Silicon integrates a machine learning processor on the chip; that’s what makes Photos and other iPhone apps so smart. This integrated support for AI enables on-device analysis, strong privacy, and excellent low-power performance.
Apple’s decision to put the Mac on Apple Silicon means the computer will likely develop additional AI-based talents.
Giannandrea pretty much confirmed this, linking it to the Apple Neural Engine (which powers on-device AI). He was joined by Bob Borchers, Apple’s VP of product marketing, who noted:
“We will for the first time have a common platform, a silicon platform that can support what we want to do and what our developers want to do.
“That capability will unlock some interesting things that we can think of, but probably more importantly will unlock lots of things for other developers as they go along.”
If you use your Mac for any kind of business process, or for any regularly transacted consumer-grade tasks, don’t be too surprised if in the relatively near future your Mac starts offering to complete those tasks for you, or to augment them as you do.
Don’t overlook the AI chief’s observation that the Apple Neural Engine is “totally scalable”.
That means there’s a bigger such chip in an iPad than in an iPhone or Apple Watch. Think about how that translates into on-board intelligence on your Mac.
Why on-device AI is better by design
If your device is attempting to use AI to supplement your work in any particular task, then how does it handle the data?
Some existing solutions use server-based intelligence to deliver such insights, but the problem with this is that if billions of people all upload their photos to a cloud AI engine at the same time, that’s insanely wasteful in terms of bandwidth and extraordinarily dangerous in terms of privacy.
Not to mention the energy consumed on the edge devices, across the ’net transmitting all that data, and on the end service provider’s servers.
Look at it in those terms and such services are just not good design.
This is what Giannandrea seems to think, telling Ars Technica: “It’s better to run the model close to the data, rather than moving the data around.” He also noted the importance of privacy and how this is protected by keeping information only on the device.
“There are lots of experiences that you would want to build that are better done at the edge device,” he said.
10 ways you already use Apple AI
- It’s AI that distinguishes between an accidental and a deliberate tap on an iPad screen.
- AI manages your battery, conserving energy and optimizing charging.
- Those app recommendations? That’s AI.
- Speech recognition in Siri.
- Curating Photos galleries and recommending the best image from a burst.
- The reason images you take with an iPhone are so often better than those you might grab with a higher-megapixel camera? That’s AI, too – the image processor figures out how to optimize the information that is there.
- Keyboard prediction? AI.
- Handwashing on Apple Watch? Ditto.
- Even the ability to blur the background out of an image.
- The way your watch figures out what workout you are doing? This is also based on AI. As is pose estimation as used across multiple apps.
On the growing importance of AI at Apple
Giannandrea hasn’t been at Apple for long; he joined from Google in 2018.
He explains that he already used an iPad and Pencil when he joined Apple, but was surprised to discover there was no machine learning team working on handwriting for these devices.
He created such a team, which is why you can now look forward to using Scribble on your iPad.
There is now practically “no part” of Apple that is not engaging with machine learning and AI, he said. He claims the company has few problems attracting world-class AI developers, partly because it’s so clear that AI has a part to play in future Apple experiences, and Apple continues to make deep investments in the space.
“I really honestly think there’s not a corner of iOS or Apple experiences that will not be transformed by machine learning over the coming few years.”
This extends to third-party developers, who will be able to make increasing use of machine learning models thanks to Core ML.
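For a sense of what that looks like in practice, here’s a minimal Swift sketch of on-device image classification using Core ML and Vision. `MyClassifier` is a hypothetical placeholder for the class Xcode generates from whatever `.mlmodel` file a developer bundles with their app; everything runs locally, so no pixels ever leave the device.

```swift
import CoreML
import Vision

// Classify an image entirely on-device with a bundled Core ML model.
// `MyClassifier` stands in for the Swift class Xcode generates from
// the developer's own .mlmodel file -- it is not a real Apple API.
func classify(_ image: CGImage) throws {
    let coreMLModel = try MyClassifier(configuration: MLModelConfiguration()).model
    let model = try VNCoreMLModel(for: coreMLModel)

    let request = VNCoreMLRequest(model: model) { request, _ in
        // Report the top classification result.
        if let best = (request.results as? [VNClassificationObservation])?.first {
            print("\(best.identifier): \(best.confidence)")
        }
    }

    // The Vision handler runs the model on the Neural Engine or GPU
    // where available; the image is processed without touching a server.
    try VNImageRequestHandler(cgImage: image).perform([request])
}
```

The same pattern applies on the Mac, which is part of why a common silicon platform matters: the identical Core ML model and code path can run across iPhone, iPad, and Apple Silicon Macs.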
Now read the full report which is festooned with deeper insights over at Ars Technica.