18 things we learned about Apple, Siri and AI today

A frame from the wonderful BBC TV original series based on Douglas Adams’ ‘Hitchhiker’s Guide to the Galaxy’.

Hats off to Steven Levy, who has written an important and exclusive report exploring Apple’s adventures in machine intelligence, “The iBrain is Here” (LINK).

It’s a fascinating insight into Apple’s Siri AI, jam-packed with details about the technology that we didn’t have before, and it proves once and for all that Apple has been investing significantly in artificial intelligence for a very long time. (Like I’ve been saying.)

Here’s a very quick summary:

  • Steve Jobs was instrumental in acquiring Siri in 2010
  • July 30, 2014 is when Siri moved to a neural-net-based system, which generates much more accurate answers than before.
  • Siri’s predictive intelligence leverages technologies acquired when Apple purchased the small Cue startup in 2013.
  • The number of Apple products and services already or about to begin using machine learning extends across two full pages.
  • That thing when you get a call from a number you don’t recognize, but the iPhone suggests who it might be based on numbers included in a recent email? That’s AI.
  • Apple uses deep learning to detect fraud on its online store
  • It also uses intelligence to maximize battery life and identify useful beta test reports.
  • Apple’s machine intelligence also shows up when migrating between Wi-Fi and mobile networks and when automatically creating photo galleries.
  • Apple Watch uses machine intelligence to figure out when you are walking, rather than simply ambling about
  • Apple has been using machine learning since the early ’90s, with the Newton’s handwriting recognition.
  • The AI inside an iPhone occupies around 200MB and constantly deletes old data
  • The AI on your iPhone carries a huge host of local data that helps everything, from Maps to Autocorrect
  • When you use an Apple Pencil, machine learning is what helps your system tell the difference between different kinds of touch
  • Making Siri work effectively also informs the technology and design decisions Apple takes on its products
  • Siri is smart enough to predict who will win the game
  • In iOS 10, Siri has had an overhaul. It should now sound more like a normal person, because it is using a deep neural network to put what it says together and articulate it.
  • And Apple has figured out how to do all these things without knowing much about you – maintaining your privacy is part of its product, just like destroying your privacy is part of other vendors’ business plans
  • Apple’s scientists working on differential privacy will publish a paper on their work, because the company recognizes its importance.

“We use these techniques to do the things we have always wanted to do, better than we’ve been able to do,” says Apple’s marketing chief, Phil Schiller. As I observed here.

If you’ve come this far you really should read Levy’s report. It has lots more information.

Jonny Evans

Watching Apple since 1999. I don't say what they should do. I say what they might do. They sometimes do.
