15 ways to make Apple’s AirPods the smart genius accessory you need
What should AirPods do that they don’t already do in order to unleash the potential of discreet ‘all-the-time-anywhere’ computing? I’ve been thinking about this, so I thought I’d jot down a few notes. I’m certain there are other ways this could happen. (This is very much a series of notes rather than well-crafted prose, but perhaps there are some useful ideas here.)
Books, Music, Radio, Podcasts:
Use Siri to make intelligent recommendations. You should be able to ask for a good book, be offered a small number of choices, and have one read to you, for example.
Location-specific safety information:
For example: “Ten minutes ago a mugging was reported on the road you are about to cross; the assailants were described as…”
Fall detection and medical reminders:
Fall over and there’s a check-in and, if needed, an emergency call. Also medical reminders, such as when to take medicine.
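As a rough illustration of how fall detection might work, here is a minimal sketch that flags a possible fall when accelerometer readings spike. (A real implementation would use Core Motion on-device; the 3g threshold and the sample type here are illustrative assumptions, not tuned values.)

```swift
// Hypothetical fall-detection sketch. Samples are accelerometer
// readings in units of g (so ~1.0 at rest).
struct AccelSample {
    let x, y, z: Double
}

// Magnitude of the acceleration vector.
func magnitude(_ s: AccelSample) -> Double {
    (s.x * s.x + s.y * s.y + s.z * s.z).squareRoot()
}

// Flag a possible fall when any sample's magnitude exceeds the
// threshold. The 3g threshold is an assumption for illustration.
func possibleFall(in samples: [AccelSample], thresholdG: Double = 3.0) -> Bool {
    samples.contains { magnitude($0) > thresholdG }
}
```

A production version would also look for a period of stillness after the impact before triggering the check-in, to cut down on false positives.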
People you meet:
An app that provides you with details about people when you meet them – name, job title, birthday – just the kind of info we forget.
Audio guides:
…About (say) local history, news reports, places of interest. See this as an extension to Maps: “Hey Siri, tell me about where I am right now.”
Apple Pay Voice:
This is actually very difficult as voice is not that secure a biometric. It is possible this ties in with medical sensors to provide additional biometric information that can be used to identify you.
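One way the sensor idea could strengthen voice authentication is simple score fusion: combine the confidence of a voice match with the confidence of a sensor-based biometric (say, a heart-rhythm signature) before authorizing a payment. A minimal sketch – the weights and the 0.9 acceptance threshold are assumptions for illustration, not real Apple Pay behavior:

```swift
// Hypothetical biometric score fusion. Each input score is a match
// confidence in [0, 1]. The voice weight and threshold are assumed.
func fusedIdentityScore(voice: Double, heartSignature: Double,
                        voiceWeight: Double = 0.4) -> Double {
    voice * voiceWeight + heartSignature * (1 - voiceWeight)
}

// Authorize only when the combined confidence clears a high bar.
func authorizePayment(voice: Double, heartSignature: Double) -> Bool {
    fusedIdentityScore(voice: voice, heartSignature: heartSignature) > 0.9
}
```

The point of weighting the voice score below the sensor score is exactly the concern above: voice alone is a weak biometric, so it contributes to, rather than decides, the result.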
Real-time translation:
Siri is already becoming a translation agent. Imagine real-time translation, in your ears, of what people around you say.
Hearing aid features:
AirPods could integrate hearing aid features within the accessibility settings.
Health sensors:
For heart rate, body temperature, etc. Also, gyroscopes for elevation.
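On the elevation point: in practice altitude is usually derived from a barometer rather than a gyroscope, using the international barometric formula. A minimal sketch of that conversion, assuming standard sea-level pressure (1013.25 hPa):

```swift
import Foundation

// Standard international barometric formula: converts air pressure
// (hPa) to altitude (metres), assuming sea-level pressure p0.
func altitudeMeters(pressureHPa p: Double,
                    seaLevelHPa p0: Double = 1013.25) -> Double {
    44330.0 * (1.0 - pow(p / p0, 1.0 / 5.255))
}
```

At 1013.25 hPa this gives roughly zero metres, and readings around 900 hPa correspond to roughly a kilometre of altitude, which is the sort of coarse elevation change a wearable could usefully report.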
Mood-based media provision:
Podcasts to suit your mood. “Play me something to cheer me up,” for example.
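A crude sketch of how such a request might be resolved: match the spoken phrase against known mood keywords and return a matching playlist. (The mood names and playlist titles here are made up for illustration; a real system would presumably infer mood from sensors and listening history rather than keywords.)

```swift
import Foundation

// Hypothetical mapping of mood phrases to playlists.
let moodPlaylists: [String: [String]] = [
    "cheer me up": ["Upbeat Pop", "Feel-Good Classics"],
    "calm me down": ["Ambient", "Slow Piano"],
]

// Return playlists whose mood keyword appears in the spoken request,
// falling back to a generic recommendation.
func playlist(forRequest request: String) -> [String] {
    for (mood, lists) in moodPlaylists
        where request.lowercased().contains(mood) {
        return lists
    }
    return ["For You"]
}
```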
Outbreak alerts:
In the event of a disease or illness outbreak in an area, AirPods will advise you of symptoms to watch for and monitor for them.
Emergency assistance:
Say “Siri, someone’s had a heart attack, what should I do to help?” to receive step-by-step instructions on what to do.
Shared playlists:
…user-created radio, playlist sharing, etc. Some people might like to share a playlist with people around them. AirPod users could surf other people’s shared Apple Music playlists, for example. (The user experience would be a little sketchy, as it would be Bluetooth-based.)
New colours:
White is becoming ever so ’90s. Time to think different.
Developer tools and VoiceOver improvements:
Many of these solutions already exist in some form in third-party mobile apps. Apple will need to figure out how to create an easy, friction-free framework developers can use to surface data from their mobile apps in a way that is relevant and useful to an AirPod user. Might this involve deeper investment in VoiceOver, enabling screen-based user experiences to be delivered via a similar set of instructions to those blind and partially sighted people already use to control iOS devices? How must VoiceOver be improved to provide a friction-free interface (and why have those improvements not yet been implemented)? The aim should be that it is as easy for a sighted person to use VoiceOver on an iPhone as it might be through AirPods.
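To make the developer-framework idea concrete, here is a purely hypothetical sketch of what an “audio-first” app interface might look like – none of these types exist in Apple’s SDKs; this is just one shape such a framework could take:

```swift
// Hypothetical protocol an app could adopt to expose its content to
// an audio-only (AirPods) experience. Entirely invented for
// illustration; not part of any Apple SDK.
protocol AudioSummarizable {
    // A short spoken summary of the app's current state.
    var spokenSummary: String { get }
    // Handle a voice command and return the text Siri should speak.
    func handleVoiceCommand(_ command: String) -> String
}

// Example adoption by an imaginary messaging app.
struct DemoMessagesApp: AudioSummarizable {
    var spokenSummary: String { "You have 3 unread messages." }
    func handleVoiceCommand(_ command: String) -> String {
        command == "read messages"
            ? "First message: see you at six."
            : "Sorry, I can't do that yet."
    }
}
```

The attraction of a protocol like this is that it would let the system, rather than each app, handle the VoiceOver-style interaction model, which is exactly the friction-free layer described above.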
Anyone out there got some other ideas for how to make AirPods great?