Is Apple preparing App Store developers for portable AR?
I’ve spent a little time considering a new feature Apple apparently plans in iOS 14 called Clips.
It lets elements of third-party apps run on your iOS devices without your downloading the entire app, and while this sounds like a nice way to let people try apps before committing to them, I can’t help but also see that it makes apps more portable.
Why do Clips matter?
You can argue that apps are already portable. They work on portable devices. They work on small devices, such as the Apple Watch. You can access them using Siri. In many different ways, they already work in a highly portable fashion.
But…
Think about Apple Watch apps: These are refined, stripped-down versions of the software you carry with you on your iPhone.
This sounds a little like the new Clips feature (not to be confused with the Clips app).
So – what do we know about this feature?
- Developers will be able to provide interactive and dynamic content from their apps even if you haven’t (fully) installed the app.
- Developers control which part of the app iOS must download to present that content.
- When you scan a QR code you’ll be able to either (1) access the interactive experience, or (2) download the full app.
- The app experience will be provided on a “floating card” with its own unique UI.
- Apple is testing this feature using Sony’s PS4 Second Screen app, OpenTable, Yelp, DoorDash, and YouTube.
I’m thinking this lends itself to print advertising for apps, and to the evolution of new experiences in which apps and the real world work better together.
Wearable everything
I think these Clips will be of great interest when it comes to AR experiences.
- Imagine you walk down a street wearing your Apple glasses (as and when they turn up).
- You see a shop you’re interested in, ask Siri to switch to AR mode, and information is overlaid (on a floating card) on your translucent AR display. There you’ll also find QR codes that let you access app experiences.
- You might want information about a place from OpenTable or Yelp, which will appear on screen; you might (perhaps) even get to look inside the location using YouTube.
Or maybe not.
I’m not a product developer. I’m just some guy who watches these things, but I see Clips (again, not to be confused with the Clips app) as a transitional technology that gives Apple a way to encourage developers to build the kind of lightweight, mobile app experiences it’s going to want them to deliver as it lays the groundwork for its AR glasses.
And to do so without needing to tell anyone what it is doing.
What do you think?
Please follow me on Twitter, or join me in the AppleHolic’s bar & grill and Apple Discussions groups on MeWe.