Minority Report? Apple’s ARKit Gains Real-Time 3D Gesture Support

While only an early proof of concept, I can’t help but see today’s ARKit revelation as yet another step toward Minority Report-style computer user interfaces: a new ARKit technology that makes it possible for us to use our real hands within virtual spaces to interact with virtual objects in AR.

Man or superman

ManoMotion this morning announced gesture support for ARKit, and I think this may become a defining moment in the evolution of Apple’s platform.

“Up until now, there has been a very painful limitation to the current state of AR technology – the inability to interact intuitively in depth with augmented objects in 3D space,” said Daniel Carlman, co-founder and CEO of ManoMotion.

“Introducing gesture control to the ARKit, and being the first in the market to show proof of this, for that matter, is a tremendous milestone for us. We’re eager to see how developers create and potentially redefine interaction in Augmented Reality.”

The integration will be available to developers in the next build of ManoMotion’s SDK, first for Unity iOS, followed by native iOS in subsequent updates.

The demo video for the software shows a simple tabletop coin-flipping game, but I see no reason it needs to stop there. Sure, it’s early days and only a work in progress, but it opens up new possibilities, particularly as the technology improves.

I see it as a user interface innovation that doesn’t require any expensive equipment beyond the iOS device, though I’m sure there’s a long road ahead until the full realization of that. All the same, I see this as a big step toward invisible computers that exist only when you put your AR glasses on.

The tech means:

  • People can use their actual hands in 3D, rather than 2D, to manipulate objects across depth in AR/MR space
  • Augmented elements can be manipulated with the right or left hand
  • A set of predefined gestures, such as point, push, pinch, swipe and grab, can be accessed and utilized for interactive manipulation of Augmented elements
  • The extent of manipulation can be precisely defined and determined by users

From the company press release:

“With no extra hardware and using a standard 2D camera only (such as a cell phone camera), it recognizes and tracks many of the 27 degrees of freedom (DOF) of motion in a hand. Providing real-time, accurate hand-tracking with depth information, the technology handles dynamic gestures (such as swipes, clicking, tapping, grab and release, etc) with an extremely small footprint on CPUs, memory, and battery consumption.”


Jonny Evans

Watching Apple since 1999. I don't say what they should do. I say what they might do. They sometimes do.
