AR Gesture Control Arrives

ManoMotion has announced that it has successfully integrated its 3D real-time hand gesture recognition technology with Apple's ARKit, and soon with Google's ARCore for advanced Android phones. ManoMotion's technology lets people use their actual hands in VR/AR/MR to interact with virtual objects. ManoMotion's SDK is now available to developers on its website; over six hundred developers downloaded the SDK (software development kit) on the first day it was available.
ManoMotion of Sweden announces 3D real-time hand gesture recognition technology for Apple's ARKit, and soon Google's ARCore for advanced Android phones.
ManoMotion is a computer vision-based software company founded in 2015. Based in Stockholm, Sweden, with a sales and marketing office in Palo Alto, California, ManoMotion's long-term vision is to bring unparalleled intuition to human-machine interaction using gesture technology. The company has developed a core technology framework that achieves precise hand tracking and gesture recognition in 3D space using only a standard 2D camera, available on any smart device. It offers this solution across multiple platforms in Virtual Reality, Augmented/Mixed Reality, or any environment that requires natural and intuitive interaction.
This is a real breakthrough. Now AR camera apps can be controlled without touching the screen.

ManoMotion's 3D real-time gesture recognition technology lets people use their actual hands in VR/AR/MR to interact with virtual objects. With no extra hardware, using only a standard smartphone camera, ManoMotion's software recognizes and tracks many of the 27 degrees of freedom (DOF) of motion in a hand. This provides real-time, accurate hand tracking with depth information. The technology handles dynamic gestures (such as swiping, clicking, tapping, and grab-and-release), which will enable huge changes in the way apps are controlled through the camera.
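To make that event model concrete, here is a minimal Swift sketch of how an app might consume per-frame output from a hand tracker of this kind. The HandGesture, HandFrame, and GestureController names are invented for illustration only; they are not ManoMotion's actual API, which is not shown in this article.

import Foundation

// Hypothetical types: stand-ins for the kind of per-frame data a 27-DOF
// hand tracker could expose. These are NOT ManoMotion's actual API.
enum HandGesture {
    case swipeLeft, swipeRight   // dynamic gestures named in the article
    case tap
    case grab
    case release
}

// One frame of tracking output: palm position in camera space
// (x, y in the image plane, z as estimated depth) plus any gesture
// detected on this frame.
struct HandFrame {
    let palmPosition: SIMD3<Float>
    let gesture: HandGesture?
}

// App-side state machine: a grabbed virtual object follows the palm
// until the hand releases it.
final class GestureController {
    private var isHolding = false

    func process(_ frame: HandFrame) {
        switch frame.gesture {
        case .grab?:
            isHolding = true
            print("Grabbed object at \(frame.palmPosition)")
        case .release?:
            isHolding = false
            print("Released object")
        case .tap?:
            print("Tap: trigger an action, e.g. fire at the monster")
        case .swipeLeft?, .swipeRight?:
            print("Swipe: page through AR content")
        case nil:
            break
        }
        if isHolding {
            // Depth (z) lets the held object move toward or away
            // from the camera, not just across the screen.
            print("Object tracks palm at depth \(frame.palmPosition.z)")
        }
    }
}

// Simulated usage: feed tracked frames into the controller.
let controller = GestureController()
controller.process(HandFrame(palmPosition: [0.10, 0.00, 0.40], gesture: .grab))
controller.process(HandFrame(palmPosition: [0.12, 0.05, 0.35], gesture: nil))
controller.process(HandFrame(palmPosition: [0.12, 0.05, 0.35], gesture: .release))

Keeping the grab/release state in the app rather than in the tracker means a dropped frame or a brief occlusion of the hand does not immediately release a held object.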
In my opinion, hand integration is the key to creating a fully immersive AR experience on a smartphone. How does a user interact with objects in a virtual world seen through the small screen of a phone? Out of necessity, VR currently solves the problem with controllers, which are often rendered as avatar hands, engendering a mild suspension of disbelief while engaged in the simulation. Leap Motion has pioneered a technology that tracks hands and places their tracked avatars into the virtual world, and this will be a key feature in advanced VR headsets next year. Now, thanks to ManoMotion, your real hands can interact intuitively with digital objects in AR, too. Users will be able to capture Pikachu with their fingers!
Daniel Carlman, CEO of ManoMotion, which is changing the way AR works.
“Our software works with any sensor, both RGB and depth sensors. We want to be flexible to support our customers' needs. Our tech works well in mobile VR and AR. We actually started out in AR. You can check out some of our examples; being able to see your hands in VR makes the feeling of presence much stronger,” said Daniel Carlman, co-founder and CEO of ManoMotion.
Until today, AR designers have solved the problem of hand integration by treating digital objects like characters in a video game that are manipulated with a game controller. With ManoMotion’s SDK, this is no longer necessary: point at the monster with your finger (your real finger!), pull the virtual trigger and destroy it.
“Up until now, there has been a very painful limitation to the current state of AR technology: the inability to interact intuitively in depth with augmented objects in 3D space,” Carlman continued. “Introducing gesture control to ARKit, and being the first in the market to show proof of this, for that matter, is a tremendous milestone for us. We're eager to see how developers create and potentially redefine interaction in Augmented Reality.”

The new integration will allow developers to create applications and content on ARKit with “hand presence,” in which people use their actual hands in 3D, instead of the limited 2D screen, to manipulate objects in AR/MR space. Augmented elements can be manipulated with the right hand or the left hand (the other is holding the phone!).
A set of predefined gestures, such as point, push, pinch, swipe, and grab, can be accessed and utilized for interactive manipulation of augmented elements.
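As a rough illustration of how those predefined gestures might be bridged to scene content, here is a short Swift sketch. The PredefinedGesture enum and the ARGestureBridge class are assumptions invented for this example, not ManoMotion's published API; the ARSCNView hit-testing and SCNNode calls, however, are standard ARKit/SceneKit.

import ARKit
import SceneKit
import UIKit

// Hypothetical gesture labels matching the set named above.
// NOT ManoMotion's actual API.
enum PredefinedGesture {
    case point, push, pinch, swipe, grab
}

// Bridges gesture events to augmented (SceneKit) content in an ARSCNView.
final class ARGestureBridge {
    private let sceneView: ARSCNView
    private var grabbedNode: SCNNode?

    init(sceneView: ARSCNView) {
        self.sceneView = sceneView
    }

    // Called once per camera frame with the latest detected gesture and
    // the fingertip's position in view coordinates.
    func handle(_ gesture: PredefinedGesture, fingertipInView point: CGPoint) {
        switch gesture {
        case .point:
            // SceneKit hit test: which augmented element is under the finger?
            if let hit = sceneView.hitTest(point, options: nil).first {
                highlight(hit.node)
            }
        case .grab:
            // Pick up the element under the finger.
            grabbedNode = sceneView.hitTest(point, options: nil).first?.node
        case .push:
            // Nudge the held element away from the viewer (5 cm per push).
            grabbedNode?.localTranslate(by: SCNVector3(0, 0, -0.05))
        case .pinch, .swipe:
            // App-defined; here, both simply release the held element.
            grabbedNode = nil
        }
    }

    private func highlight(_ node: SCNNode) {
        node.geometry?.firstMaterial?.emission.contents = UIColor.yellow
    }
}

In a real app, the fingertip coordinate and gesture label would come from the tracking SDK each frame; everything downstream of that is ordinary ARKit scene manipulation, which is exactly what makes this kind of integration attractive to developers.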
“Right now we are focused on getting user feedback and input on what our developer community would like to see next,” said Carlman. Meanwhile, the rest of us in the greater VR community will have to wait to see what developers do with what Carlman and his collaborators have given them.
