At Oculus Connect 6 we first announced our work to bring hand tracking to Oculus Quest. While we shared an initial release date of early 2020, we've been working hard to get this new feature into your hands as soon as possible. Today we're happy to announce that hand tracking will be available as an early feature for consumers with this week's v12 Quest software update, and for developers in an SDK launching next week. For Quest owners, this initial release offers the ability to navigate Quest Home and the system menus with your hands, as well as some first-party applications like Oculus TV and Browser. And with the new SDK, you, our developer community, can begin building a new generation of controller-free experiences for VR.
Hand tracking delivers a new sense of presence, enhances social engagement, and enables more intuitive interactions in VR. It adds a new input option for developers that complements Touch controllers, which remain ideal for many types of gaming experiences requiring high precision, haptic feedback, and the ability to leverage physical inputs. We believe the future of VR is brighter with this new option broadening the possibilities for what can be built, and we look forward to seeing what you can imagine in 2020 and beyond.
How to get started + What’s next
Starting next week, you can begin building hand tracking-enabled Quest applications using our updated APIs, such as VrAPI, which now includes three hand pose functions: vrapi_GetHandSkeleton, vrapi_GetHandPose, and vrapi_GetHandMesh. Technical documentation for these updated APIs, the Native SDK, and the Unity Integration will also be available next week, while UE4 compatibility will arrive in the first half of 2020. A new Unity sample will also be available, showing how to use OVRInput to create hand tracking experiences. See below for a snapshot of this new sample.
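While the sample and documentation aren't available until next week, the sketch below gives a rough idea of what querying hand poses through the native API might look like. It is illustrative only: the enumeration pattern mirrors the one already used for Touch controllers, and the struct and enum names beyond the three functions above (ovrInputCapabilityHeader, ovrControllerType_Hand, ovrHandPose, ovrHandVersion_1) are assumptions that may differ in the shipping SDK.

```c
// Illustrative sketch only: query hand poses each frame through VrAPI.
// Names other than vrapi_EnumerateInputDevices/vrapi_GetHandPose are
// assumptions and may not match the released headers.
#include <stdint.h>
#include <VrApi.h>
#include <VrApi_Input.h>

static void QueryHandPoses(ovrMobile* ovr, double displayTimeInSeconds) {
    ovrInputCapabilityHeader capsHeader;
    // Walk the list of currently connected input devices.
    for (uint32_t index = 0;
         vrapi_EnumerateInputDevices(ovr, index, &capsHeader) >= 0;
         ++index) {
        if (capsHeader.Type != ovrControllerType_Hand) {
            continue;  // Skip Touch controllers and other input devices.
        }

        // Request the hand's root pose and per-bone rotations at display time.
        ovrHandPose handPose;
        handPose.Header.Version = ovrHandVersion_1;
        if (vrapi_GetHandPose(ovr, capsHeader.DeviceID, displayTimeInSeconds,
                              &handPose.Header) != ovrSuccess) {
            continue;
        }

        // handPose.RootPose and handPose.BoneRotations[] can now drive an
        // application-side hand skeleton; check the pose's status/confidence
        // fields (see the SDK docs once published) before relying on the data.
    }
}
```

In practice you would call something like this once per frame after computing the predicted display time, then feed the resulting root pose and bone rotations into whatever hand representation your application renders.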
As you work to finalize applications that leverage this new feature, we'll begin enabling you to publish your game or application on the Oculus Store in January 2020, using the existing submission and intake process for Quest apps.
Hand tracking technology: How it works
To make this all possible, our computer vision team developed a new method of applying deep learning to understand the position of your fingers using just the monochrome cameras already built into Quest. No active depth-sensing cameras, additional sensors, or extra processors are required. Instead, we use deep learning combined with model-based tracking to estimate the location and size of a user's hands, and then reconstruct the "pose" of the hands and fingers in a 3D model. And this is all done on a mobile processor, without compromising the resources we've dedicated to user applications. Check out the following OC6 presentation by our product + engineering team for a deep dive into the tech that drives hand tracking:
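To give a feel for what "model-based tracking" means in this context, the toy sketch below fits a kinematic model to keypoints, standing in for the locations a neural network might predict from camera images. This is emphatically not the Oculus implementation; it is a deliberately tiny 2D example that fits a planar, two-joint "finger" with known bone lengths to two target keypoints by numerical gradient descent on the joint angles.

```c
/*
 * Toy illustration of model-based pose fitting: given keypoints (here,
 * hard-coded stand-ins for neural-network predictions), recover the joint
 * angles of a simple kinematic model. Not the Oculus hand tracker.
 */
#include <math.h>
#include <stdio.h>

typedef struct { double x, y; } Vec2;

/* Forward kinematics: joint angles -> positions of knuckle and fingertip. */
static void forward(const double angles[2], const double lengths[2],
                    Vec2* knuckle, Vec2* tip) {
    knuckle->x = lengths[0] * cos(angles[0]);
    knuckle->y = lengths[0] * sin(angles[0]);
    tip->x = knuckle->x + lengths[1] * cos(angles[0] + angles[1]);
    tip->y = knuckle->y + lengths[1] * sin(angles[0] + angles[1]);
}

/* Sum of squared distances between model joints and predicted keypoints. */
static double loss(const double angles[2], const double lengths[2],
                   const Vec2 targets[2]) {
    Vec2 knuckle, tip;
    forward(angles, lengths, &knuckle, &tip);
    const double dkx = knuckle.x - targets[0].x, dky = knuckle.y - targets[0].y;
    const double dtx = tip.x - targets[1].x, dty = tip.y - targets[1].y;
    return dkx * dkx + dky * dky + dtx * dtx + dty * dty;
}

int main(void) {
    const double lengths[2] = {1.0, 0.8};              /* known bone lengths */
    const Vec2 targets[2] = {{0.7, 0.7}, {0.8, 1.5}};  /* "predicted" keypoints */
    double angles[2] = {0.0, 0.0};                     /* initial pose guess */

    /* Fit the model by numerical coordinate-wise gradient descent. */
    const double step = 0.05, eps = 1e-5;
    for (int iter = 0; iter < 500; ++iter) {
        for (int j = 0; j < 2; ++j) {
            double plus[2]  = {angles[0], angles[1]};
            double minus[2] = {angles[0], angles[1]};
            plus[j] += eps;
            minus[j] -= eps;
            const double grad = (loss(plus, lengths, targets) -
                                 loss(minus, lengths, targets)) / (2.0 * eps);
            angles[j] -= step * grad;
        }
    }
    printf("fitted angles: %.3f, %.3f rad (loss %.5f)\n",
           angles[0], angles[1], loss(angles, lengths, targets));
    return 0;
}
```

The real system solves a far larger problem, in 3D, for a full hand skeleton, per camera frame, on a mobile processor, but the underlying idea of fitting a constrained model to learned predictions is the same.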
If you're interested in learning more about designing hand-tracked games and applications, be sure to check out the following presentation by our friends at Talespin and Magnopus, who provide case studies from their hand tracking-enabled applications, American Farmers Insurance and Elixir, respectively, presented on the show floor at OC6.
We look forward to sharing updates to this SDK and the hand tracking-enabled APIs throughout 2020. Be sure to keep an eye on the developer blog for the v12 release notes, and follow the Oculus Developer Twitter + Facebook pages for more releases, learnings, and insights.
- The Oculus Team