Back in December we rolled out hand tracking as an early feature for consumers and provided a preview SDK to our developer community. On Tuesday, May 19, with Quest software update v17, we’re excited to bring hand tracking out of the Experimental Features section and into general release on Quest, while removing the preview tag from the SDK for developers. We’re also pleased to announce that hand tracking support for Unreal Engine 4 (UE4) is now available.
We’ll begin accepting app submissions that include hand tracking on Thursday, May 28, and we have a number of great resources to help you get started.
“Considering how early days it is for hand tracking on Quest, the quality and fidelity of the tracking is quite awesome,” says Fast Travel Games Chief Technical Officer Kristoffer Benjaminsson. “Adding support for hand tracking itself was quick and straightforward. Hand tracking for The Curious Tale of the Stolen Pets is a perfect fit! All in all, I think the game is more immersive with hand tracking.”
Aldin Dynamics has created a set of unique features for Waltz of the Wizard to give you a sense of truly wielding magical powers with your own bare hands. Snap your fingers to make objects in your line-of-sight explode, pour giant potions over your hands to enlarge your fingers, interact with a shape-precise water surface or make googly-eyes appear on your hands with hand-puppet poses!
Since hand tracking launched as an Experimental Feature, we’ve been working to improve the overall experience and further integrate hands within the system:
We updated the tracking architecture to reduce jitter while improving self-presence and input efficiency
Hand tracking users no longer have to switch to gaze-and-click or controllers to navigate OS dialogs such as permissions and power-down prompts
People can now use their hands to draw a room-scale Guardian boundary, meaning hand tracking is compatible with both stationary and room-scale modes
We’ve added a new Hands and Controller section in Settings to offer more holistic control over input behaviors
We’ve added additional educational features to help people understand how to use hand gestures
The Oculus hand gesture now mimics the controller’s behavior, enabling developers to use the non-dominant hand to invoke the dev menu
We believe that hand tracking has the potential to fundamentally change the way people interact with the virtual world around them, and we’re excited to see the ways our developer community builds a new generation of controller-free experiences for VR!
To get started with Unity or Unreal, check out the following docs:
Be sure to check out our latest updates at developer.oculus.com. We can’t wait to get our hands on what you’ll build next!