We envision a future where your hands are able to move as naturally and intuitively in the metaverse as they do in real life. Over the past few years, we’ve made major strides in making that a (virtual) reality, beginning with the
introduction of hand tracking on Quest in 2019. Since then, we’ve seen many developers build hand mechanics into their immersive experiences.
Today, we’re unlocking major improvements to our Hand Tracking API, Presence Platform’s hand tracking capability—including step-change improvements in tracking continuity, gesture support, movement, and performance. This update also allows fast and overlapping hand movements, enables clapping and other hand-over-hand interactions, and opens up nearly endless object manipulation possibilities.
If you’ve already built hand tracking into your apps, you don’t need to change any API calls. For Native, you can opt in by adding the following element to the Android Manifest file: <meta-data android:name="com.oculus.handtracking.version" android:value="V2.0"/>. If you’re developing with Unity or Unreal Engine, please check our upcoming documentation for configuration details. If you’re just getting started with hand tracking, see how to set up hand tracking for your app from
Unity,
Unreal, and
Native.
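For native apps, the opt-in element sits alongside the standard hand tracking entries in the manifest. A minimal sketch of what that section might look like (the package name is a placeholder, and the surrounding permission and feature entries reflect the usual Quest hand tracking setup rather than anything specific to this update):

```xml
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    package="com.example.handtrackingdemo"> <!-- placeholder package name -->

  <!-- Standard entries for hand tracking on Quest -->
  <uses-permission android:name="com.oculus.permission.HAND_TRACKING" />
  <uses-feature android:name="oculus.software.handtracking" android:required="false" />

  <application>
    <!-- Opt in to the updated hand tracking (2.0) -->
    <meta-data
        android:name="com.oculus.handtracking.version"
        android:value="V2.0" />
  </application>
</manifest>
```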
Re-architected with a new computer vision and machine learning approach, our updated hand tracking technology brings you closer to building immersive and natural interactions in VR—without controllers—and delivers key improvements on Quest 2 and future devices, including:
Step-function improvement to tracking continuity
We know it can be frustrating when the system loses track of your hands, and improving this was our number one priority. With this update, we bring a step-function improvement in tracking continuity for greater immersion and a consistent hands experience in everyday use. This means you can spend less time worrying about reliability and spend more time on the unique interactions in your app.
Robust occlusion, faster movement, and new hand-over-hand interactions
We’ve developed a new method of applying deep learning to better understand hand poses when the device’s cameras can’t see the full hand or when the hand is moving quickly. This opens up the use of hands in apps that require robust tracking and more complex gestures, including:
Tracking new hand-over-hand interactions like crossing your hands over each other, clapping, or giving high-fives.
More accurate representation of a thumbs-up gesture when the rest of your fingers are blocked from view by your palm.
Better gesture recognition
Our updated hand tracking technology can now consistently recognize the most important gestures you want to use. With improved pinch, grab, and poke recognition, you can expect fewer accidental or missed gestures that take away from the content you’ve created. You’ll also see improvements in apps that implemented any of the gestures from our
Interaction SDK—and any custom gestures you’ve built into your apps. Whether your interaction is picking up a virtual game piece, hitting a punching bag, or playing an instrument, this update brings you closer to the seamless replication of natural hand movements in VR.
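To make the pinch recognition discussed above concrete: a common approach is to threshold the distance between the thumb and index fingertips, with hysteresis so the gesture doesn't flicker when the distance hovers near the threshold. This is an illustrative sketch only, not the Hand Tracking API or Interaction SDK; the joint data, class names, and threshold values are assumptions:

```python
from dataclasses import dataclass
import math

@dataclass
class Vec3:
    """A 3D point, e.g. a fingertip position in meters (illustrative)."""
    x: float
    y: float
    z: float

def distance(a: Vec3, b: Vec3) -> float:
    return math.sqrt((a.x - b.x) ** 2 + (a.y - b.y) ** 2 + (a.z - b.z) ** 2)

class PinchDetector:
    """Detects a pinch by thresholding thumb-index distance with hysteresis."""

    def __init__(self, start_m: float = 0.02, release_m: float = 0.035):
        # Pinch begins below 2 cm and releases above 3.5 cm; the gap
        # between the two thresholds prevents rapid on/off flicker.
        # These values are illustrative, not from the SDK.
        self.start_m = start_m
        self.release_m = release_m
        self.pinching = False

    def update(self, thumb_tip: Vec3, index_tip: Vec3) -> bool:
        d = distance(thumb_tip, index_tip)
        if self.pinching:
            # Already pinching: only release once fingers clearly separate.
            self.pinching = d < self.release_m
        else:
            # Not pinching: only start once fingers clearly close.
            self.pinching = d < self.start_m
        return self.pinching
```

The hysteresis band is what reduces the "accidental or missed gestures" problem near the decision boundary; grab and poke recognizers can follow the same pattern with different joints and thresholds.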
What developers are saying
A few developers experienced these improvements in an early preview of the update. They noticed the improved tracking continuity, gesture support, fast movement, and performance—and an in-app experience that feels even more natural and immersive.
“This update to hand tracking is a big step forward in tracking quality and makes playing Cubism with hands feel a lot more stable and consistent. Previously, Cubism’s hand tracking relied on smoothing the hand data input to produce a stable input method. Furthermore, players needed to be taught not to cross their hands since this negatively affected tracking. This is all improved with the latest hand tracking—which is consistent enough for me to turn off hand smoothing by default.” -
Cubism

“The update to hand tracking is a very big deal for us. Unplugged is a game that uses hand tracking intensively for an authentic sense of air guitar gameplay. There are a lot of fast hand movements and rapid chord changes, among other things. Since the beginning, we wanted to make players feel like they were playing actual air guitar. Even though we managed to achieve very solid and accurate gameplay using the older version of hand tracking, we had to put some limitations on our gameplay in order to provide a smooth experience that isn’t interrupted by the issues such a new technology might have from time to time. With the latest update, hand tracking is so accurate and responsive that we could include all the perks we couldn’t before: fast changes of finger positions, plus an increased, and more realistic, number of notes that makes the songs feel way more authentic.” -
Unplugged: Air Guitar

“This update to hand tracking is a big step forward for natural and intuitive interactions with hands. One of the biggest challenges when building a hands-first application is the reliability and accuracy of hand tracking provided by the system. Hand Physics Lab was designed to highlight what was possible at the time and to challenge the user to play with the limitations of the technology. With this big improvement, we hope more people will discover what hand tracking has to offer immersive experiences.” -
Hand Physics Lab

“The update to hand tracking has definitely improved our ability to deliver a flawless workout experience for Liteboxer users. Our workouts require a lot of quick punches to be thrown, and it’s imperative for hand tracking to keep up with the rigorous pace. We’re really happy with this latest update and excited about the overall direction hand tracking is headed on the Quest platform.” -
Liteboxer

More Presence Platform capabilities coming soon
We’re excited to see the hand tracking experiences you build with Presence Platform. With this hand tracking update and even more capabilities coming soon, we can begin to explore what the metaverse might look like. And we’re committed to helping you create the connected, interoperable worlds that lie ahead.