Give Your App the Upper Hand: What’s New with Hand Tracking in v83

Since debuting on Meta Quest in 2019, hand tracking has evolved into one of the most expressive and accessible ways to interact in VR. It lets you design interactions that go beyond what hardware buttons allow, while giving users a natural input method without the learning curve of a controller.
With each round of hand tracking improvements, the line between intention and action gets thinner: interactions become more natural, presence gets stronger, and gestures feel precise and meaningful.
Now with v83, hand tracking continues its evolution into a core part of Meta Horizon OS experiences. This update includes several major improvements:
  • Better performance
  • More reliable high-speed movement
  • Easier locomotion
  • More customizable throwing
  • Smoother onboarding
Our goal with these updates is simple: deliver more freedom for you to design with hands, more reasons for your audience to use them, and more use cases where hands feel like the most natural way to interact. Dive into the complete list of updates below and find key resources to start building with hand tracking (Unity / Unreal) today.

Hands 2.4: High-Speed Interactions That Feel Natural

Historically, fast-paced VR experiences have pushed hand tracking to its limits. In rhythm and fitness apps, punching, swinging, or snapping to a beat are movements where speed and timing matter. Hands 2.4 refines Fast Motion Mode (FMM) so those moments feel more responsive and believable:
  • Faster Hand Acquisition: Hands are detected faster when re-entering view. This reduces the “hand loss” feeling during fast movements.
  • Advanced Motion Upsampling: Smooths out rapid gestures so motion appears continuous instead of choppy while minimizing motion artifacts.
  • Optimized Fast Motion Filters: Helps eliminate perceived latency between hand tracking and controller input during high-energy interactions.
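The SDK handles motion upsampling internally, but the core idea behind it can be sketched: given joint positions sampled at the tracking rate, synthesize intermediate poses at the render rate by interpolating between consecutive samples. A minimal illustration in Python (the function names and shapes are ours, not SDK API):

```python
def lerp(a, b, t):
    """Linearly interpolate two 3D points component-wise."""
    return tuple(x + (y - x) * t for x, y in zip(a, b))

def upsample_poses(samples, factor):
    """Insert `factor - 1` interpolated poses between each pair of tracked
    samples. `samples` is a list of (x, y, z) joint positions captured at
    the tracking rate; returns positions at roughly `factor` x that rate."""
    out = []
    for prev, cur in zip(samples, samples[1:]):
        for step in range(factor):
            out.append(lerp(prev, cur, step / factor))
    out.append(samples[-1])  # keep the newest tracked sample as-is
    return out
```

A production system would interpolate full joint rotations (quaternion slerp) and extrapolate slightly to hide latency, but the principle is the same: fill the gaps between tracked frames so fast motion reads as continuous.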
For an additional edge, combine FMM with Wide Motion Mode (WMM), which allows you to track hands and display plausible hand poses even when the hands are outside the headset’s field of view. With this feature, fitness experiences feel more responsive, rhythm games feel coherent, and throwing motions align more closely with the trajectory of the thrown object. Simply put, people feel more connected to the actions they’re taking instead of being pulled out of the immersive experience.
If your experience uses FMM, try the update to see the improvements. If you’re new to implementing hand tracking or previously experimented with hands and found high-speed tracking unreliable, this is the time to revisit it.

Interaction SDK: Seamless Locomotion and Customizable Throwing

Locomotion

Movement plays a key role in how people engage with and explore your immersive experiences. For hand-first games, locomotion has historically been one of the toughest pieces to design. v83 introduces new locomotion interactions directly in the Interaction SDK Samples GitHub, designed for hands from the ground up:
  • Improved teleportation gestures
  • Natural climbing traversal
  • Physics-based movement that reacts to user motion
Now, you don’t have to build unique locomotion systems from scratch that your users will need to relearn. Leverage the Interaction SDK library to quickly integrate interactions that help your audience move throughout your experience, whether that means moving forward, backward, side to side, or up and down.
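To make the teleportation piece concrete, a common approach is to cast a ballistic arc from the hand's aim ray and teleport the user to where the arc meets the ground. A simplified Python sketch (all names, the launch speed, and the flat-ground assumption are ours, not SDK API):

```python
import math

def teleport_arc_target(origin, direction, speed=8.0, gravity=9.81,
                        ground_y=0.0, dt=0.02, max_steps=500):
    """Step a ballistic arc from the hand's aim ray until it crosses a
    flat ground plane; return the landing point, or None if it never
    lands within the step budget."""
    mag = math.sqrt(sum(c * c for c in direction))
    vx, vy, vz = (c / mag * speed for c in direction)  # normalized launch velocity
    x, y, z = origin
    for _ in range(max_steps):
        x += vx * dt
        y += vy * dt
        z += vz * dt
        vy -= gravity * dt  # gravity bends the arc downward
        if y <= ground_y:
            return (x, ground_y, z)  # landing point = teleport target
    return None
```

A real implementation raycasts each arc segment against scene geometry rather than a flat plane, but the arc-sampling loop is the heart of the gesture-driven teleport.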

Throwing

Throwing isn’t new to VR, but it’s one of the easiest interactions to “break” when adapting an experience from controllers to hands. With v83, we’ve improved the throwing interaction in Interaction SDK with more customization options for different use cases, along with a new sample scene that demonstrates different throwing styles and object behaviors. These examples include:
  • Darts and precision throws
  • Bowling and weighted arc motion
  • Frisbee-style flight paths
  • Cornhole, football, and basketball
With these samples, you have a faster path towards implementing throwing interactions that fit your experience’s physics, play style, or tone. Try them out in the Interaction SDK samples app and learn how to implement these features in the Interaction SDK documentation for throwing and locomotion.
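The customization options above ultimately shape one core estimate: the object's velocity at the moment of release. The underlying idea can be illustrated with a simple finite-difference sketch over the most recent hand samples (the function, window size, and sampling scheme are ours for illustration, not Interaction SDK API):

```python
def estimate_release_velocity(positions, timestamps, window=5):
    """Estimate release velocity (m/s) by averaging displacement over the
    last `window` hand samples. `positions` is a list of (x, y, z) hand
    positions; `timestamps` are the matching capture times in seconds."""
    pts = positions[-window:]
    ts = timestamps[-window:]
    if len(pts) < 2:
        raise ValueError("need at least two samples to estimate velocity")
    dt = ts[-1] - ts[0]
    # Net per-axis displacement over the window's elapsed time.
    return tuple((pts[-1][i] - pts[0][i]) / dt for i in range(3))
```

Tuning the window is the classic trade-off the samples let you explore per style: a short window reacts to the final wrist snap (darts), while a longer one smooths jitter for heavier, arcing throws (bowling).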

Get Started

In looking at trends across the ecosystem and talking with developers, hands are shifting from an optional feature into a primary interaction model. By leveraging natural actions that are commonplace in our daily lives, hand inputs lower barriers for new users, expand accessibility, strengthen presence, and open creative opportunities that traditional game controllers can’t match.
With v83, we’re taking another step toward enabling a wider array of experiences where the primary input is the one that feels most natural: our hands. If you’ve been considering designing for hands, now is the right moment to start. Visit the documentation (Unity | Unreal).
Like this content? Check out our release notes, subscribe to our monthly newsletter, and follow us on X and Facebook to be among the first to hear about the latest news, tips, and updates.
