How Hand Interactions Are Opening New Possibilities for VR Developers

When hand tracking first launched on the original Meta Quest headset in 2019, it was a glimpse into the future. In the time since, it's steadily evolved from an experimental feature into an input system powering real interactions like toggling switches, squeezing items, and scrolling menus. Players today can do what was impossible just a few years ago: snap their fingers to make objects vanish, use hand gestures to cast spells, or conduct entire orchestras using only their hands.
A few developers in particular — Aldin Dynamics (Waltz of the Wizard), Double Jack (Maestro), and Hengxin Shambala (Drakheir) — are proving that when you design with hands first, you can create interactions that controllers simply can’t match. Through their lens, let’s take a closer look at how hands have evolved, what these developers have learned, and why now is the right time to build with them.

Why Hands?

In the real world, you don’t think about how to use your hands… you just use your hands. That same instinct carries into VR, though it lands differently depending on the player. For newcomers, using their hands feels intuitive and welcoming. For seasoned VR users, it deepens immersion by making every movement feel more direct.
With no tutorials or button mappings to remember, interaction becomes as simple as a reach, a pinch, or a wave. This ease of use makes hand interactions especially powerful in experiences where your own movement is the mechanic.

Benefits

The developers we spoke with highlighted these four advantages:
  • Accessibility. Players don’t need gaming experience to understand reaching, pinching, or pointing.
  • Direct interactions. A snap of the finger to make an object vanish, a poke that makes a companion react, or a gesture that summons fire.
  • Design freedom. Props become part of play. In Maestro, chopsticks and pens doubled as batons, which helped fuel viral clips that reached way beyond the VR community.
  • Presence. Watching your own fingers curl and flex in VR makes the experience feel less like you're using a system and more like you’re actually there.

Limitations and Tradeoffs

However, hand interactions aren’t a replacement for every input. Developers who adopt them should consider the following:
  • No haptics. Vibrations need to be replaced with visual or audio cues.
  • Tracking gaps. Fast motions, occlusion, or poor lighting can break recognition, though tracking continues to become more reliable with each update.
  • Precision limits. Hand input isn’t built for FPS-level targeting.
  • Learning curve. Players perform gestures differently, so systems must be forgiving.

Maestro

At first, conducting an orchestra in VR seemed like a natural fit for Double Jack’s Maestro. As the developers put it, it was “a no-brainer pitch for a VR game” because conducting “hinges on the ability of the performer to move their hands in harmony with the music.”
Maestro started as an experiment in hand tracking capabilities and became a hands-first conducting game.
But early prototypes with controllers broke the illusion. The controllers’ weight and grip made even subtle movements feel stiff and labored, which made the experience feel more like a workout and less like artistry.
Implementing hands changed everything. Motions became lighter, more expressive, and closer to real-life conducting. To handle challenges like tracking loss, the team built features like dead reckoning (estimating baton movement when hands disappeared) and persistent on-screen hints.
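A dead-reckoning fallback like the one described above can be sketched as follows. This is an illustrative Python sketch under assumed mechanics (extrapolate from the last observed velocity, with damping); Maestro's actual implementation is not public.

```python
from dataclasses import dataclass

@dataclass
class Pose:
    x: float
    y: float

class DeadReckoner:
    """Keeps a tracked point moving plausibly through brief tracking loss
    by extrapolating from the last observed velocity."""

    def __init__(self):
        self.last_pose = None
        self.velocity = (0.0, 0.0)

    def update(self, pose, dt):
        """Call every frame with the tracked pose, or None during dropout."""
        if pose is not None:
            if self.last_pose is not None and dt > 0:
                self.velocity = ((pose.x - self.last_pose.x) / dt,
                                 (pose.y - self.last_pose.y) / dt)
            self.last_pose = pose
            return pose
        if self.last_pose is None:
            return None
        # Tracking lost: continue along the last known velocity,
        # damping it so the estimate settles instead of flying away.
        vx, vy = self.velocity
        estimate = Pose(self.last_pose.x + vx * dt, self.last_pose.y + vy * dt)
        self.velocity = (vx * 0.9, vy * 0.9)
        self.last_pose = estimate
        return estimate
```

In practice a production system would work in 3D, blend the estimate back toward real data when tracking resumes, and cap how long extrapolation runs before hiding the baton.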
Holding a real-world prop in lieu of a conductor’s baton… added that physical, kinetic feel to the experience, bringing users that much closer to the real sensations.

Double Jack, Maestro developer

Then something unexpected happened. Because the system tracked hands while holding objects, players started conducting with whatever was nearby: chopsticks, pens, and even cucumbers (yes, cucumbers). What started as a technical quirk quickly became a viral hook that drew attention online from people who’d never tried VR before.
Players transformed everyday objects into Maestro’s conductor batons.

Drakheir

For Hengxin Shambala, building their first-person action RPG Drakheir meant skipping conventional mechanics. Inspired by hand seal movements popular in Japanese animation, the team designed a spellcasting system that used combinations of gestures to unleash powerful attacks.
Drakheir’s hand seals combine gestures into vibrant spellcasting sequences.
The challenge was consistency. One player’s triangle seal was sharp and small, while another’s was wide and slow. Early tests caused misfires and frustration. The team solved this by broadening recognition ranges and removing gestures that overlapped. Instead of demanding perfection from each player input, the team reconfigured the system to capture the essence of each hand seal. The end result was fast and expressive spellcasting that celebrated variation rather than punishing it.
Everyone had different understandings of traditional actions and had different amplitudes of actions… we chose to expand the tolerance of gesture recognition in each game so that it could be triggered with nearly 70% of standard gesture actions.

Hengxin Shambala, Drakheir developer

Waltz of the Wizard

Aldin Dynamics built Waltz of the Wizard on the idea that VR should feel as natural as real life. Before hand tracking was available on Quest, the team filled the game’s wizard tower environment with bottles to break, magic potions to drink, and characters to poke.
When hand tracking finally did arrive in 2019, everything clicked. Players didn’t need instructions or complicated onboarding. They could simply reach, grab, or poke to their heart’s desire. Finger snapping quickly emerged as the game’s signature interaction: look at an object, snap your fingers, and watch it vanish. It was instantly understood, instantly delightful, and became one of the strongest examples of the potential of hand interactions.
Finger snapping became Waltz of the Wizard’s signature mechanic.
At the time, it was unusual for a VR title to offer such a broad set of hand interactions. Aldin began featuring it in their marketing, promoting the wizard’s tower as one of the most extensive showcases of natural interaction available. Over time, hands moved from being one feature among many to a centerpiece of how they present the game.
The most powerful thing about handtracking is the ability to put anyone in the experience and have them interact entirely naturally. No explanations needed, because we all know how to exist in a reality.

Aldin Dynamics, Waltz of the Wizard developer

Best Practices

A few lessons stand out from the above case studies.
  • Start with hands. It’s easier to add controllers later than retrofit hand input. Maestro succeeded because it was designed for expressive conducting from day one.
  • Use the SDKs. The Interaction SDK and Microgestures OpenXR extension remove many of the barriers to adoption with proven interactions like grab, poke, thumb swipes, and teleport locomotion.
  • Design for clear feedback. Without haptics, players need other signals. In Waltz of the Wizard, a snap comes with a crackle of sound and a burst of light.
  • Playtest extensively. Gestures have “accents.” In Drakheir, testing showed just how differently players formed the same seals, which ultimately led to a much more forgiving system.
  • Keep it forgiving. Simplify gestures and expand recognition ranges so players feel confident instead of frustrated.
  • Match the mechanic. Hands shine in interactions built on natural motion, like spellcasting, conducting, and poking. For fast-twitch shooting or precision work, controllers make more sense.

Now’s the Time to Start Thinking about Hand Tracking

Hand tracking has improved with every release since its debut. Hand Tracking 2.3 reduced jitter and fingertip drift to make throws and tools feel more natural. Microgestures added thumb swipes and taps for quick navigation. Multimodal input lets one hand stay free while the other uses a controller for accuracy. And the new Interaction SDK Quick Actions menu in Unity makes it easier than ever to implement hand interactions by simply right-clicking on the objects you want to make interactable.
With hand-tracking support now shipped as a core feature, developers have another creative tool in their arsenal to stand out in the Meta Horizon Store.

Getting Started with Hand Tracking

It’s easier than ever to test hands in your current project. Try these first steps, prototype, and see whether hand input fits into your experience.
  • Check out the Hands Design Guidelines.
  • Add a manifest flag to enable Hand Tracking.
  • Use the Interaction SDK (Unity, Unreal) for grab, poke, ray, and locomotion.
  • Add the Microgestures (Unity, Unreal) extension for thumb swipes and taps.
  • Explore sample projects like First Hand (App, GitHub) and Move Fast (App, GitHub).
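For the manifest step above, the entries typically look like the following in an Android-based Quest build. This is a sketch based on Meta's published manifest requirements; verify the exact names against the current Hands documentation for your engine and SDK version.

```xml
<!-- AndroidManifest.xml: opt in to hand tracking on Quest -->
<uses-permission android:name="com.oculus.permission.HAND_TRACKING" />
<!-- required="false" keeps the app installable where hands are unavailable -->
<uses-feature
    android:name="oculus.software.handtracking"
    android:required="false" />
```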

Proven Design Patterns

Additionally, if you are considering hands, here are some patterns worth exploring:
  • Pinch = select. A natural stand-in for clicks across UIs and menus.
  • Tap/poke = activate. Use direct touch or a thumb tap microgesture for quick selections.
  • Thumb swipes = navigation. D-pad-style input for turning, stepping, and teleporting.
  • Broad gestures = actions. Large movements map well to powers, tools, or other expressive actions.
Pro Tip: Use the SDKs. The Interaction SDK and Microgestures OpenXR extension offer proven interactions like grab, poke, and ray input, plus locomotion features such as teleport and thumb-swipe turning. Together, they remove key barriers to adopting hands so you can focus on creating great experiences.
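To make the "pinch = select" pattern concrete, here is a minimal sketch of pinch detection: treat thumb tip and index tip proximity as a press, with hysteresis so the state doesn't flicker near the threshold. The distances are illustrative guesses, not values from any SDK; the Interaction SDK provides this out of the box.

```python
class PinchDetector:
    """Pinch = select, with hysteresis: the fingertips must separate
    noticeably farther than the press distance before releasing."""

    def __init__(self, press_dist=0.02, release_dist=0.04):
        self.press_dist = press_dist      # meters: this close => pinch starts
        self.release_dist = release_dist  # meters: this far => pinch ends
        self.pinching = False

    def update(self, thumb_tip, index_tip):
        """thumb_tip and index_tip are (x, y, z) positions in meters."""
        dx = thumb_tip[0] - index_tip[0]
        dy = thumb_tip[1] - index_tip[1]
        dz = thumb_tip[2] - index_tip[2]
        dist = (dx * dx + dy * dy + dz * dz) ** 0.5
        if self.pinching:
            if dist > self.release_dist:
                self.pinching = False
        elif dist < self.press_dist:
            self.pinching = True
        return self.pinching
```

The gap between `press_dist` and `release_dist` is what keeps a selection stable while the player's fingertips hover near the trigger point.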

Looking Ahead

Meta’s roadmap is closing the gap between hands and controllers. Faster acquisition, lower latency, and an expanded library of interactions are making hand tracking experiences better with each update.
For developers, the opportunity is clear: build projects that stand out, spread awareness through clips and word-of-mouth, and remind players how VR is constantly delivering fresh experiences. Simply put, designing with hands today means you’ll be ready for what’s coming tomorrow.
Keep up with the latest developer tips and insights by checking out the Horizon OS release notes, subscribing to our monthly developer newsletter, and following us on X and Facebook.
