Introducing ‘First Hand,’ Our Official Hand Tracking Demo Built With Presence Platform’s Interaction SDK

Today we’re excited to announce the launch of First Hand, an official demo for Hand Tracking built by Meta with our Interaction SDK. First Hand lets you experience the magic of interacting with virtual worlds directly with your hands. Use switches, levers, and virtual UIs as you solve puzzles to build robotic gloves—and then experience their superpowers.
We built First Hand to demonstrate the capabilities and variety of interaction models possible with Hands. The demo was built using Presence Platform’s Interaction SDK, a library of interaction components, gestures, and tools to build hand and controller interactions. First Hand is shipping on App Lab and as an open source project for developers who would like to understand and easily replicate similar interactions in their own games and apps.
We drew inspiration for this demo from Oculus First Contact, our original experience showcasing Touch controller interactions. Not only was Oculus First Contact a delightful experience, it also redefined the types of interactions possible with 6DOF controllers. With First Hand, we hope to similarly inspire developers building groundbreaking Hands content.

The Power of Direct Interactions

Building great Hands-driven experiences requires optimizing across multiple constraints: technical (like tracking capabilities), physiological (like comfort and fatigue), and perceptual (like how hands and virtual objects are represented when they interact).
First Hand showcases some of the Hands interactions that we’ve found to be the most magical, robust, and easy to learn, and that also apply across many categories of content. Notably, we rely heavily on direct interactions. With the advanced direct touch heuristics that come out of the box with Interaction SDK (like touch limiting, which prevents your finger from accidentally traversing buttons), interacting with 2D UIs and buttons in VR feels natural.
We also showcase several of the grab techniques offered by the SDK. There’s something visceral about directly interacting with the virtual world with your hands, but we’ve found that these interactions need careful tuning to really work. In the app, you can experiment with a variety of object classes (small, large, constrained, two-handed) and even crush a rock by squeezing it hard enough. A simplified sketch of that squeeze check follows below.
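To give a feel for how a squeeze interaction like that can be wired up, here’s a minimal sketch built on the pinch-strength values exposed by OVRHand rather than First Hand’s actual code. The CrushableRock class, the IsHeld hook, the fractured-model swap, and the 0.9 threshold are all illustrative assumptions:

```csharp
using UnityEngine;

// Hypothetical sketch: while the rock is held, crush it once the
// squeeze passes a threshold. Not First Hand's actual implementation.
public class CrushableRock : MonoBehaviour
{
    [SerializeField] private OVRHand _hand;                 // tracked hand holding the rock
    [SerializeField] private GameObject _fracturedVersion;  // hypothetical swapped-in model
    [SerializeField] private float _crushThreshold = 0.9f;  // illustrative tuning value

    public bool IsHeld { get; set; } // set by your grab logic (hypothetical hook)

    private void Update()
    {
        if (!IsHeld) return;

        // GetFingerPinchStrength returns 0..1 per finger; average a few
        // fingers as a rough proxy for squeeze pressure.
        float squeeze =
            (_hand.GetFingerPinchStrength(OVRHand.HandFinger.Index) +
             _hand.GetFingerPinchStrength(OVRHand.HandFinger.Middle) +
             _hand.GetFingerPinchStrength(OVRHand.HandFinger.Ring)) / 3f;

        if (squeeze >= _crushThreshold)
        {
            // Swap in the fractured model and remove the intact rock.
            Instantiate(_fracturedVersion, transform.position, transform.rotation);
            Destroy(gameObject);
        }
    }
}
```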

10 Tips to Get Started with Interaction SDK

Building great Hands experiences with Interaction SDK can be really easy. Here are 10 tips from the team who built First Hand to help you get started:
  1. In the past, we’ve manually authored hand poses for every item in a game, and it could take a few iterations before a pose felt natural. With Interaction SDK’s Hand Pose Authoring Tool, we could launch the Editor, reach out and hold the item in a way that felt natural, and record the pose for immediate use. Recording poses became so simple that within a minute we could have more poses than we could ever need for an item, which we would then filter down to a core set.
  2. Interaction SDK supports tracked hands and controller “Hands” out of the box with a structure that allows both input systems the flexibility to play to their strengths. “Hands” use a very modular system to extend the basic functionality with additional features like velocity tracking and pose recognition. If you’re planning to support tracked hands and controllers, make sure that, as you add more features to your tracked hand inputs, you add the features to the controller inputs too.
  3. Predictably, a great chunk of Interaction SDK’s functionality is driven by “Hands,” and you can expect to find many areas in the SDK where you’ll have to link up “Hands” to your functionality. To make this easier to manage, Interaction SDK provides “HandRef,” a component that acts as a substitute/proxy for a real “Hand” object. This is especially useful in prefabs where swathes of functionality can be wired up to a HandRef within the prefab—then only the HandRef needs to be linked to the real “Hand.”
  4. Many of Interaction SDK’s features, like pose recognition, gesture recognition, and collision detection, expose their functionality through the IActiveState interface, which is about as basic as it gets: it returns true or false based on the current state. Multiple ActiveStates can be combined to trigger any number of gameplay events, and the majority of gameplay in First Hand is driven this way (see the first sketch after this list).
  5. You can build touchable interfaces using Unity UI in exactly the same way you’re used to. Adding a PointableCanvas to the UI Canvas is all that’s needed; everything else just works (a short example follows the list).
  6. “Snap Interactors” are great for drop-zone interactions and for fixing items in place. They’re used throughout First Hand, especially during the glove-building sequence, where Snap Interactors drive the build’s progression.
  7. For small objects, particularly objects that don't have a natural orientation or grab point, it can feel restrictive to use pre-authored hand poses since the player could hold them however they like. Interaction SDK’s “Touch Grab” feature uses the physics shape of the object to create poses for the hand dynamically. Examples of this in First Hand are the glove parts, which the player can pick up in whatever way feels natural to them.
  8. Interaction SDK’s “Distance Grab” is easy to configure: essentially a cone extending out from the hand that selects the best candidate item. When setting up an item for “Distance Grab,” it makes sense and saves time to reuse the same hand poses that were set up for regular grab (a simplified illustration of the cone selection follows the list).
  9. Creating prompts to guide the player through First Hand worked especially well. Using Hands through Link, we were able to create animation clips directly within the Editor. They took very little time to create and were usable immediately.
  10. There’s plenty going on in First Hand: interactions, effects, audio, and unique gameplay elements. Managing how all of these are triggered is made easier by Interaction SDK’s extensive catalog of events; nearly every feature in the SDK provides some way of hooking into its events, making it easy to trigger custom behavior (see the final sketch below).
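To make a few of these tips concrete, here are some short sketches. First, tip 4: driving gameplay from an IActiveState. This assumes the Oculus.Interaction namespace and its Interface attribute for serializing interface references; ActiveStateTrigger and its UnityEvent are illustrative names, and the SDK ships its own event-wrapper components that work along similar lines.

```csharp
using Oculus.Interaction;
using UnityEngine;
using UnityEngine.Events;

// Sketch: fire a UnityEvent on the rising edge of any IActiveState
// (a pose recognizer, a gesture, a combined ActiveState group, and so on).
// ActiveStates are polled state, not events, hence the Update loop.
public class ActiveStateTrigger : MonoBehaviour
{
    // Any MonoBehaviour implementing IActiveState can be dropped in here.
    [SerializeField, Interface(typeof(IActiveState))]
    private MonoBehaviour _activeState;

    [SerializeField] private UnityEvent _whenActivated;

    private IActiveState ActiveState => _activeState as IActiveState;
    private bool _wasActive;

    private void Update()
    {
        bool isActive = ActiveState.Active;
        if (isActive && !_wasActive)
        {
            // The state just turned on: trigger the gameplay event once.
            _whenActivated.Invoke();
        }
        _wasActive = isActive;
    }
}
```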
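For tip 5, the point is that your Unity UI code doesn’t change at all. In this sketch, GloveStationUI and its button are made up for illustration; the only SDK-side work happens in the Editor, by adding a PointableCanvas to the Canvas (and the SDK’s canvas input module to the EventSystem):

```csharp
using UnityEngine;
using UnityEngine.UI;

// Plain Unity UI: with PointableCanvas set up on the Canvas, this same
// onClick handler fires when the user pokes the button with a tracked hand.
public class GloveStationUI : MonoBehaviour
{
    [SerializeField] private Button _assembleButton;

    private void OnEnable() => _assembleButton.onClick.AddListener(OnAssemble);
    private void OnDisable() => _assembleButton.onClick.RemoveListener(OnAssemble);

    private void OnAssemble()
    {
        Debug.Log("Assemble pressed via direct touch");
    }
}
```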
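For tip 8, here’s a simplified, self-contained illustration of the cone idea behind Distance Grab. This is not the SDK’s implementation, and the angle limit, distance limit, and scoring weights are arbitrary tuning values:

```csharp
using UnityEngine;

// Illustration only: score every candidate inside a cone extending from the
// hand and return the best one, preferring items near the cone's axis.
public static class ConeSelector
{
    public static Transform BestCandidate(
        Transform hand, Transform[] candidates,
        float maxAngleDeg = 20f, float maxDistance = 3f)
    {
        Transform best = null;
        float bestScore = float.MinValue;

        foreach (var candidate in candidates)
        {
            Vector3 toItem = candidate.position - hand.position;
            float distance = toItem.magnitude;
            float angle = Vector3.Angle(hand.forward, toItem);
            if (angle > maxAngleDeg || distance > maxDistance)
            {
                continue; // outside the cone
            }

            // Weight alignment with the cone's axis over raw proximity.
            float score = (1f - angle / maxAngleDeg)
                        + 0.25f * (1f - distance / maxDistance);
            if (score > bestScore)
            {
                bestScore = score;
                best = candidate;
            }
        }
        return best;
    }
}
```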
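Finally, for tip 10, a sketch of hooking into those events. It assumes a Grabbable on the same GameObject, which, as a pointable element, raises WhenPointerEventRaised; GrabFeedback and the AudioSource hook are illustrative, and exact event and enum names may vary between SDK versions:

```csharp
using Oculus.Interaction;
using UnityEngine;

// Sketch: react to an interactable's pointer events to trigger custom
// behavior (audio here, but effects or gameplay work the same way).
[RequireComponent(typeof(Grabbable))]
public class GrabFeedback : MonoBehaviour
{
    [SerializeField] private AudioSource _grabSound; // illustrative feedback hook

    private Grabbable _grabbable;

    private void Awake() => _grabbable = GetComponent<Grabbable>();

    private void OnEnable() => _grabbable.WhenPointerEventRaised += OnPointerEvent;
    private void OnDisable() => _grabbable.WhenPointerEventRaised -= OnPointerEvent;

    private void OnPointerEvent(PointerEvent evt)
    {
        if (evt.Type == PointerEventType.Select) // a grab just began
        {
            _grabSound.Play();
        }
    }
}
```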