Today we’re excited to announce the launch of First Hand, an official demo for Hand Tracking built by Meta with our Interaction SDK.
First Hand lets you experience the magic of interacting with virtual worlds directly with your hands. Use switches, levers, and virtual UIs as you solve puzzles to build robotic gloves—and then experience their superpowers.
We built First Hand to demonstrate the capabilities and variety of interaction models possible with Hands. The demo was built using Presence Platform’s Interaction SDK, a library of interaction components, gestures, and tools to build hand and controller interactions.
First Hand is shipping on App Lab and as an open source project for developers who would like to understand and easily replicate similar interactions in their own games and apps.
We drew inspiration for this demo from Oculus First Contact, our original experience showcasing Touch controller interactions. Not only was Oculus First Contact a delightful experience, it also redefined the types of interactions possible with 6DOF controllers. With First Hand, we hope to similarly inspire developers building groundbreaking Hands content.
The Power of Direct Interactions
Building great Hands-driven experiences requires optimizing across multiple constraints: technical (like tracking capabilities), physiological (like comfort and fatigue), and perceptual (like how the hand is represented when it interacts with virtual objects).
First Hand showcases some of the Hands interactions that we’ve found to be the most magical, robust, and easy to learn, but that are also applicable to many categories of content. Notably, we rely heavily on direct interactions. With the advanced direct touch heuristics that come out of the box with Interaction SDK (like touch limiting, which prevents your finger from accidentally traversing buttons), interacting with 2D UIs and buttons in VR feels really natural.
We also showcase several of the grab techniques offered by the SDK. There’s something visceral about directly interacting with the virtual world with your hands, but we’ve found that these interactions also need careful tuning to really work. In the app, you can experiment by interacting with a variety of object classes (small, large, constrained, two-handed) and even crush a rock by squeezing it hard enough.
10 Tips to Get Started with Interaction SDK
Building great Hands experiences with Interaction SDK can be really easy. Here are 10 tips from the team who built First Hand to help you get started:
- In the past, we’ve manually authored hand poses for every item in a game, and it could take a few iterations before a pose felt natural. With Interaction SDK’s Hand Pose Authoring Tool, we could launch the Editor, reach out and hold the item in a way that felt natural, and record the pose; it was usable straight away. Recording poses became so simple that within a minute we could capture more poses than we could ever need for an item, then filter them down to a core set.
- Interaction SDK supports tracked hands and controller “Hands” out of the box with a structure that allows both input systems the flexibility to play to their strengths. “Hands” use a very modular system to extend the basic functionality with additional features like velocity tracking and pose recognition. If you’re planning to support tracked hands and controllers, make sure that, as you add more features to your tracked hand inputs, you add the features to the controller inputs too.
- Predictably, a great chunk of Interaction SDK’s functionality is driven by “Hands,” and you can expect to find many areas in the SDK where you’ll have to link up “Hands” to your functionality. To make this easier to manage, Interaction SDK provides “HandRef,” a component that acts as a substitute/proxy for a real “Hand” object. This is especially useful in prefabs, where swathes of functionality can be wired up to a HandRef within the prefab; then only the HandRef needs to be linked to the real “Hand” (see the spawner sketch after this list).
- Many of Interaction SDK’s features, like pose recognition, gesture recognition, and collision detection, expose their functionality through the IActiveState interface, which is about as simple as it gets: it returns either true or false based on the current state. Multiple ActiveStates can be combined together to trigger any number of gameplay events, and the majority of gameplay in First Hand is driven this way (see the ActiveState sketch after this list).
- You can build touchable interfaces using Unity UI in exactly the same way you’re used to. Adding a PointableCanvas to the UI Canvas is all that’s needed; everything else just works (see the canvas sketch after this list).
- “Snap Interactors” are great for drop zone interactions and for fixing items in place. They’re used extensively throughout First Hand, especially during the glove-building sequence, where “Snap Interactors” drive its progression (see the build-step sketch after this list).
- For small objects, particularly objects that don't have a natural orientation or grab point, it can feel restrictive to use pre-authored hand poses since the player could hold them however they like. Interaction SDK’s “Touch Grab” feature uses the physics shape of the object to create poses for the hand dynamically. Examples of this in First Hand are the glove parts, which the player can pick up in whatever way feels natural to them.
- Interaction SDK’s “Distance Grab” is easy to configure: essentially a cone extending out from the hand that selects the best candidate item (the cone-selection sketch after this list illustrates the idea). When setting up an item for “Distance Grab,” it makes sense and saves time to reuse the same hand poses that were set up for regular grab.
- Creating prompts to guide the player through First Hand worked especially well. Using Hands through Link, we were able to create animation clips directly within the Editor. They took very little time to create and were usable immediately.
- There’s plenty going on in First Hand: interactions, effects, audio, and unique gameplay elements. Interaction SDK’s extensive catalog of events made managing how all of these are triggered much easier. Nearly every feature in Interaction SDK provides some way of hooking into its events, making it easy to trigger custom behavior (see the feedback sketch after this list).
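To make tip 3 concrete, here’s a minimal sketch of resolving a prefab’s HandRef against the real Hand at spawn time. It assumes HandRef follows the SDK’s usual InjectAll* dependency-injection pattern; the spawner class and field names are our own.

```csharp
using Oculus.Interaction.Input;
using UnityEngine;

// A minimal sketch, assuming HandRef exposes the SDK's InjectAll*
// pattern. Everything inside the prefab is wired to a single HandRef,
// so one injection hooks up the whole interaction bundle.
public class HandPrefabSpawner : MonoBehaviour
{
    [SerializeField] private GameObject _interactionPrefab; // contains a HandRef
    [SerializeField] private Hand _trackedHand;             // the real Hand in the rig

    private void Start()
    {
        GameObject instance = Instantiate(_interactionPrefab);

        // Resolve the proxy: components wired to the HandRef now
        // receive tracking data from the real Hand.
        HandRef handRef = instance.GetComponentInChildren<HandRef>();
        handRef.InjectAllHandRef(_trackedHand);
    }
}
```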
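For tip 4, the IActiveState contract really is a single bool. Below is a sketch of a custom state (true while a target sits inside a trigger volume) plus a small rising-edge trigger for hooking up gameplay; both classes are our own illustration, and we assume the SDK’s [Interface] attribute for serializing interface references.

```csharp
using Oculus.Interaction;
using UnityEngine;
using UnityEngine.Events;

// A custom IActiveState: Active while the target is inside the zone.
public class ObjectInZoneActiveState : MonoBehaviour, IActiveState
{
    [SerializeField] private Collider _zone;
    [SerializeField] private Transform _target;

    public bool Active => _zone.bounds.Contains(_target.position);
}

// Fires a UnityEvent on the rising edge of any IActiveState so that
// designers can wire gameplay to it in the Inspector.
public class ActiveStateTrigger : MonoBehaviour
{
    // [Interface] lets an interface-typed reference be serialized;
    // we assume the attribute here, as used throughout the SDK.
    [SerializeField, Interface(typeof(IActiveState))]
    private UnityEngine.Object _activeState;
    private IActiveState ActiveState => _activeState as IActiveState;

    public UnityEvent WhenActivated;

    private bool _wasActive;

    private void Update()
    {
        bool active = ActiveState.Active;
        if (active && !_wasActive) WhenActivated.Invoke();
        _wasActive = active;
    }
}
```

Before rolling your own plumbing like this, check the SDK’s built-in components; it already ships ways to combine and react to ActiveStates.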
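Tip 5 is mostly editor setup, but the same wiring can be done from code. A sketch, assuming PointableCanvas follows the InjectAll* pattern and that a PointableCanvasModule (the SDK’s bridge into Unity UI input) is already in the scene:

```csharp
using Oculus.Interaction;
using UnityEngine;

// A minimal sketch: make an existing Unity UI Canvas touchable.
// Assumes a PointableCanvasModule is present in the scene and that
// PointableCanvas exposes the SDK's usual InjectAll* method.
[RequireComponent(typeof(Canvas))]
public class TouchableCanvasSetup : MonoBehaviour
{
    private void Awake()
    {
        var canvas = GetComponent<Canvas>();

        // PointableCanvas forwards ISDK pointer events to this Canvas;
        // standard Buttons, Sliders, and ScrollRects keep working as-is.
        var pointable = gameObject.AddComponent<PointableCanvas>();
        pointable.InjectAllPointableCanvas(canvas);
    }
}
```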
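For tip 6, sequence progression can hang off a snap zone’s state changes. A sketch, assuming SnapInteractable raises the interactable family’s standard WhenStateChanged event; the step logic itself is illustrative:

```csharp
using Oculus.Interaction;
using UnityEngine;

// A minimal sketch: advance a build step when a part snaps into its
// drop zone. On a snap zone, entering the Select state means an item
// has locked into place.
public class BuildStep : MonoBehaviour
{
    [SerializeField] private SnapInteractable _dropZone;

    private void OnEnable() => _dropZone.WhenStateChanged += OnZoneStateChanged;
    private void OnDisable() => _dropZone.WhenStateChanged -= OnZoneStateChanged;

    private void OnZoneStateChanged(InteractableStateChangeArgs args)
    {
        if (args.NewState == InteractableState.Select)
        {
            Debug.Log($"{name}: part snapped in, advancing the sequence.");
        }
    }
}
```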
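Tip 8’s cone selection is worth understanding even if you never reimplement it. The following is an illustration of the idea only, not Interaction SDK code: collect the items inside a cone extending from the hand, score them by angular distance from the cone’s axis, and pick the best candidate.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Illustration of cone-based candidate selection (not SDK code).
public static class ConeSelector
{
    public static Transform BestCandidate(
        Transform hand, IEnumerable<Transform> items,
        float maxAngleDeg = 20f, float maxDistance = 3f)
    {
        Transform best = null;
        float bestAngle = maxAngleDeg;

        foreach (var item in items)
        {
            Vector3 toItem = item.position - hand.position;
            if (toItem.magnitude > maxDistance) continue; // beyond the cone's reach

            // Smaller angle from the hand's forward axis = better candidate.
            float angle = Vector3.Angle(hand.forward, toItem);
            if (angle <= bestAngle)
            {
                bestAngle = angle;
                best = item;
            }
        }
        return best;
    }
}
```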
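Finally, for tip 10, here’s one way custom feedback can hook into those events, assuming the interactable implements the SDK’s IPointable and raises WhenPointerEventRaised; the feedback itself (a sound on select) is our own example.

```csharp
using Oculus.Interaction;
using UnityEngine;

// A minimal sketch: trigger custom behavior (audio, VFX, gameplay)
// from an interactable's pointer events.
public class GrabFeedback : MonoBehaviour
{
    [SerializeField, Interface(typeof(IPointable))]
    private UnityEngine.Object _pointable; // any ISDK interactable
    [SerializeField] private AudioSource _grabSound;

    private IPointable Pointable => _pointable as IPointable;

    private void OnEnable() => Pointable.WhenPointerEventRaised += OnPointerEvent;
    private void OnDisable() => Pointable.WhenPointerEventRaised -= OnPointerEvent;

    private void OnPointerEvent(PointerEvent evt)
    {
        // Select fires when the hand grabs or presses the interactable.
        if (evt.Type == PointerEventType.Select)
        {
            _grabSound.Play();
        }
    }
}
```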