WebXR Hands
Hand tracking enables the use of hands as an alternative input method to controllers for navigating and interacting with UI in Meta Horizon Home and a number of native apps. Browser already supports hand tracking while browsing 2D web pages, and official support for hands in WebXR experiences in the browser is now available as well. This opens up endless possibilities for more natural, controller-free interactions that provide a new level of presence and social engagement.
The WebXR Hand Explainer is a good place to get started on implementing hands in WebXR. The API exposes the poses of the 25 skeleton joints (shown in the figure below) in each of the user’s hands, which can be used for gesture recognition or for rendering hand models. Hands appear as an XRInputSource alongside controllers. It is worth noting that the targetRaySpace for hands is populated with an emulated ray that matches the behavior of the pointing ray of hands in Meta Horizon Home; this ray assumes that the UI is in front of the user, so the target rays will always point in that general direction.
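As a quick illustration of how hands surface through the input API, here is a minimal sketch (not an official sample) that requests a session with the hand-tracking feature and reads the emulated target ray of each hand input source every frame; the choice of an 'immersive-vr' session and a 'local' reference space is an assumption.

```javascript
// Minimal sketch: request a session with hand tracking enabled and read
// the emulated target ray for each hand input source every frame.
const session = await navigator.xr.requestSession('immersive-vr', {
  optionalFeatures: ['hand-tracking'],
});
const refSpace = await session.requestReferenceSpace('local');

session.requestAnimationFrame(function onFrame(time, frame) {
  for (const inputSource of session.inputSources) {
    // inputSource.hand is only present for hand input sources.
    if (!inputSource.hand) continue;

    const rayPose = frame.getPose(inputSource.targetRaySpace, refSpace);
    if (rayPose) {
      // rayPose.transform is the origin/orientation of the emulated
      // pointing ray, comparable to the system pointer in Horizon Home.
      console.log(inputSource.handedness, rayPose.transform.position);
    }
  }
  session.requestAnimationFrame(onFrame);
});
```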
Hands in WebXR on Meta Quest headsets reserve the palm pinch gesture on both hands for system use. To do a palm pinch on either hand, look at your palm at eye level, then hold your thumb and index finger together until the menu icon (left hand) or the Meta Quest icon (right hand) fills up, then release. Performing a palm pinch on the left hand is equivalent to pressing the menu button on your left controller, which takes the user out of the WebXR session. The palm pinch gesture is not to be confused with the regular pinch gesture, which is exposed through the hand’s emulated Gamepad attribute as a button press. More gestures that are commonly used with hands elsewhere, like point-and-pinch and pinch-and-scroll, can be found here; these can serve as a reference point for implementing custom gestures with hands in WebXR. For more information on best practices with hands, check out the best practices.
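For reference, here is a hedged sketch of two ways to react to the regular pinch: through the standard select events, and by polling the emulated Gamepad. The assumption that the pinch maps to the primary button (index 0) follows the usual xr-standard gamepad layout and should be verified against the actual profile.

```javascript
// Sketch: two ways to react to the regular (index-thumb) pinch on hands.

// 1) The select events fire for the primary action of any input source,
//    which for hands is the pinch.
session.addEventListener('selectstart', (event) => {
  if (event.inputSource.hand) {
    console.log(`${event.inputSource.handedness} hand pinched`);
  }
});

// 2) Poll the emulated Gamepad in the frame loop. Assumes the pinch is
//    mapped to the primary button (index 0); verify against the profile.
function isPinching(inputSource) {
  const gamepad = inputSource.gamepad;
  return !!gamepad && gamepad.buttons.length > 0 && gamepad.buttons[0].pressed;
}
```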
The process of implementing a WebXR hands experience can be divided into three parts:
- Load hand models: Meta contributed a set of standard hand model assets to the public WebXR Input Profiles library under the profile id of “generic-hand”, which is one of the profile ids of the XRInputSource object for hands. Custom hand models can also be used, as long as they comply with the joint hierarchy and placement specified in the WebXR Hand Input Module Specs’ Skeleton Joints Section (see the model-loading sketch after this list).
- Get hand tracking data: The joint pose data can be retrieved via the XRFrame interface. There are currently two APIs for getting the joints’ poses: getJointPose retrieves joint poses one at a time, which is recommended when the hand model being used has a nested joint hierarchy (meaning the joints have different base spaces); fillPoses retrieves all joint poses relative to the same base space at once, which is recommended when a model with a flat joint hierarchy is used (like the one Browser published). If a 3D library is used, the joint pose data might be available via higher-level APIs: in THREE.js, the joint data are kept and updated in the “joints” field of the WebXRController class; in Babylon.js, 25 joint meshes that are invisible by default are created and stored in the “trackedMeshes” field of the WebXRHand class.
- Update joint nodes in hand models with hand tracking data: Once you have loaded the hand models and have access to the joints’ tracking data, the rest is as simple as copying the position and orientation of each joint from the hand tracking data over to the respective hand models. The joint-name-to-index mapping is available in the WebXR Hand Input Module Specs’ Skeleton Joints Section (see the per-frame update sketch after this list).
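To illustrate the first step, here is a hedged sketch of loading the “generic-hand” asset with the fetchProfile helper from the @webxr-input-profiles/motion-controllers package and a three.js GLTFLoader. The CDN base path shown is an assumption; any hosted copy of the library’s profiles and assets can stand in for it.

```javascript
import { fetchProfile } from '@webxr-input-profiles/motion-controllers';
import { GLTFLoader } from 'three/examples/jsm/loaders/GLTFLoader.js';

// Base path to a hosted copy of the WebXR Input Profiles assets.
// This particular CDN path is an assumption; point it at wherever you
// host the library's profiles.
const PROFILES_PATH =
  'https://cdn.jsdelivr.net/npm/@webxr-input-profiles/motion-controllers@1.0/dist/profiles';

async function loadHandModel(inputSource) {
  // fetchProfile matches the input source's profile ids ("generic-hand"
  // for hands) against the library and resolves the model asset path.
  const { assetPath } = await fetchProfile(inputSource, PROFILES_PATH);
  const gltf = await new GLTFLoader().loadAsync(assetPath);
  return gltf.scene; // hand mesh whose nodes follow the spec's joint names
}
```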
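For the second and third steps, the sketch below fills all joint poses at once with fillPoses and copies them onto a flat-hierarchy model. The jointNodes lookup keyed by spec joint names is an assumption, as are the three.js-style node fields (matrix, position, quaternion, scale); a nested-hierarchy model would call getJointPose per joint instead.

```javascript
const JOINT_COUNT = 25;
// fillPoses writes one 4x4 matrix (16 floats) per joint space.
const jointMatrices = new Float32Array(16 * JOINT_COUNT);

function updateHandModel(frame, refSpace, hand, jointNodes) {
  // XRHand behaves like a Map of jointName -> XRJointSpace.
  const jointSpaces = Array.from(hand.values());
  if (!frame.fillPoses(jointSpaces, refSpace, jointMatrices)) return;

  let i = 0;
  for (const jointName of hand.keys()) {
    const node = jointNodes[jointName]; // assumed lookup by joint name
    if (node) {
      // Copy the tracked pose matrix onto the model node
      // (three.js-style Object3D API assumed).
      node.matrix.fromArray(jointMatrices, i * 16);
      node.matrix.decompose(node.position, node.quaternion, node.scale);
    }
    i++;
  }
}
```

Calling updateHandModel once per hand inside the requestAnimationFrame callback keeps the rendered model in sync with the tracked joints.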
WebXR hands examples: