Whether it’s shooting at a target, chatting with others at a poker table, or completing a workout, natural and seamless representations of movement are at the core of many great immersive VR experiences. With Presence Platform’s Movement SDK, you can expand your app’s immersive repertoire to foster a deeper sense of presence with new body, face, and eye tracking capabilities. These capabilities were designed to help you build more accurate and consistent immersive experiences on our most advanced headset, Meta Quest Pro, but you can also integrate some of these capabilities into experiences for Meta Quest 2.
Body, face, and eye tracking are fueling new possibilities for how people interact with in-app environments and how they express themselves in VR. You can get started with Movement SDK today by downloading our new sample on GitHub, which contains four scenes that showcase its flexibility across different use cases. These scenes include:
- “Aura,” our fantastical alien character that demonstrates face tracking capabilities
- A realistic, high-fidelity character demonstrating how body and face tracking can be used in an AAA game experience
- Hip-pinning, which demonstrates how body tracking can interact realistically with virtual objects
- Retargeting, which demonstrates how you can take movements from three-point body tracking and map them onto an arbitrary humanoid character model
See how these scenes showcase each component of Movement SDK, and use them as inspiration for helping people collaborate, connect, and play with deeper feelings of social presence.
Body Tracking
Body tracking uses a combination of headset, controller, or hand inputs to infer people’s upper body poses, making it possible to accurately represent the nearly endless spectrum of poses people strike in VR. The underlying model is based on an ML-driven understanding of prior motion and removes unnatural movement caused by tracking losses.
The GIF above shows a high-fidelity character with upper body, face, and eye tracking that’s convincingly realistic. Three-point body tracking enables the Body Tracking API to map an abstract series of points representing joints onto an in-app character. This capability lets you build experiences where people’s gestures and interactions are represented with deeper expression and positional accuracy, and it helps prevent breaks that can hamper engagement and immersion. The API stitches the hand skeleton together with the rest of the body and automatically transitions between hands and controllers. The character is driven by the bones shown in the following image, and the same skeleton can be used to animate a character of your choice.
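For native apps, this data is surfaced through the XR_FB_body_tracking OpenXR extension (the Unity and Unreal integrations wrap the same tracking data). The sketch below shows the basic polling pattern; the helper names and the surrounding frame-loop wiring are assumptions, error handling is omitted, and the extension must be enabled at instance creation and granted the appropriate permission on device.

```c
/* Minimal sketch: polling upper-body joint poses via XR_FB_body_tracking.
 * Assumes an initialized XrInstance and XrSession, a base XrSpace, and a
 * predicted display time from the app's frame loop. */
#include <openxr/openxr.h>

static PFN_xrCreateBodyTrackerFB pfnCreateBodyTrackerFB;
static PFN_xrLocateBodyJointsFB  pfnLocateBodyJointsFB;

XrBodyTrackerFB create_body_tracker(XrInstance instance, XrSession session)
{
    /* Extension entry points are loaded at runtime. */
    xrGetInstanceProcAddr(instance, "xrCreateBodyTrackerFB",
                          (PFN_xrVoidFunction*)&pfnCreateBodyTrackerFB);
    xrGetInstanceProcAddr(instance, "xrLocateBodyJointsFB",
                          (PFN_xrVoidFunction*)&pfnLocateBodyJointsFB);

    XrBodyTrackerCreateInfoFB createInfo = {XR_TYPE_BODY_TRACKER_CREATE_INFO_FB};
    createInfo.bodyJointSet = XR_BODY_JOINT_SET_DEFAULT_FB;

    XrBodyTrackerFB tracker = XR_NULL_HANDLE;
    pfnCreateBodyTrackerFB(session, &createInfo, &tracker);
    return tracker;
}

void poll_body_joints(XrBodyTrackerFB tracker, XrSpace baseSpace, XrTime time)
{
    XrBodyJointLocationFB joints[XR_BODY_JOINT_COUNT_FB];

    XrBodyJointsLocateInfoFB locateInfo = {XR_TYPE_BODY_JOINTS_LOCATE_INFO_FB};
    locateInfo.baseSpace = baseSpace;
    locateInfo.time = time;

    XrBodyJointLocationsFB locations = {XR_TYPE_BODY_JOINT_LOCATIONS_FB};
    locations.jointCount = XR_BODY_JOINT_COUNT_FB;
    locations.jointLocations = joints;

    if (pfnLocateBodyJointsFB(tracker, &locateInfo, &locations) == XR_SUCCESS &&
        locations.isActive) {
        /* Each joint pose (for example joints[XR_BODY_JOINT_HEAD_FB].pose)
         * can drive the matching bone of your character rig. */
    }
}
```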
An important part of Body Tracking is the ability to map onto different character models. Using scripts compatible with the Unity Humanoid character rig, the example takes a character provided by Unity and maps the body movement returned by our Body Tracking API onto that asset to animate it. By following the documentation and the provided scripts, you can do the same for most humanoid character models. The following GIF shows the asset we mapped in our retargeting example, with hand tracking enabled and animated using our retargeting scripts.
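In the sample, this retargeting is handled by the provided Unity C# scripts and the Unity Humanoid rig. Conceptually, it boils down to a table that maps tracked joints onto the bones of the target character, sketched below in plain C. The HumanoidBone enum and the table entries are hypothetical stand-ins for whatever bone identifiers your engine or rig actually uses.

```c
/* Illustrative retargeting map from XR_FB_body_tracking joints to the bones
 * of an arbitrary humanoid rig. This is not the sample's implementation,
 * just the shape of the idea. */
#include <openxr/openxr.h>

typedef enum {               /* hypothetical engine-side bone IDs */
    BONE_HIPS,
    BONE_SPINE,
    BONE_HEAD,
    BONE_LEFT_UPPER_ARM,
    BONE_RIGHT_UPPER_ARM
} HumanoidBone;

typedef struct {
    XrBodyJointFB source;    /* tracked joint from the Body Tracking API */
    HumanoidBone  target;    /* bone on the character being animated     */
} RetargetEntry;

/* Each frame, the tracked pose of `source` (adjusted for differences in
 * rest pose and bone length between the tracked skeleton and the rig)
 * is written to the transform of `target`. */
static const RetargetEntry kRetargetMap[] = {
    { XR_BODY_JOINT_HIPS_FB,            BONE_HIPS            },
    { XR_BODY_JOINT_SPINE_MIDDLE_FB,    BONE_SPINE           },
    { XR_BODY_JOINT_HEAD_FB,            BONE_HEAD            },
    { XR_BODY_JOINT_LEFT_ARM_UPPER_FB,  BONE_LEFT_UPPER_ARM  },
    { XR_BODY_JOINT_RIGHT_ARM_UPPER_FB, BONE_RIGHT_UPPER_ARM },
};
```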
Body Tracking is available to use today for experiences built for Meta Quest 2 and Meta Quest Pro.
Face Tracking
Face tracking takes advantage of inward-facing cameras to detect expressive facial movements that the Face Tracking API represents through a set of 63 blendshapes. These facial expressions can help you raise more than just eyebrows—you can enable people’s avatars and characters to express the facial subtleties that make our real-world interactions feel intuitive and natural.
Our Face Tracking API provides blendshapes that cover most of the face, including the movements that make up smiling, frowning, surprise, and other expressions. Embodied characters can range from high-quality 3D representations for realistic VR experiences to extremely stylized ones for fantasy and science-fiction environments. The latter is shown with Aura, our alien character, who winks at you.
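Natively, these weights are exposed through the XR_FB_face_tracking OpenXR extension as an array of floats, one per blendshape, polled each frame. A minimal sketch follows; the helper names and frame-loop context are assumptions, and the extension and face tracking permission must be enabled on device.

```c
/* Minimal sketch: reading the 63 expression weights via XR_FB_face_tracking. */
#include <openxr/openxr.h>

static PFN_xrCreateFaceTrackerFB        pfnCreateFaceTrackerFB;
static PFN_xrGetFaceExpressionWeightsFB pfnGetFaceExpressionWeightsFB;

XrFaceTrackerFB create_face_tracker(XrInstance instance, XrSession session)
{
    xrGetInstanceProcAddr(instance, "xrCreateFaceTrackerFB",
                          (PFN_xrVoidFunction*)&pfnCreateFaceTrackerFB);
    xrGetInstanceProcAddr(instance, "xrGetFaceExpressionWeightsFB",
                          (PFN_xrVoidFunction*)&pfnGetFaceExpressionWeightsFB);

    XrFaceTrackerCreateInfoFB createInfo = {XR_TYPE_FACE_TRACKER_CREATE_INFO_FB};
    createInfo.faceExpressionSet = XR_FACE_EXPRESSION_SET_DEFAULT_FB;

    XrFaceTrackerFB tracker = XR_NULL_HANDLE;
    pfnCreateFaceTrackerFB(session, &createInfo, &tracker);
    return tracker;
}

void poll_face_expressions(XrFaceTrackerFB tracker, XrTime time)
{
    float weights[XR_FACE_EXPRESSION_COUNT_FB];     /* 63 blendshape weights      */
    float confidences[XR_FACE_CONFIDENCE_COUNT_FB]; /* upper/lower face confidence */

    XrFaceExpressionInfoFB info = {XR_TYPE_FACE_EXPRESSION_INFO_FB};
    info.time = time;

    XrFaceExpressionWeightsFB result = {XR_TYPE_FACE_EXPRESSION_WEIGHTS_FB};
    result.weightCount = XR_FACE_EXPRESSION_COUNT_FB;
    result.weights = weights;
    result.confidenceCount = XR_FACE_CONFIDENCE_COUNT_FB;
    result.confidences = confidences;

    if (pfnGetFaceExpressionWeightsFB(tracker, &info, &result) == XR_SUCCESS &&
        result.status.isValid) {
        /* Drive the matching blendshapes on your character mesh, e.g.
         * weights[XR_FACE_EXPRESSION_JAW_DROP_FB] for an open mouth or
         * weights[XR_FACE_EXPRESSION_EYES_CLOSED_L_FB] for a wink like Aura's. */
    }
}
```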
Face Tracking is available to use today for experiences built for Meta Quest Pro.
Eye Tracking
Accurate representations of eye movements in VR can help you tell if your friends are actually looking at you. Eye tracking provides expressive context for the emotion that’s reflected in people’s voices and gestures. This context creates social presence, and it’s key when building an experience that facilitates social connections. Eye tracking enables direct eye contact between people in VR and helps interactions feel natural—and it prevents our brains from working overtime trying to infer ambiguous avatar or character eye movements.
Eye tracking detects eye movement so that our Eye Tracking API can drive the eye transforms of a user’s embodied character as they look around. It also allows a user’s character representation to make eye contact with others.
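For native apps, the equivalent data is available through the XR_FB_eye_tracking_social OpenXR extension, which returns a pose and confidence per eye. Below is a minimal sketch under the same assumptions as the earlier snippets: helper names and frame-loop wiring are mine, and the extension and eye tracking permission must be enabled.

```c
/* Minimal sketch: reading per-eye gaze poses via XR_FB_eye_tracking_social. */
#include <openxr/openxr.h>

static PFN_xrCreateEyeTrackerFB pfnCreateEyeTrackerFB;
static PFN_xrGetEyeGazesFB      pfnGetEyeGazesFB;

XrEyeTrackerFB create_eye_tracker(XrInstance instance, XrSession session)
{
    xrGetInstanceProcAddr(instance, "xrCreateEyeTrackerFB",
                          (PFN_xrVoidFunction*)&pfnCreateEyeTrackerFB);
    xrGetInstanceProcAddr(instance, "xrGetEyeGazesFB",
                          (PFN_xrVoidFunction*)&pfnGetEyeGazesFB);

    XrEyeTrackerCreateInfoFB createInfo = {XR_TYPE_EYE_TRACKER_CREATE_INFO_FB};
    XrEyeTrackerFB tracker = XR_NULL_HANDLE;
    pfnCreateEyeTrackerFB(session, &createInfo, &tracker);
    return tracker;
}

void poll_eye_gazes(XrEyeTrackerFB tracker, XrSpace baseSpace, XrTime time)
{
    XrEyeGazesInfoFB info = {XR_TYPE_EYE_GAZES_INFO_FB};
    info.baseSpace = baseSpace;
    info.time = time;

    XrEyeGazesFB gazes = {XR_TYPE_EYE_GAZES_FB};
    if (pfnGetEyeGazesFB(tracker, &info, &gazes) == XR_SUCCESS &&
        gazes.gaze[XR_EYE_POSITION_LEFT_FB].isValid) {
        /* gazePose orients the character's left eye; apply the entry at
         * XR_EYE_POSITION_RIGHT_FB to the right eye in the same way. */
        XrPosef leftEye = gazes.gaze[XR_EYE_POSITION_LEFT_FB].gazePose;
        (void)leftEye;
    }
}
```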
Eye Tracking is available to use today for experiences built for Meta Quest Pro.
At Meta, our goal is to support you with the tools you need to develop immersive experiences that keep people physically engaged—and emotionally invested. At Meta Connect 2022, we shared how adopting Movement SDK and integrating its cutting-edge tracking capabilities can help you cultivate social presence and spark deeper feelings of immersion and connection within your app community.
To get started with Movement SDK, visit our documentation, and check out the Presence Platform web page in Developer Center to learn more about our broader suite of capabilities.