We're excited to announce the release of Inside-Out Body Tracking (IOBT), the first vision-based system for tracking upper body movements on Quest 3. Alongside it, we've released Generative Legs, an exciting update that generates plausible leg motion on all Quest devices. IOBT significantly increases tracking accuracy for fitness, combat, and social presence use cases, and Generative Legs makes it easier for developers to create full-body characters instead of half-body representations.
Now available for public release as part of v60, these body tracking capabilities will help you build more authentic and realistic movements for a variety of VR and MR experiences. This is a first-of-its-kind release for VR headsets, and we can’t wait for you to start experimenting with it.
Improve Movement Accuracy in Action Games and Fitness Apps
Inside-Out Body Tracking (exclusively available on Meta Quest 3) gives developers and users a far more accurate and natural body tracking system, using the cameras on the headset itself. Your app can now accurately reflect elbow, wrist, and torso movements. This unlocks unique value for games and fitness apps, helping users dodge objects, throw a karate chop, or exercise with correct posture.
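If you're curious what this looks like in code, here's a minimal Unity sketch of reading IOBT-driven joints through the Meta XR Core SDK. It assumes an OVRBody component feeding an OVRSkeleton on the same rig; the exact type and enum names reflect our reading of the v60 SDK, so check the documentation linked below for the definitive setup.

```csharp
// Minimal sketch: reading body-tracked joints in Unity (assumes Meta XR
// Core SDK v60+ with an OVRBody data provider driving this OVRSkeleton).
using UnityEngine;

public class UpperBodyReader : MonoBehaviour
{
    [SerializeField] private OVRSkeleton skeleton; // driven by an OVRBody component

    private void Update()
    {
        if (skeleton == null || !skeleton.IsInitialized)
        {
            return;
        }

        // Each OVRBone exposes a BoneId and a Transform updated by tracking.
        foreach (var bone in skeleton.Bones)
        {
            if (bone.Id == OVRSkeleton.BoneId.Body_LeftArmLower)
            {
                // With IOBT enabled on Quest 3, this joint reflects the
                // camera-tracked forearm rather than an estimate inferred
                // from the controller position alone.
                Debug.Log($"Left forearm position: {bone.Transform.position}");
            }
        }
    }
}
```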
Generative Legs (compatible with all Meta Quest devices) lets developers integrate natural leg movements with the help of AI, which estimates leg motion simply from the position of the user's upper body, with no additional hardware or sensors. Now, rather than ghost-like half-body avatars floating around, you can show full-body movement that corresponds to what the user is doing, helping increase audience engagement in combat and fitness scenarios.
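Opting in is lightweight. Below is a minimal sketch, assuming the v60 Meta XR Core SDK for Unity, where the full-body joint set (which includes the generated legs) is selected on the OVRBody component. The property and enum names are our reading of the SDK and worth verifying against the docs; the joint set can also be chosen in the component's inspector instead.

```csharp
// Minimal sketch: opting into Generative Legs by requesting the full-body
// joint set (assumed v60 Meta XR Core SDK property/enum names).
using UnityEngine;

public class EnableFullBody : MonoBehaviour
{
    [SerializeField] private OVRBody body;

    private void Awake()
    {
        // UpperBody tracks hips and above; FullBody adds the AI-generated
        // leg joints, with no extra sensors required. Note that IOBT (high-
        // fidelity upper-body tracking) is controlled by a separate fidelity
        // setting; see the documentation linked at the end of this post.
        body.ProvidedSkeletonType = OVRPlugin.BodyJointSet.FullBody;
    }
}
```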
Enhance Social Presence with More Realistic Body Movements
These new Movement SDK capabilities also improve how your audience, and their avatars, experience everyday actions like standing, walking around, and conversing with others. With the next-generation tracking of IOBT or the predictive capabilities of Generative Legs, users can move to a whiteboard, around a table, or closer to a conversation partner in a far more realistic manner.
Experience These Capabilities in Action
As part of this release, we've created an immersive dodgeball app, Dodge Arcade (available in App Lab), so you can experience how IOBT's enhanced tracking fidelity comes to life on Meta Quest 3. You use body movement to dodge incoming fireballs and to block soccer balls: squat, lean, and jump to avoid them, and your movement is reflected both by the character in front of you on the field and in replays on the arena screen overhead. We hope it inspires you to start experimenting and to create your own experiences that leave your audience feeling more present and engaged.
We Can’t Wait to See What You Build
Movement SDK supports both skeletal and facial retargeting in only a few clicks for both Unreal MetaHuman and Unity Mecanim characters. With v60, in addition to the full-body samples for Unity, we are also publishing Unreal sample updates, including samples for MetaHuman and Mannequin-rigged characters. Unreal's standard framework for streaming motion-capture data (Live Link) is now supported, so body movements can be easily integrated into Unreal production workflows.
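To make the retargeting idea concrete, here's a deliberately simplified Unity sketch that copies tracked bone rotations onto a Mecanim humanoid. The Movement SDK's own retargeting components handle this (plus rest-pose differences, twist bones, and scale) for you, so treat the partial bone pairing below as an illustrative assumption rather than the SDK's actual mapping.

```csharp
// Illustrative sketch only: naive retargeting from tracked bones onto a
// Mecanim humanoid. The mapping table is a hypothetical subset; a real
// retargeter covers the full skeleton and compensates for rig differences.
using System.Collections.Generic;
using UnityEngine;

public class SimpleRetargeter : MonoBehaviour
{
    [SerializeField] private OVRSkeleton skeleton;
    [SerializeField] private Animator animator; // a Mecanim humanoid rig

    // Partial mapping from tracked bone IDs to Mecanim bones (assumption).
    private static readonly Dictionary<OVRSkeleton.BoneId, HumanBodyBones> BoneMap =
        new Dictionary<OVRSkeleton.BoneId, HumanBodyBones>
        {
            { OVRSkeleton.BoneId.Body_LeftArmLower, HumanBodyBones.LeftLowerArm },
            { OVRSkeleton.BoneId.Body_RightArmLower, HumanBodyBones.RightLowerArm },
            { OVRSkeleton.BoneId.Body_Head, HumanBodyBones.Head },
        };

    private void LateUpdate()
    {
        if (skeleton == null || !skeleton.IsInitialized)
        {
            return;
        }

        foreach (var bone in skeleton.Bones)
        {
            // Copy the tracked rotation onto the matching humanoid bone.
            if (BoneMap.TryGetValue(bone.Id, out var target))
            {
                var t = animator.GetBoneTransform(target);
                if (t != null)
                {
                    t.rotation = bone.Transform.rotation;
                }
            }
        }
    }
}
```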
Movement SDK gives you the tools of a motion capture studio right out of the box, and we’re excited to see how you incorporate IOBT and Generative Legs to build exciting and enjoyable VR and mixed reality experiences for your audience.
We’d love to hear what you think. Connect with us on Facebook and X, and follow along for the latest news, updates, and resources as you build on Meta Quest.
For more information, be sure to check out the documentation for IOBT and Generative Legs (Unity | Unreal) and start building today. These capabilities are provided as part of the XR Core package, and our samples for Unreal and Unity can be found in the documentation links above.
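One practical setup note: body tracking on Quest is gated behind an Android permission, which the SDK's setup tools can declare and request for you. If you prefer to request it explicitly at runtime in Unity, a minimal sketch looks like this (the permission string is the one Meta uses for body tracking; everything else is standard Unity API).

```csharp
// Minimal sketch: explicitly requesting the Quest body tracking permission
// at runtime, as an alternative to letting the SDK's setup handle it.
using UnityEngine;
using UnityEngine.Android;

public class BodyTrackingPermission : MonoBehaviour
{
    private const string BodyTracking = "com.oculus.permission.BODY_TRACKING";

    private void Start()
    {
        // Prompt the user only if the permission hasn't been granted yet.
        if (!Permission.HasUserAuthorizedPermission(BodyTracking))
        {
            Permission.RequestUserPermission(BodyTracking);
        }
    }
}
```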