Meta Quest 3 is officially launching October 10, 2023, and this week at Meta Connect we announced a series of upgrades to the Presence Platform capabilities that power inputs, interactions, and movement capture on our next-generation headset. These upgrades include high-fidelity upper-body tracking, more natural and responsive interactions like microgestures, realistic legs, and more.
We’ve been working to help you enhance immersion with tools and capabilities that can make your experiences feel more real—specifically by closing the gap between the variety and accuracy of interactions, inputs, and movements available in-headset and those experienced in the physical world. Using Movement SDK, Interaction SDK, Voice SDK, Haptics Studio, and Haptics SDK, you can help your audience feel more present and engaged with your experience, not to mention level up the fun factor.
Enhance Casual Gaming & Build Social Presence Using Legs
For years, generating accurate leg motions in VR has been a difficult problem to solve. We're solving it without any additional hardware or added user friction: by using information from the upper body to estimate what the lower body is likely doing, you'll soon be able to deliver much more natural movements than IK-based approaches allow.
For the first time, you can use generative legs to integrate natural leg movements for your app’s users using only the position of their upper body. This has a direct impact on social experiences and fitness apps by accurately representing common leg movements like standing, walking, jumping, and more. It also enhances casual gaming experiences, grounding users in immersion as they move around, explore, and interact. You can see how legs can enrich gameplay in the new showcase app Dodge Arcade, previewed at Meta Connect.
The showcase app Dodge Arcade (shared at Meta Connect) uses Generative Legs and enhanced body tracking to enrich gameplay.
Because these are generated legs rather than tracked legs, some nuanced movements, such as crossing your legs, may not be captured. However, we believe this approach can be applied successfully to a variety of social and gaming experiences on Quest 3, Quest Pro, and even Quest 2, giving your audience an enhanced feeling of presence as their leg movements are represented more accurately in VR.
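If you work at the native layer rather than through the Unity or Unreal integrations, body tracking (including generated leg data where the runtime supports it) is surfaced through the XR_FB_body_tracking OpenXR extension. The sketch below shows the general create-and-poll pattern under a few assumptions: an XrInstance/XrSession already created with the extension enabled, recent OpenXR headers, and error handling trimmed for brevity. The comment about a full-body joint set is an assumption to verify against the current headers and Movement SDK documentation.

```cpp
// Minimal sketch: creating a body tracker and polling joint poses through the
// XR_FB_body_tracking OpenXR extension, the native layer that Movement SDK's
// engine integrations wrap. Assumes the extension is enabled on the session.
#include <openxr/openxr.h>
#include <array>

XrBodyTrackerFB CreateBodyTracker(XrInstance instance, XrSession session) {
    // Extension functions must be loaded at runtime.
    PFN_xrCreateBodyTrackerFB xrCreateBodyTrackerFB = nullptr;
    xrGetInstanceProcAddr(instance, "xrCreateBodyTrackerFB",
                          reinterpret_cast<PFN_xrVoidFunction*>(&xrCreateBodyTrackerFB));
    if (xrCreateBodyTrackerFB == nullptr) return XR_NULL_HANDLE;

    XrBodyTrackerCreateInfoFB createInfo{XR_TYPE_BODY_TRACKER_CREATE_INFO_FB};
    // The default joint set covers the upper body; runtimes that expose
    // generative legs add a full-body joint set (check the current headers).
    createInfo.bodyJointSet = XR_BODY_JOINT_SET_DEFAULT_FB;

    XrBodyTrackerFB tracker = XR_NULL_HANDLE;
    xrCreateBodyTrackerFB(session, &createInfo, &tracker);
    return tracker;
}

void PollBodyJoints(XrInstance instance, XrBodyTrackerFB tracker,
                    XrSpace baseSpace, XrTime displayTime) {
    PFN_xrLocateBodyJointsFB xrLocateBodyJointsFB = nullptr;
    xrGetInstanceProcAddr(instance, "xrLocateBodyJointsFB",
                          reinterpret_cast<PFN_xrVoidFunction*>(&xrLocateBodyJointsFB));
    if (xrLocateBodyJointsFB == nullptr) return;

    std::array<XrBodyJointLocationFB, XR_BODY_JOINT_COUNT_FB> joints{};

    XrBodyJointsLocateInfoFB locateInfo{XR_TYPE_BODY_JOINTS_LOCATE_INFO_FB};
    locateInfo.baseSpace = baseSpace;
    locateInfo.time = displayTime;

    XrBodyJointLocationsFB locations{XR_TYPE_BODY_JOINT_LOCATIONS_FB};
    locations.jointCount = static_cast<uint32_t>(joints.size());
    locations.jointLocations = joints.data();

    if (XR_SUCCEEDED(xrLocateBodyJointsFB(tracker, &locateInfo, &locations)) &&
        locations.isActive) {
        // joints[i].pose now holds the estimated joint transforms (generated,
        // in the case of legs) that you can retarget onto your avatar skeleton.
    }
}
```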
Deliver High-fidelity, Natural Movements with Inside-Out Body Tracking
With Meta Quest 3, body tracking is taking a dramatic leap forward with inside-out body tracking (IOBT), an approach that uses cameras on the headset to track the upper body. This produces far more accurate and natural tracking than what's available on Quest 2 and Quest Pro, which infer the user's body pose from headset movement plus hands and/or controllers. It means that, for the first time, you can capture complex and nuanced motion in your experiences, including elbow and wrist movement and leaning side to side.
By improving the accuracy of upper-body tracking, IOBT can have an immediate impact, helping your app's users jump, dance, or walk around more naturally. IOBT also enhances existing features like hand tracking by inferring arm and hand positions even when they're out of view.
Movement SDK gives you the tools of a motion capture studio right out of the box, and IOBT on Quest 3 delivers the ability to make users’ baseball swings look fluid, karate chops look precise, and waving hello feel even more friendly.
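For native developers, my understanding is that the higher-fidelity (inside-out) tracking path is requested per body tracker through a fidelity extension. The extension, function, and enum names below are assumptions based on that understanding, so treat this as a sketch of the shape of the call and verify the exact symbols against the current OpenXR headers and Movement SDK documentation; the Unity and Unreal integrations expose an equivalent setting.

```cpp
// Sketch only: requesting high-fidelity (inside-out) tracking for an existing
// XrBodyTrackerFB. Extension/function/enum names here are my best understanding
// and should be verified against the current headers; the #if guard keeps this
// from compiling against headers that don't ship the extension.
#include <openxr/openxr.h>

#if defined(XR_META_body_tracking_fidelity)
void RequestHighFidelityBodyTracking(XrInstance instance, XrBodyTrackerFB tracker) {
    PFN_xrRequestBodyTrackingFidelityMETA xrRequestBodyTrackingFidelityMETA = nullptr;
    xrGetInstanceProcAddr(instance, "xrRequestBodyTrackingFidelityMETA",
                          reinterpret_cast<PFN_xrVoidFunction*>(&xrRequestBodyTrackingFidelityMETA));
    if (xrRequestBodyTrackingFidelityMETA != nullptr) {
        // Ask the runtime to use the headset cameras (IOBT) where available;
        // it falls back to the hands/controllers-based estimate otherwise.
        xrRequestBodyTrackingFidelityMETA(tracker, XR_BODY_TRACKING_FIDELITY_HIGH_META);
    }
}
#endif
```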
Increase Immersion with Improved Hand Tracking Responsiveness
In August we launched Hand Tracking v2.2 with v56, a major upgrade that delivers between a 40% and 75% improvement in latency, significantly reducing the user experience gap between controllers and hands. We also launched Fast Motion Mode (FMM), an evolution of high-frequency tracking that's designed to make your fast-paced games even more responsive. You can try out the Move Fast demo app to see FMM in action (also available as open source), and refer to the Unity, Unreal, and Native documentation to enable Hand Tracking and FMM in your app today.
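FMM and Wide Motion Mode are enabled through the settings described in the Unity, Unreal, and Native documentation linked above; whichever mode you use, your app still consumes hand poses through the standard per-frame joint query. Below is a minimal native sketch using the cross-vendor XR_EXT_hand_tracking extension, assuming a session created with that extension enabled; it shows the polling pattern, not the mode toggles themselves.

```cpp
// Minimal sketch: per-frame hand-joint polling with XR_EXT_hand_tracking.
// Fast Motion Mode / Wide Motion Mode change tracking behavior on the runtime
// side; the joint query below stays the same.
#include <openxr/openxr.h>
#include <array>

XrHandTrackerEXT CreateHandTracker(XrInstance instance, XrSession session, XrHandEXT hand) {
    PFN_xrCreateHandTrackerEXT xrCreateHandTrackerEXT = nullptr;
    xrGetInstanceProcAddr(instance, "xrCreateHandTrackerEXT",
                          reinterpret_cast<PFN_xrVoidFunction*>(&xrCreateHandTrackerEXT));

    XrHandTrackerCreateInfoEXT createInfo{XR_TYPE_HAND_TRACKER_CREATE_INFO_EXT};
    createInfo.hand = hand;  // XR_HAND_LEFT_EXT or XR_HAND_RIGHT_EXT
    createInfo.handJointSet = XR_HAND_JOINT_SET_DEFAULT_EXT;

    XrHandTrackerEXT tracker = XR_NULL_HANDLE;
    if (xrCreateHandTrackerEXT != nullptr) {
        xrCreateHandTrackerEXT(session, &createInfo, &tracker);
    }
    return tracker;
}

bool LocateHandJoints(XrInstance instance, XrHandTrackerEXT tracker,
                      XrSpace baseSpace, XrTime displayTime,
                      std::array<XrHandJointLocationEXT, XR_HAND_JOINT_COUNT_EXT>& joints) {
    PFN_xrLocateHandJointsEXT xrLocateHandJointsEXT = nullptr;
    xrGetInstanceProcAddr(instance, "xrLocateHandJointsEXT",
                          reinterpret_cast<PFN_xrVoidFunction*>(&xrLocateHandJointsEXT));
    if (xrLocateHandJointsEXT == nullptr) return false;

    XrHandJointsLocateInfoEXT locateInfo{XR_TYPE_HAND_JOINTS_LOCATE_INFO_EXT};
    locateInfo.baseSpace = baseSpace;
    locateInfo.time = displayTime;

    XrHandJointLocationsEXT locations{XR_TYPE_HAND_JOINT_LOCATIONS_EXT};
    locations.jointCount = static_cast<uint32_t>(joints.size());
    locations.jointLocations = joints.data();

    // Returns true only when the hand is actively tracked this frame.
    return XR_SUCCEEDED(xrLocateHandJointsEXT(tracker, &locateInfo, &locations)) &&
           locations.isActive;
}
```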
On Meta Quest 3, you'll soon also be able to use Wide Motion Mode to track hands even when they're outside the headset's field of view. This mode greatly improves social presence and wide-motion interactions at the flip of a switch.
Unlock a More Seamless Experience With Intuitive and Natural Interactions
Interaction SDK makes it easy to add controller- and hand-based interactions to your app experience, including pokes, grabs, and many more that you can pick and choose from. Beginning in 2024, we’re going to start adding support for brand-new microgestures like thumb swipes, brief finger taps, and more so you can enable easy and subtle movements for things like locomotion or scrolling across a UI panel. This can also reduce fatigue from repetitive interactions and create a simpler way for people to engage with your experience.
At Meta Connect, we showed how microgestures can power locomotion.
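Since microgestures haven't shipped yet, the eventual Interaction SDK surface may look different; purely to illustrate the kind of mapping shown above, here is a hypothetical sketch in which thumb swipes drive snap turns and a finger tap triggers a teleport. Every type and callback name in it is made up, not an Interaction SDK symbol.

```cpp
// Hypothetical illustration only: routing subtle microgesture events to
// locomotion. None of these types come from Interaction SDK; they stand in
// for whatever event surface the released API eventually exposes.
#include <cstdio>
#include <functional>

enum class Microgesture { ThumbSwipeLeft, ThumbSwipeRight, IndexTap };

struct Locomotion {
    float yawDegrees = 0.0f;

    void SnapTurn(float degrees) {
        yawDegrees += degrees;
        std::printf("snap turn to %.1f degrees\n", yawDegrees);
    }
    void Teleport() { std::printf("teleport to aimed target\n"); }
};

// Routes low-effort gestures to locomotion actions, keeping hands mostly at rest.
std::function<void(Microgesture)> MakeGestureRouter(Locomotion& locomotion) {
    return [&locomotion](Microgesture gesture) {
        switch (gesture) {
            case Microgesture::ThumbSwipeLeft:  locomotion.SnapTurn(-45.0f); break;
            case Microgesture::ThumbSwipeRight: locomotion.SnapTurn(+45.0f); break;
            case Microgesture::IndexTap:        locomotion.Teleport();       break;
        }
    };
}

int main() {
    Locomotion locomotion;
    auto onGesture = MakeGestureRouter(locomotion);
    // In a real app these events would come from the gesture recognizer.
    onGesture(Microgesture::ThumbSwipeRight);
    onGesture(Microgesture::IndexTap);
}
```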
Speaking of engaging naturally in-headset, with v57 we’re also adding support for Direct Touch to all interactions made in the Horizon Home environment. We believe that Direct Touch can make 2D interactions much more enjoyable in apps, and to make it easy to adopt, we’ve also made all Direct Touch primitives available in Interaction SDK as an out-of-the-box feature.
At Meta Connect, we showed Direct Touch being used in the Horizon Home environment.
Interaction SDK’s library enables you to build unique, engaging gameplay, while giving your audience flexibility to interact with app environments in different ways. We’re excited to offer Building Block support for Interaction SDK with Unity starting with v57. Building Blocks are preconfigured prefabs for some of Presence Platform’s most common use cases and can make your life easier if you’re new to Interaction SDK.
Combine the Power of Controllers and Hands
To support improved immersion on Meta Quest 3 and beyond, we've been working on new features to help you unlock hand- and controller-based experiences that feel unique to your vision and keep people engaged. We recently introduced Multimodal with SDK v56, a new experimental feature that provides simultaneous hands and controller tracking. That means you can develop gameplay in which people use one hand for input while holding a controller in the other, and it allows your audience to instantly transition between the two if they want a change of pace.
With Capsense-driven hands, you’re given natural hand poses that conform to Meta’s controller models. This lets you integrate a hand visualization on top of, or instead of, users’ controllers.
Both of these features will be released in the coming months, but you can start exploring their use cases in your app today by visiting the documentation pages for Multimodal and Capsense Hands.
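Under the hood, simultaneous hands and controllers means your app can sample both input paths every frame and decide, per hand, which one should drive interaction. A rough native sketch of that per-frame decision follows; it assumes a controller grip-pose action space and a hand tracker have already been set up (as in the earlier hand-tracking sketch), and the selection policy is a placeholder of my own, not the Multimodal API.

```cpp
// Rough sketch of per-hand input selection when hands and controllers are
// tracked simultaneously. Setup of the action space and hand tracker is
// elided; the selection policy is a placeholder, not the Multimodal API.
#include <openxr/openxr.h>

enum class HandInput { Controller, TrackedHand, None };

HandInput SelectHandInput(XrSpace controllerGripSpace, XrSpace baseSpace,
                          XrTime displayTime, bool handIsTracked) {
    XrSpaceLocation gripLocation{XR_TYPE_SPACE_LOCATION};
    bool controllerTracked = false;
    if (XR_SUCCEEDED(xrLocateSpace(controllerGripSpace, baseSpace, displayTime,
                                   &gripLocation))) {
        const XrSpaceLocationFlags required =
            XR_SPACE_LOCATION_POSITION_TRACKED_BIT |
            XR_SPACE_LOCATION_ORIENTATION_TRACKED_BIT;
        controllerTracked = (gripLocation.locationFlags & required) == required;
    }

    // Placeholder policy: prefer the controller whenever it is actively
    // tracked, otherwise fall back to the tracked hand for that side.
    if (controllerTracked) return HandInput::Controller;
    if (handIsTracked)     return HandInput::TrackedHand;
    return HandInput::None;
}
```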
Activate Voice Inputs for Engaging Gameplay
Inputs include much more than just hands and controllers, and incorporating people's voice as an input method for interacting with your app environment can enable more immersive gameplay and deliver a more accessible experience for a diverse audience.
Voice SDK features like Voice Command and Dictation give people an easy way to interact with characters and objects, while Text-to-Speech (TTS) can help bring the experience alive by giving voices to in-game characters.
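Voice SDK itself is consumed through the Unity and Unreal integrations linked below; purely to illustrate the pattern of wiring a recognized voice intent to gameplay, here is a hypothetical, engine-agnostic sketch with a confidence threshold. The intent names and callback shape are placeholders of my own, not Voice SDK symbols.

```cpp
// Hypothetical illustration: routing a recognized voice intent to gameplay.
// The callback shape and intent names are placeholders, not Voice SDK APIs.
#include <cstdio>
#include <string>
#include <unordered_map>
#include <functional>

struct VoiceIntent {
    std::string name;   // e.g. "open_door", as recognized by the voice backend
    float confidence;   // 0.0 - 1.0 confidence score from the recognizer
};

class VoiceCommandRouter {
public:
    void Register(const std::string& intent, std::function<void()> action) {
        actions_[intent] = std::move(action);
    }

    void OnIntentRecognized(const VoiceIntent& intent) {
        // Ignore low-confidence results so misheard speech doesn't trigger gameplay.
        if (intent.confidence < 0.8f) return;
        auto it = actions_.find(intent.name);
        if (it != actions_.end()) it->second();
    }

private:
    std::unordered_map<std::string, std::function<void()>> actions_;
};

int main() {
    VoiceCommandRouter router;
    router.Register("open_door", [] { std::printf("door opens\n"); });
    // In a real app this event would come from the voice recognition callback.
    router.OnIntentRecognized({"open_door", 0.93f});
}
```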
We're adding 15 game-ready TTS voices to double our library, and soon you'll be able to use Voice Cloning to create brand-new custom TTS voices for your experience with only 20 minutes of voice recordings. Check out how Voice Cloning gave a unique voice to Skully from Aldin's Waltz of the Wizard.
If you’re a part of the First Access program, reach out to your Meta representative to learn more about our plan and how you can give Voice Cloning a try. Check out more of what developers have been building with Voice SDK below.
To get started enabling voice-driven gameplay and interactions, visit the documentation (Unity | Unreal), and don't forget to reference Voice SDK best practices for creating a more engaging experience.
Generate Buzz with Haptics and Touch Plus Controllers
Quest 3's Touch Plus controllers let you activate a wide variety of vibrations and advanced haptics so your audience can feel the fun and get feedback that keeps them engaged. With the help of Meta Haptics Studio and Haptics SDK, you can design, audition, and integrate high-quality haptics into your apps, with backwards compatibility across all Meta Quest devices.
Meta Haptics Studio is a standalone desktop app for PC and Mac that creates haptic effects from existing audio, and it includes a built-in library and tutorial to help you get started. On the flip side, Haptics SDK enables you to integrate your haptics and effects into your app, offering a media-like API to trigger and control haptics the same way you'd control audio. See the documentation to get started with Meta Haptics Studio and Meta Haptics SDK (Unity | Native | Unreal).
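The Haptics SDK's clip-based API is covered in the documentation linked above. As a baseline for what that clip player builds on top of, here is a minimal sketch of firing a controller vibration through core OpenXR's haptic output path; to be clear, this is the low-level route, not the Haptics SDK API, and it assumes a vibration-output action and hand subaction path have already been created.

```cpp
// Minimal sketch: firing a controller vibration through core OpenXR.
// This is the low-level path; the Haptics SDK layers a media-like, clip-based
// API (with clips designed in Haptics Studio) over concepts like this.
// Assumes hapticAction was created as XR_ACTION_TYPE_VIBRATION_OUTPUT and
// handPath is a /user/hand/left or /user/hand/right subaction path.
#include <openxr/openxr.h>

XrResult PlayBuzz(XrSession session, XrAction hapticAction, XrPath handPath) {
    XrHapticVibration vibration{XR_TYPE_HAPTIC_VIBRATION};
    vibration.duration = 100000000;                  // 100 ms, in nanoseconds
    vibration.frequency = XR_FREQUENCY_UNSPECIFIED;  // let the runtime choose
    vibration.amplitude = 0.8f;                      // 0.0 - 1.0

    XrHapticActionInfo actionInfo{XR_TYPE_HAPTIC_ACTION_INFO};
    actionInfo.action = hapticAction;
    actionInfo.subactionPath = handPath;

    return xrApplyHapticFeedback(session, &actionInfo,
                                 reinterpret_cast<const XrHapticBaseHeader*>(&vibration));
}
```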
We're excited to offer a host of new features and improvements to existing capabilities that can help you level up people's sense of presence in your apps, and we can't wait to see how you use these tools to drive exciting and enjoyable VR and mixed reality experiences. For the latest updates and news for developers building on Meta Quest, follow us on Facebook and X.