Meta Quest 3 is officially launching on October 10, with pre-orders
available now so you can start building with the newest technology on launch day. With our next-generation headset, we're unlocking the magic of mixed reality (MR), which lets you seamlessly blend virtual content with the physical world while users see and interact with their surroundings naturally, intuitively, and in rich color.
Our newest MR capabilities from
Presence Platform heighten realism by dynamically detecting and modeling the user's physical space. With MR, you have even more possibilities to expand how your audience interacts, engages, and moves with your app, all while staying present in their physical environment.
Keep reading to discover the business benefits of integrating MR features, new capabilities like the Depth API and Mesh API, and new tools to accelerate MR development with popular game engines.
Create an Engaging and Enjoyable Experience for a Wide Audience
The significantly improved Passthrough experience on Quest 3 delivers 10x more pixels than Quest 2 and features a new AI-powered depth engine that takes full advantage of the hardware. This gives you the tools to create delightful experiences in which people can see the world around them with greater clarity and realism. Developers are already seeing a substantial impact on user engagement through MR capabilities such as Passthrough. For example, the mindfulness app
TRIPP boosted their daily active users (DAU), with their seven-day average increasing by 31% after incorporating
Passthrough-based features.
Multimodal and Capsense-driven hands
Full-color Passthrough can also help you create an enjoyable experience for audiences using your app for the first time. “Meta Quest’s Presence Platform, especially hand tracking and Passthrough, has been a great fit for
Cubism's simple game format,” says Thomas Van Bouwel, creator of
Cubism. “I’m drawn to MR because it seems to reduce the onboarding friction for new players since it lets them stay connected to their environment.”
Starting later this year, people will also be able to seamlessly transition from Passthrough in Home into your MR app without jarring hiccups like black screens or loading screens.
Enrich Your Apps with AI-powered Mixed Reality
Seeing the physical world in color and blending that view with virtual content is just the beginning of what’s possible with MR. The tools that power MR experiences are available through
Presence Platform, a collection of AI- and machine learning-powered capabilities that enable you to enrich your experiences, improve presence, and give people more freedom to play and explore their own way. Before we dive deeper into brand-new updates, here's an overview of the core capabilities that you can use to power MR, along with a minimal WebXR sketch after the list:
Passthrough: Deliver a real-time and perceptually comfortable 3D visualization of the physical world inside Meta Quest headsets (
Unity |
Unreal |
OpenXR |
WebXR)
Scene: Build environment-aware experiences that have rich interactions with the user’s physical surroundings (
Unity |
Unreal |
OpenXR |
WebXR)
Shared Spatial Anchors: Build local multiplayer experiences by creating a shared world-locked frame of reference for multiple users (
Unity |
Unreal |
OpenXR)
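If you're targeting the web, all three capabilities start with a WebXR session request. Here's a minimal TypeScript sketch, assuming WebXR type definitions (for example, @types/webxr) are installed: an 'immersive-ar' session enables Passthrough on Quest, while the optional features surface Scene planes, anchors, and hit testing where the browser supports them.

```typescript
// Minimal WebXR session request for mixed reality on the web.
// 'immersive-ar' composites virtual content over Passthrough on Quest;
// the optional features expose Scene planes, anchors, and hit testing
// on browsers that support those modules.
async function startMixedRealitySession(): Promise<XRSession> {
  if (!navigator.xr) {
    throw new Error('WebXR is not available in this browser.');
  }
  return navigator.xr.requestSession('immersive-ar', {
    requiredFeatures: ['local-floor'],
    // Optional, so the session still starts on devices without these modules.
    optionalFeatures: ['plane-detection', 'anchors', 'hit-test'],
  });
}
```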
These MR capabilities turn the world around you into a canvas for your creative potential across a variety of use cases. And with MR on Quest 3, people can see your world-blending creations come to life in greater detail thanks to a nearly 15% wider field of view than Quest 2, along with increased pixel density of 25.5 PPD and 1,218 PPI.
Apps are already beginning to
embrace MR to deliver flexible—and fun—gameplay. For example,
Rube Goldberg Workshop lets people build numerous iterations of machines using physical objects and surfaces as part of their creations.
“We had a specific goal to not just utilize the new tools like Scene Capture and hand tracking, but really develop around those capabilities,” says Stephen Scholz, Producer at Free Range Games. “Additionally, the nature of our game requires a very precise location of objects in order for them to interact consistently each time a player launches the game. Using Spatial Anchors ensures continuity for players by enabling us to anchor objects in the same place between gaming sessions.”
Presence Platform Hackathon winner SUBMERSED uses Passthrough and Spatial Anchors to create a survival game.
These capabilities were designed to be flexible according to your development goals, and they can be used independently or alongside other Presence Platform features, like tracking body movement with
Movement SDK,
Hand Tracking, and more. Now that you’ve seen the capabilities powering MR in action, read further to take a look at some new updates coming to Presence Platform.
Increase the Believability and Realism of Mixed Reality with Brand-New APIs
At Meta Connect, we announced Depth API, which provides real-time, per-eye, per-frame, environment depth estimates from the headset’s point of view.
With Depth API, you can easily implement dynamic occlusions. That’s essential for rendering believable visual interactions between virtual and physical content in your apps—enabling a more coherent sense of reality for your users. But you can go a lot further. We’ve designed the Depth API so that you can leverage your own algorithms and techniques from traditional real-time 3D apps. This lets you render depth-based effects like fog or use depth information for gameplay.
We’re making the Depth API available on Unity, Unreal, and OpenXR as an experimental release in v57 for Quest 3.
Demo showing Depth API from Meta Connect
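While the Depth API above ships for Unity, Unreal, and OpenXR, the web exposes the same idea through WebXR's depth-sensing module. Here's a rough TypeScript sketch of per-eye, per-frame CPU depth sampling; it assumes the session was requested with the 'depth-sensing' feature and a cpu-optimized usage preference, and that your type definitions include the depth-sensing additions.

```typescript
// Sketch: sampling per-eye, per-frame depth via WebXR's depth-sensing module,
// the web-side counterpart to the headset depth estimates described above.
// Assumes requestSession was called with requiredFeatures: ['depth-sensing']
// and depthSensing: { usagePreference: ['cpu-optimized'],
//                     dataFormatPreference: ['luminance-alpha'] }.
function sampleDepth(frame: XRFrame, refSpace: XRReferenceSpace): void {
  const pose = frame.getViewerPose(refSpace);
  if (!pose) return;

  for (const view of pose.views) {
    // Cast may be needed if your type definitions lack depth sensing.
    const depthInfo = frame.getDepthInformation(view);
    if (!depthInfo) continue; // Depth can be unavailable on a given frame.

    // Normalized view coordinates (0.5, 0.5) = center of this eye's view.
    // Occlusion logic would compare values like this against virtual depth.
    const meters = depthInfo.getDepthInMeters(0.5, 0.5);
    console.log(`Estimated depth at view center: ${meters.toFixed(2)} m`);
  }
}
```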
For even more believability of your MR experiences, we’re introducing Mesh API with v57 on Unity, Unreal, and OpenXR. Mesh API enables realistic interactions between virtual and physical objects, such as having a game character navigate users’ physical space or transforming a living room into a forest campfire.
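On the web, comparable room geometry arrives through the draft WebXR mesh-detection module, which the Meta Quest browser supports. A rough TypeScript sketch follows; XRMeshLike below is a hand-rolled stand-in for the draft spec's XRMesh interface, since it may be missing from standard type definitions.

```typescript
// Sketch: reading room geometry from WebXR's draft mesh-detection module.
// XRMeshLike is a minimal stand-in for the spec's XRMesh interface.
interface XRMeshLike {
  vertices: Float32Array; // Flat array of x, y, z positions
  indices: Uint32Array;   // Triangle indices into the vertex array
}

function logRoomMeshes(frame: XRFrame): void {
  // detectedMeshes is part of the draft spec, so it may be absent from
  // your type definitions; the cast keeps the sketch compiling.
  const meshes = (frame as any).detectedMeshes as Set<XRMeshLike> | undefined;
  if (!meshes) return;

  for (const mesh of meshes) {
    // Feed this geometry to your renderer or physics engine so virtual
    // objects can collide with, and navigate around, the real room.
    const vertexCount = mesh.vertices.length / 3;
    const triangleCount = mesh.indices.length / 3;
    console.log(`Room mesh: ${vertexCount} vertices, ${triangleCount} triangles`);
  }
}
```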
To help you reduce friction in setting up your MR experiences with these new and updated capabilities, we're also introducing Space Setup, a feature that provides automated room layout detection so your apps can understand and react to physical spaces with less setup work on your part.
MR on Meta Quest leverages spatial data (scene, mesh, and depth API data) of the environment to blend physical and virtual content. To help you and your audience understand how apps use spatial data—and how people can manage apps’ access to it—we’ve also created new
resources to reference as you build your next MR experience.
Jumpstart Your Mixed Reality Journey with New Tools and Showcase Apps
Developing for MR may seem challenging, but we’ve worked to produce tools and resources that can help you get started with multiple game engines. Our new
MR design guidelines break down the basics of MR, explain what you’re able to do with it, and provide best practices for designing engaging MR experiences.
If you want to get started or optimize how you build with this exciting technology, we have some great updates for you.
Mixed Reality Utility Kit is a new tool for Unity and Unreal that dramatically simplifies the process of integrating and using Scene capabilities. It also includes helpful utilities for working with procedural meshes and textures, including pre-captured rooms so you can test your app in different environments without putting on your headset. Stay tuned for launch information coming soon.
Phanto is a new open-source reference app built in Unity that highlights the latest Presence Platform features, including Scene Mesh, Scene Model, and Scene API. It also contains example Unity scenes that showcase content placement, fast collision, air navigation, floor navigation, and other best practices that you can easily incorporate into your MR app. Check out the full source code on
GitHub.

Discover is an additional open-source showcase built in Unity that demonstrates how to use key MR capabilities, including Scene, Spatial Anchors, Shared Spatial Anchors, and Passthrough, and shows how you can quickly integrate them into your own projects. With the full source code available on
GitHub, Discover is the perfect way to explore the latest in MR technology.
At Connect, we also shared how new and improved tools like Building Blocks and Meta XR Simulator can help you easily integrate the latest MR capabilities from Presence Platform and test your apps without needing to don and doff your headset every time. Stay tuned for more content diving into these announcements.
New WebXR Tools to Accelerate Mixed Reality Development
Now, if you're a WebXR developer building MR apps, we have some great updates to share with you. Today we're excited to officially introduce
Immersive Web Emulator, which lets you easily test and iterate your WebXR experiences without a headset.
Boasting features like 6DOF emulated transform control for headsets and controllers, comprehensive controller input emulation for both binary and analog buttons, and intuitive keyboard mappings for standard actions, Immersive Web Emulator is an open-source web extension for Chromium-based desktop browsers. It’s available through the
Google Chrome Web Store and the
Microsoft Edge Add-ons Store.
The latest version, v1.4, adds full support for all Presence Platform features currently available on the web. With Quest 3, we're also adding emulator support for Room Mesh, which gives you a detailed 3D representation of a room that you can use for detailed hit testing or basic occlusion. Here are some core MR features you can leverage to accelerate MR development, with a short hit-testing and anchors sketch after the list:
- Emulated Room Setup
- Additional XRPlane & XRMesh Setup
- Semantic Labels
- Emulated Hand Tracking
- Hit-Testing & Persistent Anchors
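Two of those features compose naturally: a hit test finds a real surface, and an anchor pins content to it. Here's a minimal TypeScript sketch using the standard WebXR hit-test and anchors modules; in a real app you'd trigger anchor creation from a select event rather than on every frame.

```typescript
// Sketch: viewer-space hit testing plus an anchor at the hit pose, so placed
// content stays world-locked across frames.
let hitTestSource: XRHitTestSource | undefined;

async function setupHitTest(session: XRSession): Promise<void> {
  const viewerSpace = await session.requestReferenceSpace('viewer');
  // requestHitTestSource is optional in the spec, hence the guard.
  if (session.requestHitTestSource) {
    hitTestSource = await session.requestHitTestSource({ space: viewerSpace });
  }
}

function onXRFrame(time: number, frame: XRFrame): void {
  if (!hitTestSource) return;

  const [hit] = frame.getHitTestResults(hitTestSource);
  if (!hit?.createAnchor) return; // createAnchor comes from the anchors module.

  // Anchor the first hit; resolve anchor.anchorSpace against your reference
  // space each frame to keep virtual content pinned to the real surface.
  hit.createAnchor().then((anchor) => {
    console.log('Anchor created at hit location', anchor);
  });
}
```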
If you’re new to building MR experiences with WebXR, you can use
Reality Accelerator Toolkit to help integrate MR features in your app. This toolkit enhances the widely used Three.js 3D library, introducing bindings for XRPlane, XRAnchor, HitTestSource, and XRMesh, and handling the underlying WebXR updates for you. By encapsulating these features within the Object3D interface, Reality Accelerator Toolkit lets you effortlessly incorporate physical-world elements like planes and meshes directly into your Three.js scene graph.
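To make that concrete, here's a minimal TypeScript sketch of wiring the toolkit into a Three.js app. It follows the usage pattern from the toolkit's repository (a RealityAccelerator attached to the renderer's XR manager, with its root added to your scene), but treat the package name and exact members like onPlaneAdded as assumptions to verify against the README.

```typescript
// Minimal sketch wiring Reality Accelerator Toolkit into a Three.js scene.
// The RealityAccelerator entry point, root object, update() call, and
// onPlaneAdded hook follow the toolkit's published usage pattern; verify
// exact names (and the 'ratk' package name) against its README.
import * as THREE from 'three';
import { RealityAccelerator } from 'ratk';

const renderer = new THREE.WebGLRenderer({ antialias: true });
renderer.xr.enabled = true;
document.body.appendChild(renderer.domElement);

const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera();

const ratk = new RealityAccelerator(renderer.xr);
scene.add(ratk.root); // Tracked planes, meshes, and anchors live under here.

ratk.onPlaneAdded = (plane) => {
  // Each detected plane arrives as an Object3D already placed in the scene.
  console.log('Plane detected:', plane);
};

renderer.setAnimationLoop(() => {
  ratk.update(); // Sync tracked objects with the latest XRFrame.
  renderer.render(scene, camera);
});
```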
Looking Ahead
Major hardware upgrades are on the way with
Quest 3, and thanks to a variety of capabilities and tools available to help you integrate the newest features, there’s never been a better time to build MR experiences.