This year at GDC we rolled out the first Product First Experiences at our expo booth. While we featured a handful of groundbreaking products, it was the Meta XR Simulator that took the spotlight. To help more developers access this powerful tool, we recently expanded support to include both PC and Mac devices. This means you have even greater flexibility to achieve rapid iteration and testing on a device of your choosing.
If you missed us at GDC, dive in to learn more about how this amazing Meta tool can accelerate your build process.
Accelerate How You Build for MR
MR is the “next frontier of immersive simulation technology” because it presents new possibilities to enrich your apps and engage your users in immersive content within their physical environment. To help usher in this technology, we've been looking at ways our tools can accelerate iteration and testing to make it easier for developers of all skill levels to create incredible MR experiences. With this goal in mind, we created Meta XR Simulator, a lightweight OpenXR runtime that lets you simulate Meta Quest devices and features on an API level and offers a variety of benefits when building for both VR and MR.
Now you can test and improve the mechanics, design, and overall user experience of your apps without a physical device. The tool’s ability to streamline iteration, scale automation and simplify testing environment setup eliminates the need to put on and take off your headset until the final testing stages. This means that over time, you can save considerable time and energy during day-to-day development.
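To make the “API level” point concrete for native developers, here is a minimal sketch of what that means in practice: because Meta XR Simulator is an OpenXR runtime, the same standard OpenXR bootstrap code runs against it or against a physical Quest, with the OpenXR loader deciding which runtime answers. This is ordinary OpenXR usage rather than Meta-specific sample code, and it assumes your desktop's OpenXR loader is set to the simulator.

```cpp
// Minimal sketch: the same native OpenXR bootstrap runs against Meta XR Simulator
// or a physical Quest, because the simulator is just another OpenXR runtime that
// the loader selects. Error handling is trimmed for brevity.
#include <openxr/openxr.h>
#include <cstring>
#include <cstdio>

int main() {
    // Describe the application to whichever runtime the loader picks.
    XrInstanceCreateInfo createInfo{XR_TYPE_INSTANCE_CREATE_INFO};
    std::strncpy(createInfo.applicationInfo.applicationName, "SimulatorSmokeTest",
                 XR_MAX_APPLICATION_NAME_SIZE - 1);
    createInfo.applicationInfo.apiVersion = XR_CURRENT_API_VERSION;

    XrInstance instance = XR_NULL_HANDLE;
    if (XR_FAILED(xrCreateInstance(&createInfo, &instance))) {
        std::printf("No active OpenXR runtime found.\n");
        return 1;
    }

    // Ask for a head-mounted display system; the simulator answers this the
    // same way a real headset runtime would.
    XrSystemGetInfo systemInfo{XR_TYPE_SYSTEM_GET_INFO};
    systemInfo.formFactor = XR_FORM_FACTOR_HEAD_MOUNTED_DISPLAY;
    XrSystemId systemId = XR_NULL_SYSTEM_ID;
    xrGetSystem(instance, &systemInfo, &systemId);

    xrDestroyInstance(instance);
    return 0;
}
```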
Develop for Quest with PC or Mac
We’re committed to expanding our ecosystem to support developers building on devices they’re familiar with so they can develop high-quality apps with greater ease and convenience. Now with our latest update, developers building on Mac devices can choose to iterate and test apps directly on-device or enjoy low-friction workflows with XR Simulator on Unity and native OpenXR (note: Meta Quest Link is only supported on PC).
Start with Our New Meta XR Sim Tutorial Video
In our tutorial video, you’ll learn how to get set up with Meta XR Simulator in Unity and Unreal so you can start simulating VR and MR features, synthetic environments, and controller inputs on your Mac or PC desktop. The tutorial also walks you through iterating and testing both single-player and multiplayer experiences so you can spend more time building and less time debugging. Developers building native apps can also find setup instructions in the documentation, or follow these instructions if you’re using a Mac.
Read on below to get ready to build, test, and prototype rapidly with Meta XR Simulator.
Replace Mobile and PC XR Runtimes for Unity and Unreal
You can access Meta XR Simulator as part of the Meta XR All-in-One SDK, or download and enable it as a standalone package for Unity, Unreal, and Native. The tool comes with an out-of-the-box, predefined input mapping schema and a user interface that provides information on how the runtime is compositing the final view, simulating input, and handling other features.
Once you’ve installed and activated the XR Simulator, you can choose the Meta Quest device you want to simulate, switch the view between eyes, get details on composition layers, check headset and controller status and record sessions for faster debugging.
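If you also want to confirm from code which system the active runtime is exposing (for example, which Quest device is being simulated), the core OpenXR call xrGetSystemProperties reports the system name and tracking capabilities. The sketch below assumes an instance and system ID obtained as in the earlier bootstrap example; it is core OpenXR, not a simulator-specific API.

```cpp
// Sketch: query the runtime for the system it exposes. Against Meta XR
// Simulator this reports the simulated device; against a headset, the real one.
// Assumes `instance` and `systemId` were obtained as in the bootstrap sketch.
#include <openxr/openxr.h>
#include <cstdio>

void PrintSystemInfo(XrInstance instance, XrSystemId systemId) {
    XrSystemProperties props{XR_TYPE_SYSTEM_PROPERTIES};
    if (XR_SUCCEEDED(xrGetSystemProperties(instance, systemId, &props))) {
        std::printf("System: %s (vendor 0x%x)\n", props.systemName, props.vendorId);
        std::printf("Orientation tracking: %d, position tracking: %d\n",
                    props.trackingProperties.orientationTracking,
                    props.trackingProperties.positionTracking);
    }
}
```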
Test Your Project Using Multiple Inputs
Meta XR Simulator allows you to test both VR and MR projects, so you can check movement and explore the relationship between virtual objects and physical spaces. We know that developing for MR presents unique challenges when building for 6DOF controllers, which is why we recently launched the Data Forwarding update.
Data Forwarding lets you control Meta XR Simulator with Meta Quest controllers by connecting a Quest headset to your desktop, so you no longer need to don and doff the headset between tests. After activating Data Forwarding from MQDH or the in-headset app panel, you can use complex inputs interchangeably from Quest controllers, game console controllers, or your mouse and keyboard.
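One way to take advantage of this interchangeability in a native app is to declare input once through standard OpenXR actions and let the runtime map whichever source is active onto them. The sketch below is a minimal example of that pattern: it binds a boolean action to the Touch controller profile, on the assumption that the simulator's predefined input schema (or Data Forwarding) feeds keyboard, gamepad, or forwarded controller input into those same bindings.

```cpp
// Sketch: declare input once via OpenXR actions. The same action can fire whether
// the simulator maps it from a forwarded Quest controller, a gamepad, or the
// keyboard/mouse schema. Error handling omitted for brevity.
#include <openxr/openxr.h>
#include <cstring>

XrAction CreateSelectAction(XrInstance instance, XrActionSet* outActionSet) {
    // Action set that will hold our gameplay actions.
    XrActionSetCreateInfo setInfo{XR_TYPE_ACTION_SET_CREATE_INFO};
    std::strcpy(setInfo.actionSetName, "gameplay");
    std::strcpy(setInfo.localizedActionSetName, "Gameplay");
    xrCreateActionSet(instance, &setInfo, outActionSet);

    // A boolean "select" action, independent of the physical input device.
    XrActionCreateInfo actionInfo{XR_TYPE_ACTION_CREATE_INFO};
    actionInfo.actionType = XR_ACTION_TYPE_BOOLEAN_INPUT;
    std::strcpy(actionInfo.actionName, "select");
    std::strcpy(actionInfo.localizedActionName, "Select");
    XrAction selectAction = XR_NULL_HANDLE;
    xrCreateAction(*outActionSet, &actionInfo, &selectAction);

    // Suggest a binding for the Touch controller profile; the runtime decides
    // how other input sources map onto it.
    XrPath profilePath, aClickPath;
    xrStringToPath(instance, "/interaction_profiles/oculus/touch_controller", &profilePath);
    xrStringToPath(instance, "/user/hand/right/input/a/click", &aClickPath);

    XrActionSuggestedBinding binding{selectAction, aClickPath};
    XrInteractionProfileSuggestedBinding suggested{XR_TYPE_INTERACTION_PROFILE_SUGGESTED_BINDING};
    suggested.interactionProfile = profilePath;
    suggested.countSuggestedBindings = 1;
    suggested.suggestedBindings = &binding;
    xrSuggestInteractionProfileBindings(instance, &suggested);

    return selectAction;
}
```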
Support MR Development with the Synthetic Environment Server (SES)
Meta XR Simulator comes loaded with three primary synthetic environments that simulate the physical world.
We’re also introducing eight new synthetic room environments with v66 so you can test your project in spaces with a variety of dimensions.
- Room with staircase
- Office
- Trapezoidal Room
- Corridor
- Small Room with Numerous Furniture
- Living Room with Multiple Spaces
- Shape Room
- High Ceiling Room
Using these environments enables you to test and fine-tune your use of Presence Platform features like Scene, Passthrough, and Anchors, which are core components of most MR experiences.
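Before relying on those features, it can be useful to confirm that the runtime you are running against (simulator or headset) actually exposes the OpenXR extensions behind them. Engine integrations normally handle this check for you, so the sketch below is just an illustration using the core enumeration call, with XR_FB_passthrough and XR_FB_scene as the extension names checked.

```cpp
// Sketch: check whether the active runtime (Meta XR Simulator or a headset)
// exposes the extensions behind Passthrough and Scene before enabling them.
#include <openxr/openxr.h>
#include <cstring>
#include <cstdio>
#include <vector>

bool RuntimeSupportsExtension(const char* name) {
    uint32_t count = 0;
    xrEnumerateInstanceExtensionProperties(nullptr, 0, &count, nullptr);
    std::vector<XrExtensionProperties> props(
        count, XrExtensionProperties{XR_TYPE_EXTENSION_PROPERTIES});
    xrEnumerateInstanceExtensionProperties(nullptr, count, &count, props.data());
    for (const auto& p : props) {
        if (std::strcmp(p.extensionName, name) == 0) return true;
    }
    return false;
}

int main() {
    std::printf("Passthrough: %s\n", RuntimeSupportsExtension("XR_FB_passthrough") ? "yes" : "no");
    std::printf("Scene:       %s\n", RuntimeSupportsExtension("XR_FB_scene") ? "yes" : "no");
    return 0;
}
```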
Test Multiplayer Apps with a Single User
Meta XR Simulator allows you to rapidly test your multiplayer MR game without bringing in additional users for a test session. Here’s how to get started testing multiplayer apps with a single user:
1. Start a Synthetic Environment Server and keep it running in the background.
2. Launch multiple Simulator instances at the same time using one of the following options:
- Clone your project: clone your project so each copy can be opened in a separate Unity Editor window, then launch Meta XR Sim in each window by entering Play mode.
- Standalone binary: build your application as a standalone binary on Windows and set the system’s active XR runtime to Meta XR Sim (see the launcher sketch after this list).
3. Coordinate input for all players by providing input manually or by using the record and replay functionality.
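For the standalone-binary route, one way to script the launch step is a tiny launcher that points the OpenXR loader at the simulator via the standard XR_RUNTIME_JSON override and then spawns several copies of your build. This is only a sketch: the manifest path and binary name below are placeholders for your own install and app, and Meta XR Simulator's activation tooling or MQDH can set the active runtime for you instead.

```cpp
// Sketch (Windows): launch several standalone builds against Meta XR Simulator.
// XR_RUNTIME_JSON is the standard OpenXR loader override for selecting a runtime;
// the manifest path and binary name are placeholders for your own install/build.
#include <cstdlib>

int main() {
    // Placeholder path to the simulator's OpenXR runtime manifest.
    _putenv_s("XR_RUNTIME_JSON", "C:\\path\\to\\meta_xr_simulator\\runtime.json");

    const int instances = 2;  // one per simulated player
    for (int i = 0; i < instances; ++i) {
        // Child processes inherit the environment, so each copy of the app
        // resolves to the simulator runtime instead of the headset runtime.
        std::system("start \"\" \"MyMultiplayerApp.exe\"");
    }
    return 0;
}
```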
For additional resources on building for MR, check out our latest Mixed Reality showcase, Phanto. This project demonstrates how to create action-packed MR gameplay with Presence Platform features such as Scene Mesh, Passthrough, and Depth API.
Achieve Automated Testing with Session Capture
Prior to launch, fixing performance issues via frequent testing of single and multiplayer apps is critical for delivering a smooth user experience. Meta XR Simulator’s Session Capture features (record and replay) let you capture a sequence of inputs and replay it so every test run exercises your project against the same scenario. You can also enable automated replays, which run and close in quick succession for faster debugging with minimal manual input.
Get Started Today with Meta XR Sim
To get started with faster workflows, streamlined iteration, and automated testing, please reference the tutorial video and Meta XR Simulator documentation (Unity | Unreal | Native). You can also visit this page to find the direct download of the latest version along with updated release notes. XR Sim is available for use with Mac and PC.