Faster Iteration for Quest Developers with Meta XR Simulator (Experimental)

Updated October 2023: Meta XR Simulator is now out of preview, having been fully released at Connect 2023. Meta XR Simulator makes it easier to do day-to-day development by enabling you to test and debug your apps without needing to put on your headset every time. It also supports MR development via the Synthetic Environment Server (SES), which loads synthetic environments that simulate the physical world, as well as Presence Platform features like Scene, Passthrough, and Anchors. Meta XR Simulator supports Unity, Unreal, and Native development. To get started with the latest Meta XR Simulator, please reference the documentation (Unity | Unreal | Native).
At Meta Connect 2022, we were thrilled to give you a preview of Meta XR Simulator. Since then, we’ve been busy fine-tuning things to make sure it’s immediately impactful and effective for our developer community. And today, we’re excited to announce the experimental release of Meta XR Simulator for Unity with SDK v49.
Meta XR Simulator is a lightweight OpenXR runtime that lets you simulate Meta VR devices and features on the API level—saving you time and energy by achieving faster iteration and scalable automation. Below we dive into how it can improve your development cycle with key functions.

Faster Iteration

Meta XR Simulator makes it easier to do day-to-day development by enabling you to test and debug your apps without needing to don and doff your headset every time. Now, you can quickly test features and functionality on your desktop, with the ability to choose between Meta Quest 2, Meta Quest Pro, and Rift S devices.
You can easily test and improve the mechanics, design, and overall user experience of your apps with Meta XR Simulator. The tool simulates the motion and input of both Meta Quest headsets and Touch controllers via keyboard and mouse input (or a game controller), making it easy to quickly iterate on your app experience. In addition to a more streamlined iteration process, the tool also allows you to simulate headset attributes like accurate field of view and correct display resolution for each VR device.

Scalable Automation

Whether you’re new to VR development or an industry veteran, Meta XR Simulator can help you scale automation by simplifying your testing environment setup. Its range of features lets you:
  • List all the composition layers, with pose/extent data and a thumbnail of each layer’s swapchain
  • Preview a composition layer in single-eye or stereo mode
  • Record and replay head poses and input actions
After importing v49 and activating the OpenXR backend of the OVRPlugin, you’ll be able to access both the simulator binary build and OpenXR runtime configuration file from within your assets folder.
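For context on what that runtime configuration file does: OpenXR applications find their runtime through the OpenXR loader, which can be redirected per process with the `XR_RUNTIME_JSON` environment variable. A minimal sketch of pointing an app at the simulator this way (the manifest filename and path below are assumptions for illustration; use the actual .json file the package places in your assets folder):

```shell
# Override the OpenXR loader's active runtime for this shell only.
# NOTE: the manifest path is illustrative; substitute the runtime
# configuration .json that ships in your project's assets folder.
export XR_RUNTIME_JSON="$HOME/MyProject/Assets/MetaXRSimulator/meta_openxr_simulator.json"
echo "OpenXR runtime override: $XR_RUNTIME_JSON"

# Any OpenXR app launched from this shell now loads the simulator
# runtime instead of the headset's, e.g.:
#   ./MyOpenXRApp
```

In Unity you normally don’t need to do this by hand, since the editor integration manages runtime selection for you, but the same mechanism is useful for automated test runs on CI machines with no headset attached.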

Getting Started

To get started with Meta XR Simulator, check out our documentation. Unreal Engine and native OpenXR support will be provided in future Oculus SDK releases.
You can access the Meta XR Simulator from the Unity editor menu, and once activated you can use it to test simple app logic, record and playback action sequences, simplify automatic testing environment setup, and more. You can also use the simulator to easily visualize the stereoscopic layer contents if you’re using OVROverlay.
Upon opening the simulator, the Debug Window is likely the first thing you’ll notice. The Debug Window can be an invaluable asset to help you run a variety of useful functions during iteration and testing.
[Image: The Meta XR Simulator Debug Window helps run a variety of functions during iteration.]
We’re thrilled to provide you with a simple and accessible means of simulating an in-headset environment, and we’re confident that you’ll find it valuable throughout your development cycles. The tool is currently an experimental release, and we look forward to hearing your feedback as we continue to improve its feature set and functionality over the coming months.

Boost WebXR Development with Immersive Web Emulator

If you’re a developer working with WebXR, you can also use our new Immersive Web Emulator to easily test and iterate your WebXR experiences without a physical XR device. Learn more about Immersive Web Emulator, or get started by viewing the instructions on GitHub and downloading the extension from the Chrome Web Store or Edge Add-ons.