Available for pre-order today starting at $499.99 USD, Meta Quest 3 is our next-generation headset designed to support mixed reality experiences in full color. It delivers 10x the pixels of Meta Quest 2 in Passthrough, more natural depth perception, improved hand and body tracking, and more capabilities that let you push the limits of what’s possible.
At Meta Connect, we announced a set of tools to help you go from ideation to iteration faster in Unity. Using these tools, you can empower your creative process with the latest capabilities and decrease the time you need to set up and start building.
Keep reading to learn how you can accelerate development with the Project Setup Tool, Building Blocks, and Meta XR Simulator, and how these tools can work together to enhance your apps.
Reduce Debug and Setup Time with the Project Setup Tool for Unity
Launched with Integration SDK v49, the Project Setup Tool was designed to help you spend less time setting up our SDKs and more time building. It automatically alerts you about issues and helps you fix them with one click, thanks to its large library of automatic fixes.
To give you visibility into any compatibility, rendering, or SDK issues impacting your project, the Project Setup Tool’s alerts are also categorized by urgency, helping you prioritize what’s required, recommended, and optional.
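To make that concrete, here is a minimal, hedged sketch of the kinds of Unity project settings the tool typically flags and fixes for Quest builds; the actual rule set is larger and varies by SDK release, and the menu path and class name below are purely illustrative. A script like this would live in an Editor folder:

```csharp
#if UNITY_EDITOR
using UnityEditor;
using UnityEngine;

// Illustrative only: settings of the kind the Project Setup Tool checks and
// fixes with one click. The tool's real rule set is larger and version-dependent.
public static class ExampleQuestProjectSettings
{
    [MenuItem("Tools/Apply Example Quest Settings")]
    public static void Apply()
    {
        // Linear color space is recommended for Quest rendering.
        PlayerSettings.colorSpace = ColorSpace.Linear;

        // Quest builds target Android with IL2CPP and 64-bit ARM.
        PlayerSettings.SetScriptingBackend(BuildTargetGroup.Android,
                                           ScriptingImplementation.IL2CPP);
        PlayerSettings.Android.targetArchitectures = AndroidArchitecture.ARM64;

        // A minimum Android API level in line with current Quest requirements.
        PlayerSettings.Android.minSdkVersion = AndroidSdkVersions.AndroidApiLevel29;

        Debug.Log("Applied example Quest project settings.");
    }
}
#endif
```

With the Project Setup Tool, checks like these run automatically and surface as one-click fixes instead of hand-written editor scripts.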
All developers building in Unity can access the Project Setup Tool today, and we’re continuing to add more automatic fixes and supported APIs with every new SDK release. To get started, please visit the documentation.
Drag-and-Drop Features to Enrich Your Experience with Building Blocks
Building Blocks is a new experimental feature with Integration SDK v57 designed to help you easily discover and add Presence Platform features like Passthrough, hand tracking, and much more to your apps.
Building Blocks helps you get up and running faster thanks to a library of capabilities that you can simply drag and drop into your project, giving you a robust solution both for rapid prototyping and for adding new features to a production build. The tool gives developers building in Unity access to a set of “blocks” containing the features that power MR, interactions, and social presence on Meta Quest headsets, so with just a few clicks you have what you need to create incredible immersive and mixed reality experiences.
Feature descriptions and the ability to search and filter blocks can help you discover new features to enrich your current app. At launch, Building Blocks will include support for 13 blocks:
- Passthrough
- Controller Tracking
- Hand Tracking
- Camera Rig
- Surface Projected Passthrough
- Room Model
- Eye Gaze
- Virtual Keyboard
- Synthetic Hands
- Throwable Item
- Pointable Item
- Pokeable Item
- Grabbable Item
Once you find a block you want to add, you can drag and drop it into your project and get set up in three easy steps. Here’s how it works:
Blocks are implemented as ScriptableObjects, and each contains dependencies, setup rules, unit tests, and more to ensure that you have the feature logic you need and that features work both independently and in combination with others. Designing experiences that leverage physical and virtual spaces can be difficult, but the provided blocks let you combine features with greater ease, even for builds already in production.
This also makes iteration more enjoyable by letting you jumpstart a project with a working slate of installed features, then customize and configure them to meet the unique needs of your app without spending excessive time coding.
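To picture what a block amounts to, here is a hypothetical sketch: a ScriptableObject that bundles a feature’s scene-setup logic. The real Building Blocks types inside the SDK are internal and will differ from this; OVRPassthroughLayer and OVRCameraRig are real Integration SDK components, while the class name, menu path, and Install method below are illustrative only.

```csharp
using UnityEngine;

// Hypothetical sketch of the "block" concept: a ScriptableObject that bundles
// the setup logic for one feature. Not the SDK's actual implementation.
[CreateAssetMenu(menuName = "Examples/Passthrough Block (Sketch)")]
public class PassthroughBlockSketch : ScriptableObject
{
    [Tooltip("Rig GameObject the block installs its components onto, e.g. an OVRCameraRig.")]
    public GameObject targetRig;

    // "Installing" the block adds and wires up the components you would
    // otherwise configure by hand.
    public void Install()
    {
        if (targetRig == null)
        {
            Debug.LogWarning("Assign a target rig before installing this block.");
            return;
        }

        if (targetRig.GetComponent<OVRPassthroughLayer>() == null)
        {
            // OVRPassthroughLayer is the Integration SDK component that renders
            // the Passthrough feed; Passthrough support also needs to be enabled
            // on OVRManager and in the project settings.
            targetRig.AddComponent<OVRPassthroughLayer>();
        }
    }
}
```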
Building Blocks will be available for experimental use with Integration SDK v57, and in the coming months we’ll add support for more blocks, automatic testing, an improved UI, and other improvements. Be sure to check back with the launch of SDK v57 to find links to documentation and get started using Building Blocks.
Faster Developer Iteration with Meta XR Simulator
Meta XR Simulator (Unity | Unreal | Native) lets you quickly and easily test and improve the mechanics, design, and overall user experience of your apps without needing a physical device. It’s a lightweight XR runtime that simulates Meta Quest devices and features at the API level, saving you time and energy through faster iteration and scalable automation. Meta XR Simulator makes day-to-day development easier by enabling you to test and debug your apps without putting on your headset every time.
The tool simulates the motion and input of Quest headsets and Touch controllers via keyboard and mouse (or a game controller), making it easy to iterate on your app quickly. It also lets you simulate headset attributes like the accurate field of view and correct display resolution of each device. And because multiple instances can run on a single machine, a solo developer can simulate and test multi-player interactions, making Meta XR Simulator an invaluable tool for social apps.
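Because the simulator implements the same runtime interface the headset exposes, gameplay code written against the Integration SDK needs no changes to run under it. As a small example (assuming an OVRCameraRig/OVRManager is present in the scene so controller state is updated), standard OVRInput polling like the script below behaves the same whether the input comes from Touch controllers or from the simulator’s keyboard, mouse, or gamepad mapping:

```csharp
using UnityEngine;

// Standard Integration SDK input polling. The same script runs on-device and
// in Meta XR Simulator, where keyboard/mouse or a gamepad drives the
// simulated Touch controllers.
public class GrabOnTrigger : MonoBehaviour
{
    void Update()
    {
        // Right-hand index trigger, reported as a 0..1 axis.
        float trigger = OVRInput.Get(OVRInput.Axis1D.SecondaryIndexTrigger);
        if (trigger > 0.8f)
        {
            Debug.Log("Trigger squeezed: start grab.");
        }

        // A-button press on the right Touch controller.
        if (OVRInput.GetDown(OVRInput.Button.One))
        {
            Debug.Log("A button pressed.");
        }
    }
}
```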
Meta XR Simulator also supports MR development via the Synthetic Environment Server (SES), which loads synthetic environments that simulate the physical world and supports Presence Platform features like Scene, Passthrough, and Anchors. MR presents new possibilities to enrich your apps with unique ways for your audience to see and experience the world around them, and Meta XR Simulator can help you tune your gameplay across a variety of environments.
Meta XR Simulator supports Unity, Unreal, and Native development. To get started with Meta XR Simulator, please reference the documentation (Unity | Unreal | Native).
How These Tools Can Work Together
Starting and iterating on an MR project becomes a more flexible and efficient process when you use these tools together. For example, once you’ve installed the Integration SDK and Meta XR Simulator, you can start a new Unity project and use the Project Setup Tool to confirm everything is set up correctly or to address specific issues that could prevent your project from running.
Next, you can launch Meta XR Simulator and set up the SES so your project receives room environment data; the Meta XR Simulator menu also provides several room options to choose from. After your environment is set up, you can use Building Blocks to drag and drop feature blocks like Passthrough, hand tracking, and many others into your project. Adding the Room Model block can also help you visualize how room data is constructed by automatically marking all the scene anchors in the environment with visible meshes (see the sketch below for one way to inspect those anchors from code). Now you’re free to explore your MR environment like you would in-headset, using only your mouse and keyboard or a game controller.
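As a hedged illustration of that last step, the snippet below logs the scene anchors that the Room Model block (or the SES, when running in Meta XR Simulator) makes available. It assumes the Integration SDK’s OVRSceneAnchor and OVRSemanticClassification components; member names can shift between SDK versions, so treat this as a sketch rather than a drop-in utility.

```csharp
using UnityEngine;

// Debug helper: lists the scene anchors currently loaded in the scene, with
// their semantic labels (walls, ceiling, tables, and so on) where present.
public class LogSceneAnchors : MonoBehaviour
{
    [ContextMenu("Log Scene Anchors")]
    public void LogAll()
    {
        var anchors = FindObjectsOfType<OVRSceneAnchor>();
        Debug.Log($"Found {anchors.Length} scene anchors.");

        foreach (var anchor in anchors)
        {
            var classification = anchor.GetComponent<OVRSemanticClassification>();
            string labels = classification != null
                ? string.Join(", ", classification.Labels)
                : "(unclassified)";
            Debug.Log($"{anchor.name}: {labels} at {anchor.transform.position}");
        }
    }
}
```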
While testing in simulated environments works great when building for MR, testing performance and interactions that use tracking capabilities is best done through a headset-based workflow. Thankfully, if you’re looking to integrate or iterate on interactions in your app, you can drag Interaction SDK features like hand tracking into your project and launch Quest Link to test them out in-headset.
Grow Your Audience with AR Foundation
In partnership with Unity, AR Foundation now offers support for Quest 3 and Quest Pro. With AR Foundation, you can build MR apps once and release them on any ecosystem of your choice, not just Meta Quest. Creating MR apps for multiple platforms becomes easier thanks to hooks, which let you simply update your package to integrate missing platform-specific features as they’re added. Learn more about AR Foundation here.
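As a hedged illustration of the write-once idea, the plane-detection script below is plain AR Foundation code with nothing Quest-specific in it; with the appropriate provider packages installed, the same script runs on ARCore, ARKit, or Quest. It is written against the AR Foundation 5.x API (the planesChanged event is renamed in later major versions).

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Cross-platform AR Foundation example: logs newly detected planes.
// No Quest-specific code; the installed provider package decides where it runs.
[RequireComponent(typeof(ARPlaneManager))]
public class PlaneLogger : MonoBehaviour
{
    ARPlaneManager _planeManager;

    void OnEnable()
    {
        _planeManager = GetComponent<ARPlaneManager>();
        _planeManager.planesChanged += OnPlanesChanged;
    }

    void OnDisable()
    {
        _planeManager.planesChanged -= OnPlanesChanged;
    }

    void OnPlanesChanged(ARPlanesChangedEventArgs args)
    {
        foreach (ARPlane plane in args.added)
        {
            Debug.Log($"Detected plane {plane.trackableId} " +
                      $"({plane.size.x:F2} x {plane.size.y:F2} m)");
        }
    }
}
```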
Looking Ahead
Mixed reality and Presence Platform’s capabilities present new possibilities for engaging and immersing your audience, and we’re excited to offer you a variety of tools to accelerate your development with this technology. We’ll continue to release and update our tools to make developing for Meta Quest a more flexible, streamlined, and enjoyable experience. For developer news and updates, be sure to follow us on X/Twitter and Facebook.