At Meta Connect, our executives and product experts took you on a deep dive into several technologies shaping the future—and one area that truly captured the spotlight was mixed reality. New capabilities and advances are enabling developers of all skill levels and areas of expertise to build even more dynamic experiences.
We’ve invested in expanding our suite of developer tools and capabilities to help you streamline development and innovate with groundbreaking mixed reality technologies, whether you’re new to mixed reality or a seasoned pro. The library of mixed reality experiences available on Meta Horizon OS is expanding to include opportunities for developers with experience building for mobile, web, and other platforms.
Below, we take a closer look at why building mixed reality experiences is easier to start and more rewarding than ever before. Explore announcements from Connect 2024 that show how new improvements, tools, and capabilities can elevate your vision and help you build it out faster.
What’s in store:
- Latest improvements in mixed reality
- New features and capabilities
- Tools and features to build faster for mixed reality
- Interaction SDK and Haptics-related updates
- Build multiplayer mixed reality experiences with Building Blocks
- Performance enhancements
- Market your mixed reality app with LIV SDK
Leverage the Latest Improvements in Mixed Reality
Since last year’s Connect, we’ve made major updates to help you develop mixed reality experiences that feel more dynamic, realistic, and enjoyable without sacrificing performance.
Here’s a brief recap:
- Passthrough improvements: Over the past year, we’ve upgraded Passthrough to deliver better resolution, comfort, color tuning, and alignment between virtual and Passthrough hands. Users will now see their physical environment with reduced image and hand distortion, smoother interactions, more balanced color representation, and better overall quality.
- Depth API compute improvements: The latest Depth API gives developers the best of both worlds: higher-quality occlusions at lower compute cost and latency. With improved visual quality, latency reduced from 77ms to 63ms, and a 12% decrease in GPU usage, Depth API can help your mixed reality experiences blend seamlessly with the physical world.
- Streamline development for large-scale, multi-room experiences: We recently added the ability to generate a navigation mesh across multiple rooms with the Mixed Reality Utility Kit (MRUK). MRUK facilitates navigation and simulation when building apps that span large-scale, multi-room environments, so you can easily implement virtual elements like characters, pets, and objects that move across one or more rooms. In particular, MRUK saves you time by letting you load multiple rooms in a single function call and merge them into one play area (see the sketch below).
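To make that concrete, here’s a minimal Unity sketch of iterating over every loaded room after a single scene load. It assumes MRUK’s singleton-style API (the Meta.XR.MRUtilityKit namespace, MRUK.Instance, the Rooms list, and per-room Anchors); exact names can shift between MRUK versions, so verify against the documentation.

```csharp
using Meta.XR.MRUtilityKit; // MRUK package; namespace assumed, check your version
using UnityEngine;

public class MultiRoomSetup : MonoBehaviour
{
    // Wire this to the MRUK component's Scene Loaded event in the inspector.
    public void OnSceneLoaded()
    {
        // One load call brings in every room the user has captured;
        // treat them together as a single, merged play area.
        foreach (MRUKRoom room in MRUK.Instance.Rooms)
        {
            Debug.Log($"Loaded room with {room.Anchors.Count} scene anchors");
            // e.g. bake each room's floors and walls into a shared NavMesh here
        }
    }
}
```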
Enrich Your Apps with New Capabilities
We shared the newest additions to our expanding suite of mixed reality capabilities that allow you to support more vibrant expressions, fluid interactions, and realistic spatial sound.
- Audio-to-Expressions: Audio-to-Expressions uses an app’s audio stream to drive full facial animation, no face tracking required. You can use this technology to produce complete facial representations of people from their voices alone, creating experiences where people feel more connected and present with each other. While users are speaking, laughing, or even coughing, their expressions become more meaningful and indicative of their emotion.
- Acoustic Ray Tracing (ART): ART was designed specifically to meet the challenges of bringing accurate, immersive acoustics to any virtual environment. This feature simulates how sound waves actually propagate through space to help you produce more encompassing, accurate, and natural spatial audio environments. With easy implementation and flexible tuning controls, ART helps you make your app sound just right, even if you don’t have a background in audio or sound design. To get started with ART, visit the documentation (Unity: Native, FMOD, Wwise | Unreal: Native, FMOD, Wwise).
- Microgestures: Microgestures expand the possibilities and ease of hand interactions in mixed reality by recognizing finger-based gestures, such as thumb swipes and taps along the side of the index finger, to trigger discrete actions. These gestures can be leveraged in scenarios like 2D menu navigation, UI shortcuts, and locomotion to improve comfort and ease of use. Stay tuned for the launch of microgestures later this year (a hypothetical usage sketch follows the example below).
- Passthrough Camera Access API: Passthrough camera access is coming to Meta Horizon OS early next year. Access to the camera feed unlocks a wide variety of new experiences in mixed reality, including object tracking, scene understanding, and more. We’ll share more soon, and we can’t wait to see what you build with it.
Example featuring locomotion triggered by microgestures
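Since microgestures haven’t shipped yet, there’s no public API to show. As a purely hypothetical sketch of the locomotion scenario above, here’s how a thumb-swipe gesture might drive snap turning in Unity; every type and method below (MicrogestureInput, MicrogestureKind, Handedness) is an invented placeholder, not the real SDK surface.

```csharp
using UnityEngine;

// Placeholder stand-ins for the not-yet-released microgestures API.
enum Handedness { Left, Right }
enum MicrogestureKind { None, ThumbTap, ThumbSwipeLeft, ThumbSwipeRight }

static class MicrogestureInput
{
    // Stub: swap in the real SDK call once microgestures launch.
    public static MicrogestureKind Poll(Handedness hand) => MicrogestureKind.None;
}

public class MicrogestureLocomotion : MonoBehaviour
{
    [SerializeField] private Transform playerRig;     // camera rig root
    [SerializeField] private float snapTurnDegrees = 45f;

    void Update()
    {
        // Poll the most recent microgesture on the right hand and map
        // swipes along the index finger to comfortable snap turns.
        var gesture = MicrogestureInput.Poll(Handedness.Right);

        if (gesture == MicrogestureKind.ThumbSwipeLeft)
            playerRig.Rotate(0f, -snapTurnDegrees, 0f);
        else if (gesture == MicrogestureKind.ThumbSwipeRight)
            playerRig.Rotate(0f, snapTurnDegrees, 0f);
    }
}
```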
Build Mixed Reality Experiences Faster with New Features and Support
During the developer keynote, we announced that Meta Quest devices are Unity 6 ready, with a new build target and cache improvements that reduce build times by up to 90%. Launching in Unity 6.1, Unity will also add a dedicated build profile for Meta Quest, making it significantly easier and faster to configure your projects. The process will be cut from 36 steps to just seven: when you choose the new platform from the platform browser, Unity 6 provides sensible defaults for platform build settings, preconfigured build profile overrides, and recommended and required packages.
Building mixed reality experiences doesn’t require extensive experience, even if you’re relatively new to developing immersive apps. We’ve introduced several new tools and resources to accelerate the process of building both single-player and multiplayer mixed reality apps, including MRUK (Unity | Unreal), a rich set of utilities and tools built on top of Scene API and designed to perform common operations.
At Connect, we introduced new additions to MRUK that further streamline mixed reality development so you can focus on what makes your app unique.
- Destructible Mesh: This feature enables you to turn a room mesh into a destructible object with endless possibilities.
- Instant Placement: This feature allows you to place panels or other objects directly on the mesh with almost no setup.
Together, these features help you develop more efficiently against users’ physical environments and get more creative with how users interact with them.
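As a rough illustration of the instant-placement idea, the sketch below raycasts from a controller against the loaded room and drops a panel prefab flush with whatever surface it hits. The MRUK namespace and the room Raycast overload are assumptions based on the MRUK documentation, so double-check the exact signature for your version.

```csharp
using Meta.XR.MRUtilityKit; // assumed MRUK namespace
using UnityEngine;

public class PanelPlacer : MonoBehaviour
{
    [SerializeField] private Transform pointer;      // e.g. controller transform
    [SerializeField] private GameObject panelPrefab; // panel to place on the mesh

    void Update()
    {
        if (!OVRInput.GetDown(OVRInput.Button.PrimaryIndexTrigger)) return;

        var ray = new Ray(pointer.position, pointer.forward);

        // Assumed overload: cast against the current room's scene geometry.
        if (MRUK.Instance.GetCurrentRoom().Raycast(ray, 10f, out RaycastHit hit))
        {
            // Orient the panel flush with the surface it landed on.
            Instantiate(panelPrefab, hit.point, Quaternion.LookRotation(hit.normal));
        }
    }
}
```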
In addition, we showcased Immersive Debugger in our workshops and sessions.
This convenient companion tool lets you debug without taking off your headset, helping to cut down iteration time across different roles in your organization.
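For a sense of how lightweight that is, the Immersive Debugger can surface your own fields in its in-headset panel through attribute annotations. The sketch below follows the attribute names in the Meta XR SDK documentation (DebugMember and its Tweakable/Category parameters), but treat it as illustrative and confirm against your SDK version.

```csharp
using Meta.XR.ImmersiveDebugger; // Meta XR SDK; namespace assumed, check your version
using UnityEngine;

public class WaveSpawner : MonoBehaviour
{
    // Annotated fields show up in the in-headset debug panel, where they
    // can be watched and tweaked live without removing the HMD.
    [DebugMember(Tweakable = true, Min = 0.5f, Max = 10f, Category = "Spawning")]
    private float spawnInterval = 2.0f;

    [DebugMember(Category = "Spawning")]
    private int aliveCount;

    void Update()
    {
        // ...spawning logic that reads spawnInterval and updates aliveCount
    }
}
```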
Quickly Enhance Your Unity- and Unreal-Based Mixed Reality Experiences with Interaction SDK
Of course, mixed reality also involves interactions between users and virtual elements, including UIs. To help you create high-quality UI that delivers intuitive visual feedback, robust input modality support, and consistent interaction behaviors, we're excited to announce the launch of Interaction SDK UI sets in October on Unity and Figma.
These UI sets provide essential components and patterns that support both controller and hand interactions, aligned with the Meta Quest design system. Plus, with theming capability, you can easily customize the UI sets to reflect your app’s unique branding.
Previously, making a canvas pokeable or an object grabbable in Interaction SDK was a time-consuming process that involved wiring up individual components or inserting them into prefabs. To help you expedite interaction implementation, we recently launched Interaction SDK Quick Actions. This feature lets you add interactions to your Unity scenes via a right-click context menu. Get started with Interaction SDK Quick Actions by visiting the documentation.
We’re also excited to announce that Unreal support for Interaction SDK will arrive with v69 on Unreal Engine 5.4 and later. The Interaction SDK Unreal integration works with the Meta XR Unreal Plugin to provide high-quality interactions previously only available for Unity, such as Raycast, Poke, Grab & Throw, and one- and two-handed object transformations.
In v69, we’re also introducing cross-platform support for Meta Haptics Studio (Windows | Mac), enabling you to utilize all your haptics assets for Quest devices across PCVR and other HMDs too. With Haptics Studio and Haptics SDK (Unity | Unreal | Native), you have effective solutions for heightening sensation and elevating presence within your mixed reality or VR experience.
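To give a feel for the workflow: design and export a clip in Haptics Studio, then play it back through Haptics SDK at runtime. Here’s a minimal Unity sketch using the SDK’s HapticClipPlayer (calls follow the Haptics SDK documentation; confirm them against your SDK version).

```csharp
using Oculus.Haptics; // Meta Haptics SDK for Unity
using UnityEngine;

public class ImpactHaptics : MonoBehaviour
{
    [SerializeField] private HapticClip impactClip; // .haptic asset exported from Haptics Studio
    private HapticClipPlayer player;

    void Start() => player = new HapticClipPlayer(impactClip);

    // Call this from your collision or interaction logic.
    public void OnImpact() => player.Play(Controller.Right);

    void OnDestroy() => player?.Dispose(); // release the native clip player
}
```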
Use Building Blocks to Quickly Set Up Multiplayer Mixed Reality Experiences
On the topic of interactions, there’s another important type that can help distinguish your mixed reality experience: users’ interactions with each other. With the help of our new Multiplayer Building Blocks, you can quickly drag and drop several core multiplayer features included in the Meta XR Core SDK into your Unity project, including colocation, auto-matchmaking, player voice chat, and more. We now have 30 Building Blocks across the board to help you enable a variety of common features and functionality in your Unity projects.
Leverage Performance Enhancements to Build High-Quality Mixed Reality Experiences
Performance is vital to delivering user experiences that feel smooth and support fluid interactions between virtual and physical objects. Since last year’s Connect, we’ve shipped numerous features to elevate CPU and GPU performance and help you improve the quality of your mixed reality experiences.
- Application SpaceWarp (AppSW): AppSW can be an effective tool for freeing up GPU headroom thanks to its ability to synthesize frames, but historically that came with a trade-off: noticeable artifacts, especially around fast-moving objects. We’re excited to share that we’ve substantially reduced these artifacts, making AppSW a more comfortable experience and helping developers worry less about the trade-off.
- GPU Frequency Boost on Meta Quest 3 and 3S: For apps with dynamic resolution (Unity | Unreal) enabled, we’ve been able to further boost GPU frequencies by up to 10%. All you need to do is enable dynamic resolution and reap the benefits (see the sketch after this list). The boost scales with how much thermal headroom is available: if your app has plenty of additional headroom, this feature provides a greater GPU lift than it does for apps already pushing the thermal limit.
- Unity CPU Performance with Graphics Jobs: Graphics Jobs delivers a significant FPS improvement by moving the bulk of the rendering work off the main thread and multithreading the render thread. We implemented a custom solution that offloads the main thread while avoiding the GPU cost of multithreading the render thread with secondary command buffers. You can see this feature in action starting next month with Batman: Arkham Shadow, which gains a substantial 21% FPS improvement thanks to Graphics Jobs. Be sure to enable this feature on Unity versions that include our recent fix.
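Both of those wins come down to flipping settings. As a sketch: Graphics Jobs is a standard Unity Player Setting, and dynamic resolution is a toggle on the Meta XR Core SDK’s OVRManager (the field name below is an assumption from recent SDK versions, so verify it before relying on it).

```csharp
#if UNITY_EDITOR
using UnityEditor;
#endif
using UnityEngine;

public static class QuestPerfSettings
{
#if UNITY_EDITOR
    // Editor-side: enable Graphics Jobs so rendering work moves off the main thread.
    [MenuItem("Tools/Enable Graphics Jobs")]
    public static void EnableGraphicsJobs() => PlayerSettings.graphicsJobs = true;
#endif

    // Runtime-side: opt in to dynamic resolution so the OS can apply the
    // GPU frequency boost described above. Field name assumed from the
    // Meta XR Core SDK's OVRManager; check your SDK version.
    public static void EnableDynamicResolution() =>
        OVRManager.instance.enableDynamicResolution = true;
}
```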
Capture and Market Your App Effectively with LIV SDK
As mixed reality has become more robust and better supported on Meta Quest devices, we’ve heard your feedback about the need for effective capture tools that go beyond what our Mixed Reality Capture solution offers. Thanks to our recent multi-year partnership with LIV.tv, we’re excited to share that LIV SDK is now available to help you capture authentic mixed reality content using a PC with an external camera, in both immersive VR apps and MR apps.
Now you can conveniently produce trailers, social media content, or any number of video assets with high-quality footage that showcases mixed reality capabilities, including Passthrough, Spatial Anchors, Scene, Occlusion, and more. With tools designed to help you highlight mixed reality in your app, you can effectively show off the unique aspects of your experience to a wide audience.
Dive Deeper Into the Biggest Updates from Meta Connect
We can’t wait to see how you use these tools to power your development journey and drive innovation in our shared ecosystem. To learn about more opportunities with mixed reality and Meta Horizon OS, be sure to watch the Connect developer sessions.