Prepare for Meta Connect with a Recap of the Biggest Launches and Updates from 2023
Meta Connect 2023 will take place September 27 – 28. Our product teams are busy fine-tuning a must-see lineup of sessions designed to support and elevate your experience building on the Meta Quest Platform. Register today and get ready to dive deep into topics that matter to you, like brand-new tools to streamline development, opportunities to build with world-blending capabilities, strategies to grow your business on the Meta Quest Store and App Lab, and last but certainly not least, Meta Quest 3.
Over the last year, we’ve made improvements across many core tools and capabilities that can deliver tangible, positive impacts to both your business and app experiences. To help you get ready for many of the products and topics we’ll dive into during Connect, we’ve prepared a recap of the biggest launches and updates from 2023.
Get familiar with the essential products shaping development on Quest and find links to learn more below.
Heighten Immersion with Capabilities from Presence Platform
Presence Platform is a collection of SDKs containing cutting-edge features you can use to deliver more realistic MR, interaction, and voice experiences. These tools can help you blend worlds with mixed reality and deepen the immersive nature of your app by inviting users to play, explore, and connect using multiple senses and input methods.
Hand Tracking v2.2 delivers improved hand responsiveness, including up to 40% latency reduction in typical usage and up to 75% during fast movement. Developers building fitness apps or experiences with high-intensity movements can turn on Fast Motion Mode (FMM) for even more responsiveness. In addition, with Interaction SDK v56, you can now also use Multimodal to unlock hands+controller interactions, and Capsense Hands to easily emulate hands while using controllers. To set up Hand Tracking, please visit the documentation (Unity | Unreal | OpenXR | WebXR).
Multimodal and Capsense Hands are available as experimental features with SDK v56.
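To make the hand-tracking idea concrete, here's a minimal sketch of one of the most common hand interactions, a pinch. In a WebXR frame loop you would read the thumb-tip and index-tip joint poses via `XRFrame.getJointPose()` and feed their positions into a check like this; the `Vec3` type and the 1.5 cm threshold are illustrative assumptions, not values from the SDK.

```typescript
// Hypothetical pinch detector. Positions would come from hand-tracking
// joint poses each frame; Vec3 and the threshold are illustrative.
type Vec3 = { x: number; y: number; z: number };

function distance(a: Vec3, b: Vec3): number {
  const dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
  return Math.sqrt(dx * dx + dy * dy + dz * dz);
}

// A pinch is commonly detected when the thumb and index fingertips
// come within a small threshold (here ~1.5 cm, an assumed value).
function isPinching(thumbTip: Vec3, indexTip: Vec3, threshold = 0.015): boolean {
  return distance(thumbTip, indexTip) < threshold;
}
```

The latency improvements above matter precisely for checks like this: the fresher the joint poses, the closer the detected pinch tracks the user's real fingers.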
Haptics SDK and Meta Haptics Studio give you the tools to tap into people’s sense of touch and deliver natural interactions with nuanced notifications, realistic clicks, and lifelike textures. The SDK can help you quickly test and integrate high-quality haptic experiences, while Haptics Studio, a desktop application for Mac and Windows (with an accompanying VR app), allows sound designers and developers to both create and audition haptics for Meta Quest. Developers working in Unity can get started by visiting the documentation (Haptics SDK | Haptics Studio), with support for Unreal Engine and custom game engines coming later this month.
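To give a feel for what haptic authoring involves, here's an illustrative sketch (not the Haptics SDK's API): a sharp "click" can be described as a short amplitude envelope with an instant attack and a fast exponential decay, which a haptics runtime then plays back on the controller's actuator. The function name and parameters are assumptions for illustration.

```typescript
// Illustrative only: author a "click" as amplitude samples in [0, 1]
// with full amplitude on the first sample and exponential fall-off.
// A haptics runtime would play these back on the controller actuator.
function clickEnvelope(sampleCount: number, decay = 0.5): number[] {
  const samples: number[] = [];
  for (let i = 0; i < sampleCount; i++) {
    samples.push(Math.pow(decay, i)); // 1, 0.5, 0.25, ...
  }
  return samples;
}
```

Tools like Haptics Studio let designers shape envelopes like this by ear and by touch instead of hand-tuning numbers in code.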
Shared Spatial Anchors extends the Spatial Anchors API to bring people together for local multiplayer experiences. People in the same physical space can join a shared blend of the real and virtual worlds where imagination is the only limit: they can sit at a table and play a virtual board game or engage in a slime-ball fight without the mess. To get started with local multiplayer experiences, refer to our documentation (Unity | Unreal | OpenXR).
Local multiplayer experiences built with Shared Spatial Anchors.
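The core idea behind a shared anchor can be sketched in a few lines: each headset tracks the same physical anchor in its own world coordinates, so content authored relative to the anchor lines up for everyone in the room. The types and translation-only math below are illustrative, not the SDK's API.

```typescript
// Conceptual sketch: content is authored as an offset from the anchor,
// and each device resolves that offset against where *it* tracks the
// anchor. Rotation is omitted for brevity (translation-only).
type Vec3 = { x: number; y: number; z: number };

function anchorToWorld(anchorOrigin: Vec3, localOffset: Vec3): Vec3 {
  return {
    x: anchorOrigin.x + localOffset.x,
    y: anchorOrigin.y + localOffset.y,
    z: anchorOrigin.z + localOffset.z,
  };
}
// A board-game piece authored 0.2 m in front of the anchor resolves to
// different world coordinates on each device, yet the same physical spot.
```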
Audio SDK was added to Presence Platform earlier this year to help developers properly localize sounds in 3D space—allowing users to be fully immersed in the auditory scene. Audio SDK is compatible with all of our Quest devices, as well as Android and PC VR devices from other manufacturers. Developers working in Unity can start integrating immersive audio experiences by visiting the documentation (Unity).
Universal Head-Related Transfer Function (Universal HRTF) is an upgrade of the spatial audio within Audio SDK that delivers an improved experience in two main areas: localization and frequency accuracy. Improved localization lets people more precisely detect where a sound is coming from within a given space, particularly when judging the elevation of sounds coming from above or below them. Improved frequency accuracy means that sounds are much more natural, with less coloration and filtering. Learn more about Universal HRTF by reading our blog post.
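To make "localization" concrete: HRTF rendering spatializes a sound based on its direction relative to the listener, usually expressed as azimuth (left/right angle) and elevation (up/down angle). Here's a small illustrative helper that computes those two angles, assuming the listener faces -Z (a common audio convention); it is a sketch of the concept, not Audio SDK code.

```typescript
// Compute azimuth and elevation (degrees) from listener to source,
// assuming the listener faces -Z. These are the directional cues an
// HRTF uses to place a sound in 3D space. Illustrative only.
type Vec3 = { x: number; y: number; z: number };

function directionAngles(listener: Vec3, source: Vec3) {
  const dx = source.x - listener.x;
  const dy = source.y - listener.y;
  const dz = source.z - listener.z;
  const horizontal = Math.sqrt(dx * dx + dz * dz);
  const azimuth = (Math.atan2(dx, -dz) * 180) / Math.PI;   // + = right
  const elevation = (Math.atan2(dy, horizontal) * 180) / Math.PI; // + = above
  return { azimuth, elevation };
}
```

Elevation is exactly where Universal HRTF's localization gains matter most, since up/down cues are the hardest for spatial audio to reproduce convincingly.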
Discover is a new open source showcase built in Unity that demonstrates how to use key MR capabilities, including Scene, Spatial Anchors, Shared Spatial Anchors, and Passthrough, and shows how you can quickly integrate them into your own projects. If you’re looking for more showcases to accelerate development, check out Decommissioned to see the possibilities of Quest’s social gameplay and Ultimate Glove Ball to see what it takes to build the next great esport.
Meta’s Virtual Keyboard enables a consistent, familiar, and immersive text entry experience for users and provides an easy-to-implement platform capability for developers. You can use this simple and intuitive solution in a variety of use cases, like login forms, search bars, messaging, and more. Developers working in Unity can integrate the Virtual Keyboard by visiting the documentation.
Boost Performance & Sharpen Visuals
In case you missed it: Starting with v55, Meta Quest headsets have access to new features to improve performance. We’ve been focused on opening up more power for you to work with across our family of headsets while reducing throttling, and we’re excited to share that internal tests led to significant performance increases:
CPU: Up to 26% performance increase on Quest 2 and Quest Pro
GPU: Up to 19% speed increase on Quest 2 and 11% on Quest Pro
To learn more about the performance optimizations with v55, including information about the new CPU level, please visit the documentation. Here are other tools you can use to deliver a better user experience:
Meta Quest Super Resolution is a VR-optimized edge-aware scaling and sharpening algorithm that maximizes your app’s image quality on Quest’s high-resolution displays. Built on Snapdragon Game Super Resolution with Meta Quest-specific performance optimizations developed in collaboration with the Qualcomm Graphics Team, Super Resolution can deliver a higher fidelity experience for your audience. Learn how to enable Super Resolution and find best practices by reading our blog post.
Dynamic Resolution is an easy-to-integrate graphics optimization feature that automatically adjusts the resolution during heavy GPU work and increases image quality when possible. Dynamic Resolution can help you deliver consistent, high-quality visuals for your audience by maintaining your app’s frame rate while rendering at an optimal resolution. You can also use it to achieve optimizations like Dynamic Foveated Rendering (both Fixed Foveated Rendering (FFR) and Eye Tracked Foveated Rendering (ETFR)), where the runtime will try to increase the foveated rendering level first before starting to scale the resolution down if the GPU utilization gets too high. Find best practices and get started by visiting the documentation (Unity | Unreal).
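The control loop behind dynamic resolution can be sketched in a few lines. This is a minimal illustration of the idea, not Meta's actual algorithm: when GPU utilization runs high, step the render scale down to protect frame rate; when there's headroom, step it back up. The thresholds, step size, and bounds below are assumed values.

```typescript
// Minimal dynamic-resolution controller (illustrative, assumed tuning).
function nextRenderScale(
  gpuUtilization: number, // 0..1, measured over recent frames
  currentScale: number,   // resolution multiplier; 1.0 = full resolution
  step = 0.05,
  minScale = 0.7,
  maxScale = 1.0
): number {
  if (gpuUtilization > 0.9) return Math.max(minScale, currentScale - step);
  if (gpuUtilization < 0.7) return Math.min(maxScale, currentScale + step);
  return currentScale; // inside the comfort band: hold steady
}
```

The real runtime layers foveation on top of this, ramping up the foveated rendering level before it starts trading away resolution.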
Grow Your Business with MR and New Best Practices
Our new MR Developer Success Spotlight featuring Thomas Van Bouwell (Cubism) and Nanea Reeves (TRIPP) shows just how powerful MR capabilities like Passthrough can be for your audience—and your business. Incorporating Passthrough helped these teams unlock new gameplay and drive retention, like TRIPP boosting its Daily Active Users (DAU) seven-day average by 31%.
A “dimmer switch” feature in TRIPP lets people decide how immersed they want to be.
And hot off the presses, we just released a VR/MR Capture Best Practices Playbook to help you understand how to capture polished video recordings for any category of app—giving you the tools to produce impactful assets that evoke excitement and drive conversions on the Meta Quest Store and App Lab. Download the playbook here.
Tools to Build Faster and Smarter
Whether you’re iterating on your next prototype or integrating new features that users will love, efficiency is important throughout the entire development process. New tools are helping developers accelerate development and eliminate time-consuming steps like donning and doffing their headsets.
Immersive Web Emulator lets you easily test and iterate your WebXR experiences without a physical XR device. In addition to being able to simulate XR devices and features for immersive VR experiences, the latest version, v1.4, also supports all Presence Platform features on the web. These features include plane-detection and mesh-detection with semantic label support, hit-test and Spatial Anchors, and Hand Tracking. Find install links and learn more about how to improve your WebXR workflow by visiting the documentation.
Immersive Web Emulator v1.4
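The emulator simulates sessions requested through the standard WebXR API, so the capabilities above map onto standard feature strings. As a sketch, an options object like this is what you would pass to `navigator.xr.requestSession('immersive-ar', options)` in the browser; the choice of required vs. optional features here is just an example.

```typescript
// Standard WebXR feature strings for the capabilities listed above.
// In a browser you would pass this object to:
//   navigator.xr.requestSession('immersive-ar', sessionOptions)
const sessionOptions = {
  requiredFeatures: ['local-floor'],
  optionalFeatures: [
    'plane-detection', // detected wall/floor/table planes
    'mesh-detection',  // room mesh
    'hit-test',        // ray casts against the real world
    'anchors',         // spatial anchors
    'hand-tracking',   // articulated hand joints
  ],
};
```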
We’ve also launched Reality Accelerator Toolkit, a WebXR utility library designed to make integrating MR features in web apps a breeze. The toolkit is compatible with the three.js 3D library and bridges the gap between low-level WebXR APIs and three.js’s higher-level APIs. For more information on Reality Accelerator Toolkit, click here.
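To give a sense of the glue such a toolkit handles: WebXR exposes a detected plane's boundary as a polygon (a loop of points in the plane's local space), while three.js wants indexed triangle geometry. A convex polygon can be triangulated as a simple triangle fan, as sketched below; this is an illustration of the kind of conversion involved, not the toolkit's actual code, and real implementations handle many more cases.

```typescript
// Triangulate a convex polygon of `vertexCount` boundary points as a
// triangle fan around vertex 0, producing index triples suitable for
// an indexed three.js BufferGeometry. Illustrative sketch only.
function triangleFanIndices(vertexCount: number): number[] {
  const indices: number[] = [];
  for (let i = 1; i < vertexCount - 1; i++) {
    indices.push(0, i, i + 1); // one triangle per fan segment
  }
  return indices;
}
// e.g. a quad (4 boundary points) becomes two triangles: [0,1,2, 0,2,3]
```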
Discover New Monetization Options on the Meta Quest Store
We’ve added several new monetization features to help you build an audience before and after launch and drive visibility on the Meta Quest Store. Self-Serve Pre-Order and Coming Soon Listings make it easier for people to find and lock in a purchase of your app before it launches, while Promotions and Bundles give you flexibility to create your own app discounts and bundle apps or content together. If you’re interested in opening up additional streams of revenue with Subscriptions, the new Subscribe from PDP feature lets people purchase your subscription directly from your app’s product detail page on web, mobile, and VR surfaces.
If you haven’t tried using the A/B Testing Tool to compare the performance of different assets on your app’s product detail page, check out our recent case study to learn how the developers of SUPERHOT VR, Rec Room, Crisis Brigade 2 Reloaded, and Shores of Loci are leveraging self-service A/B testing at different stages of their apps’ lifecycles to drive sales on the Meta Quest Store.
Register for Meta Connect
This is just a start to the many topics we’ll dive into at Connect. Register today and stay up to date with the latest developer news by following us on Facebook and Twitter/X.