2022 Developer Resource Roundup: Guides, Demos, Samples, & More
When passion fuels your work, sometimes time seems to fly by—and 2022 flew by faster than a speedrun of Resident Evil 4. Before we turn the corner into 2023, we want to point you toward the resources you can use to hit the ground running in the new year. We know that creating a high-quality VR experience is no easy task and requires a tremendous amount of dedication and problem solving—and we want to help you see your hard work come to fruition, whether you’re ideating your next title or preparing for launch.
Throughout the past year, we’ve released a variety of demos, tutorials, case studies, and best practices to elevate how and what you build. But sometimes valuable content can fall through the cracks. That’s why we’ve compiled a roundup of key resources we published in 2022. Dive in to discover ways to optimize your workflow, improve the quality of your app, and reinforce what you already know.
Open world games spark the explorative nature in all of us—but figuring out how to fit your world into memory at runtime is no simple adventure. Our Guide to Asset Streaming for Open World Games demonstrates how you can use asset streaming to save memory—or in other words, only load areas of your game that are in close proximity to the player.
Asset streaming helps you save memory by loading areas far away from the player in lower fidelity.
This guide links to documentation and a Unity sample project that walk through the process of:
Evaluating assets
Profiling runtime performance
Generating assets at different levels of detail (LODs)
Creating a system to load and unload LODs based on player position
Writing tools to confirm that the asset management system is working
Using our sample project, you can learn how to optimize runtime performance throughout all areas of your world, including sub-levels. Perhaps best of all, these principles also apply to other asset types such as audio, animation, and meshes.
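The load/unload logic described above can be summarized in a minimal, engine-agnostic sketch. Note that the class names, distance thresholds, and data shapes here are illustrative only and do not come from the Unity sample project:

```python
# Minimal, engine-agnostic sketch of distance-based LOD streaming.
# Thresholds and names are illustrative, not from Meta's Unity sample.
import math

# (max distance in meters, LOD index): closer assets get higher fidelity (LOD 0)
LOD_DISTANCES = [(25.0, 0), (75.0, 1), (150.0, 2)]

def select_lod(player_pos, asset_pos):
    """Return the LOD index to load, or None to unload the asset entirely."""
    dist = math.dist(player_pos, asset_pos)
    for max_dist, lod in LOD_DISTANCES:
        if dist <= max_dist:
            return lod
    return None  # beyond streaming range: unload

class StreamingManager:
    def __init__(self):
        self.loaded = {}  # asset id -> currently loaded LOD index

    def update(self, player_pos, assets):
        """Reconcile loaded LODs against player position (call periodically)."""
        for asset_id, pos in assets.items():
            wanted = select_lod(player_pos, pos)
            current = self.loaded.get(asset_id)
            if wanted == current:
                continue
            if wanted is None:
                self.loaded.pop(asset_id, None)   # unload far-away asset
            else:
                self.loaded[asset_id] = wanted    # load or swap LOD
```

In a real project, the load/swap step would kick off asynchronous asset loads (for example, via Unity's Addressables system) rather than mutating a dictionary, and the update would typically run on a timer rather than every frame.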
Knowing how much memory your app uses (and where it’s being allocated) can help you ensure your app stays within the constraints of Meta Quest device memory. Check out this blog to discover exactly how our system allocates memory and how you can accurately measure how much memory your app is using.
This chart shows how the GPU may round up memory to be slightly larger than what the texture actually requires.
With tips for reducing memory, instructions for querying memory usage from the GPU, and tables to help you visualize system data usage, this post has something for everyone—even the most seasoned developers.
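The rounding behavior shown in the chart can be illustrated with a small sketch. The 4 KB page size below is an assumption for illustration; the actual alignment depends on the GPU and driver:

```python
# Sketch of why a texture's allocated size can exceed its raw size.
# The 4096-byte page size is illustrative; real alignment is driver-specific.

def round_up(value, alignment):
    """Round value up to the next multiple of alignment."""
    return (value + alignment - 1) // alignment * alignment

def estimated_texture_bytes(width, height, bytes_per_pixel, page_size=4096):
    """Return (raw size, page-aligned size the driver might actually allocate)."""
    raw = width * height * bytes_per_pixel
    return raw, round_up(raw, page_size)

# A 1000x1000 RGBA8 texture needs 4,000,000 bytes, but a page-aligned
# allocator would reserve slightly more.
raw, allocated = estimated_texture_bytes(1000, 1000, 4)
```

This is why summing the raw sizes of your textures tends to underestimate the memory the system actually reserves, and why querying the GPU directly gives a more accurate picture.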
Application SpaceWarp (AppSW) could be colloquially described as developer magic. This feature allows an app to render at half the display refresh rate (for example, 36 FPS instead of 72 FPS) while the system synthesizes the in-between frames, so the display still updates at the full refresh rate. That frees up substantially more CPU and GPU time each frame. To show you a real-world application of AppSW, we converted Showdown, a 2014 PC VR demo, to run on Meta Quest 2 with the goal of matching the fidelity of the PC version.
With the help of AppSW, the project achieved a consistent 90 FPS. During initial testing, AppSW gave apps up to 70% additional compute, with potentially little to no perceptible artifacts. Since then, Showdown has been updated on both GitHub and App Lab to support eye tracked foveated rendering (ETFR), which reduces the number of pixels rendered and leads to greater GPU savings.
In Showdown, AppSW allowed us the headroom needed to handle more effects from the PC version while still hitting 90 FPS.
In this two-part blog series, we explain how AppSW improved the performance of Showdown and describe optimizations you can make to improve the performance of CPU- or GPU-heavy apps.
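The arithmetic behind that extra headroom is simple frame-budget math, sketched below for the 72 Hz case described above:

```python
# Frame-budget arithmetic behind AppSW's headroom gain.

def frame_budget_ms(fps):
    """Per-frame time budget in milliseconds at a given render rate."""
    return 1000.0 / fps

full = frame_budget_ms(72)   # ~13.9 ms per frame without AppSW
half = frame_budget_ms(36)   # ~27.8 ms per frame when AppSW halves the render rate
```

Rendering at half rate roughly doubles the per-frame budget for both CPU and GPU, which is where the headroom for heavier effects comes from (minus the cost of the motion vectors AppSW needs to synthesize frames).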
It’s no secret that multiplayer experiences account for many of the most popular apps and games of all time. In VR, multiplayer features can help people feel even more connected and present—and they’re a huge driver for growing a vibrant app community.
This four-part blog and video series breaks down Platform SDK multiplayer features individually by exploring our sample project called SharedSpaces, which is available to download on App Lab and GitHub. Using SharedSpaces as a visual guide, you can see each of these features in action and have a reference point when using the sample to create your own simple multiplayer app. Be sure to check out all four posts to understand how each multiplayer feature works together and find best practices for integration.
Part 1 — Platform SDK and Unity SharedSpaces Sample
Part 2 — Setting Up Your Own Copy of the SharedSpaces Sample in Unity
Part 3 — Making a Simple Multiplayer VR App Using the Unity SharedSpaces Sample
Part 4 — Other Multiplayer SDK Features, Travel Reliability, and Best Practices
Our Interaction SDK is opening up new possibilities to integrate compelling hand- and controller-based interactions into a variety of VR experiences. Our hand tracking demo First Hand gives you a hands-on (pun intended) look at these interactions, so you can learn how they may deepen the level of immersion within your app. First Hand was built using Interaction SDK and is available now on App Lab and downloadable on GitHub.
Even if you’ve already integrated hand tracking into your app, this post contains 10 tips that can help you improve functionality and polish your app’s hand-driven interactions.
Integrating hand tracking into an existing app may appear daunting, but many developers have already done so. Along the way, they’ve discovered solutions to common problems as well as best practices to streamline integration.
Smart design can help people navigate and interact in your app by using their hands.
Cyan Worlds, Inc. Director of Development Hannah Gamiel offered our developer community an in-depth look at how hand tracking was implemented in a recent update to the adventure and puzzle game Myst. If you want to enhance the hand-driven interactions in your app, this post shares design and navigation insights, workarounds, and solutions that can help you overcome common development hurdles.
Interaction SDK lets you create high-quality hand and controller interactions by providing modular and flexible components that implement a large range of interactions. This blog and accompanying video describe how Interaction SDK works and how you can replicate its components in your app using the open source project First Hand. You can also try each interaction for yourself by downloading our Interaction SDK samples on App Lab.
Interactions like Hand Grab, Gesture, Distance Grab, and many others can enhance people’s sense of immersion in your VR app.
Getting started with Interaction SDK is easier than you may think, and following along with us as we walk through interactions like Poke, Hand Grab, Gesture, and more can give you a solid foundation to work from. If you’ve already integrated the Interaction SDK into your project, be sure to check out the best practices included in this post to strengthen your foundation. Key considerations like visual and auditory feedback can help ensure people have a great experience while using these interactions.