Now Available: Guide to Asset Streaming in Open World Games
Oculus VR
VR offers some of the most immersive and expansive entertainment experiences available, but building large open worlds comes with a tricky problem: fitting the world into memory at runtime.
Loading your entire world on hardware-constrained devices quickly pushes you to the limit of the system's memory. Fortunately, players only interact with one section of a world at a time. This lets you save memory via asset streaming: rendering the area the player is currently in at high fidelity while loading areas farther away at lower resolution.
You can also choose not to load distant areas at all until they come closer to the player. As a player moves around your world, you can load and unload areas based on the potential for player interaction.
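The proximity-based approach above can be sketched in a few lines. This is a minimal illustration, not code from the Oculus sample project: all names (`Zone`, the radius constants) and thresholds are assumptions chosen for clarity.

```python
import math

# Zones within FULL_DETAIL_RADIUS of the player keep full-resolution assets;
# zones within LOW_DETAIL_RADIUS load a low-detail proxy; everything farther
# away stays unloaded. Values are illustrative.
FULL_DETAIL_RADIUS = 50.0
LOW_DETAIL_RADIUS = 150.0

class Zone:
    def __init__(self, name, center):
        self.name = name
        self.center = center        # (x, z) position of the zone's center
        self.state = "unloaded"     # "unloaded", "low_detail", or "full_detail"

    def update(self, player_pos):
        """Pick the zone's streaming state from its distance to the player."""
        dist = math.dist(self.center, player_pos)
        if dist <= FULL_DETAIL_RADIUS:
            self.state = "full_detail"
        elif dist <= LOW_DETAIL_RADIUS:
            self.state = "low_detail"
        else:
            self.state = "unloaded"

zones = [Zone("town", (0, 0)), Zone("forest", (100, 0)), Zone("castle", (400, 0))]
for z in zones:
    z.update(player_pos=(10, 0))
# town is ~10 units away -> full_detail; forest ~90 -> low_detail;
# castle ~390 -> unloaded
```

In a real engine the state change would kick off asynchronous loads and unloads rather than flip a string, but the distance test driving those decisions looks much the same.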
Optimizing memory usage is important to both the visual and audio quality of your app, and hitches can negatively impact the player experience. To help you solve your memory problems, you can leverage documentation and an example project for Unity from the Oculus Studios title Dead & Buried 2. This example project describes the process of evaluating assets, profiling runtime performance, generating assets at different levels of detail (LODs), creating a system to load and unload LODs based on player position, and writing tools to confirm that the asset management system is working.
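One common way to keep a load/unload system like the one described above from causing hitches is hysteresis: using a farther unload threshold than the load threshold so a player lingering near a boundary doesn't trigger loads every frame. The sketch below is a generic illustration of that technique, not the Dead & Buried 2 implementation; the class and threshold values are assumptions.

```python
LOAD_DISTANCE = 100.0    # start loading the high-detail LOD inside this range
UNLOAD_DISTANCE = 130.0  # only unload once the player moves beyond this range

class LodSlot:
    """Tracks whether one zone's high-detail LOD is resident in memory."""

    def __init__(self):
        self.high_detail_loaded = False

    def update(self, distance_to_player):
        # The gap between the two thresholds prevents thrashing: a player
        # hovering around 100 units away does not flip the state every frame.
        if distance_to_player <= LOAD_DISTANCE:
            self.high_detail_loaded = True
        elif distance_to_player >= UNLOAD_DISTANCE:
            self.high_detail_loaded = False

slot = LodSlot()
slot.update(90)    # inside load range -> high detail loads
slot.update(115)   # between thresholds -> stays loaded, no thrash
slot.update(140)   # past unload range -> unloads
```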
This in-depth project example also provides steps you can repeat to optimize runtime performance throughout all areas of your world, including sub-levels. That said, no two worlds are created equal, which is why we also provide suggestions on other ways to further improve runtime performance.
While this example project focuses on loading level geometry, the core principles apply equally to components such as audio, animation, and meshes. Your world is intended to be experienced at full fidelity, and asset streaming is crucial for scaling your current and future apps for an optimal player experience.