MIXED REALITY WITH SCENE MESH AND SCENE API
When designing First Encounters, the team knew it needed an MR app that 1) demonstrated the types of experiences MR can unlock on Meta Quest 3 and 2) delivered gameplay that felt physical and grounded, with realistic interactions. To achieve this vision, they relied on extensive use of Scene API, which helped the app understand a player's space and integrate virtual elements naturally into the real world.
First Encounters is the first MR experience many Meta Quest users ever engage with. Understanding the importance of first impressions, the team aimed to fully utilize players’ physical space to authentically demonstrate how MR can seamlessly blend environments to create interactive and enjoyable experiences. They decided to create a game in Unity centered around a set piece and core mechanic in which puffians burst through players’ walls to reveal a new, virtual world. The goal was for players to shoot and capture puffians throughout their physical environment before their walls crack and burst completely. In the mayhem, puffians would showcase scene understanding capabilities that enable them to float and bounce around physical objects to escape capture—as if they are actually in the same space as the player.
To deliver functionality that understands players' space and creates virtual holes in physical walls at a low processing cost, the team deployed a combination of Scene Anchors, Scene Mesh, Passthrough, and a custom DestructibleMesh system.
The use of Scene Anchors enabled primitives such as walls, floors, and ceilings to be labeled and represented as simple planes. Querying these primitives provided the system with a basic breakdown of players' space. Leveraging Scene Mesh expanded this breakdown with a closer approximation of the physical space by providing scanned triangle meshes of a player's room.
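As a rough illustration, a minimal sketch of this querying step is shown below, assuming the Meta XR Core SDK's OVRSceneAnchor, OVRSemanticClassification, and OVRScenePlane components (component names and labels may differ between SDK versions, and this is not the First Encounters code itself):

```csharp
using System.Collections.Generic;
using UnityEngine;

// Sketch: enumerate Scene Anchors already loaded into the scene (e.g., by
// OVRSceneManager) and bucket their plane representations by semantic label
// such as "WALL_FACE", "FLOOR", or "CEILING" to build a basic breakdown of
// the player's room. Exact component APIs may vary across SDK versions.
public class RoomLayoutCollector : MonoBehaviour
{
    public Dictionary<string, List<OVRScenePlane>> PlanesByLabel =
        new Dictionary<string, List<OVRScenePlane>>();

    void Start()
    {
        foreach (var anchor in FindObjectsOfType<OVRSceneAnchor>())
        {
            if (!anchor.TryGetComponent(out OVRSemanticClassification classification) ||
                !anchor.TryGetComponent(out OVRScenePlane plane))
                continue;

            foreach (var label in classification.Labels)
            {
                if (!PlanesByLabel.TryGetValue(label, out var list))
                    PlanesByLabel[label] = list = new List<OVRScenePlane>();
                list.Add(plane);
            }
        }
    }
}
```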
The next step was deploying a solution that enabled players to see and navigate their physical space while interacting with virtual objects. By applying a Passthrough shader to the Scene Mesh's material, the team let players see their room as normal until puffians start punching holes in the mesh.
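One possible way to wire this up is sketched below: the Scene Mesh renderers are given a passthrough "punch-through" style material so the real room shows through wherever the mesh is intact. The material reference is hypothetical; in practice it would point at a shader that reveals a Passthrough layer rendered underneath the virtual content.

```csharp
using UnityEngine;

// Sketch: once the Scene Mesh object has been instantiated, swap its material
// for a passthrough punch-through material so the player sees their real room
// through the mesh. The material asset here is a hypothetical placeholder.
public class SceneMeshPassthrough : MonoBehaviour
{
    [SerializeField] Material passthroughPunchThrough; // hypothetical material asset

    public void ApplyTo(GameObject sceneMeshRoot)
    {
        foreach (var renderer in sceneMeshRoot.GetComponentsInChildren<MeshRenderer>())
        {
            renderer.sharedMaterial = passthroughPunchThrough;
        }
    }
}
```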
After evaluating existing solutions built into Unity that would allow virtual holes to be punched in walls and ceilings, the team opted for a custom solution that moved the processing cost to the start of the game and increased visual quality. The result was the DestructibleMesh system, which uses Voronoi partitioning to sort the Scene Mesh triangles into submesh chunks that can be given their own GameObjects within Unity and individually disabled as they are destroyed. The DestructibleMesh system was written to use the Unity Jobs System for parallelization and the Burst compiler for fast native performance and SIMD vector instructions.
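The sketch below shows one way such a partitioning pass could look: each triangle is assigned to the nearest of a set of seed points (its Voronoi cell) inside a Burst-compiled parallel job, and the resulting groups can then be built into per-chunk GameObjects. The data layout and job structure here are assumptions for illustration, not the actual DestructibleMesh implementation.

```csharp
using Unity.Burst;
using Unity.Collections;
using Unity.Jobs;
using Unity.Mathematics;

// Condensed sketch of Voronoi-style triangle partitioning: each triangle is
// assigned to the nearest of N seed points, producing per-chunk triangle sets
// that can later become separate sub-mesh GameObjects. Runs as a Burst-compiled
// parallel job; the exact data layout is an assumption.
[BurstCompile]
struct AssignTrianglesToChunksJob : IJobParallelFor
{
    [ReadOnly] public NativeArray<float3> triangleCentroids; // one per Scene Mesh triangle
    [ReadOnly] public NativeArray<float3> chunkSeeds;        // Voronoi seed points
    public NativeArray<int> chunkIndexPerTriangle;           // output: chunk assignment

    public void Execute(int tri)
    {
        float3 centroid = triangleCentroids[tri];
        int best = 0;
        float bestDistSq = float.MaxValue;

        // Nearest-seed search defines the Voronoi cell this triangle belongs to.
        for (int s = 0; s < chunkSeeds.Length; s++)
        {
            float d = math.distancesq(centroid, chunkSeeds[s]);
            if (d < bestDistSq)
            {
                bestDistSq = d;
                best = s;
            }
        }
        chunkIndexPerTriangle[tri] = best;
    }
}

// Usage sketch:
// var job = new AssignTrianglesToChunksJob { /* fill NativeArrays */ };
// job.Schedule(triangleCentroids.Length, 64).Complete();
// Then group triangles by chunk index and build one GameObject per chunk,
// each of which can be disabled independently when destroyed.
```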
By expanding its use of Scene Anchors and Scene Mesh with the DestructibleMesh system, the team achieved seamless destruction mechanics that overcame processing cost concerns, limitations in the texture data available from Scene Mesh, and visual constraints that prevented objects from being deformed in Passthrough. The result is a fun, action-packed experience that not only opens players' walls but opens up their imaginations to the many possibilities of MR.