Develop

HiFi Scene sample overview

Updated: May 7, 2026

Overview

This sample demonstrates how to access and visualize MRUK’s high-fidelity room mesh, which provides detailed 3D geometry with per-face semantic labels. Unlike standard Scene API data that represents rooms as bounding boxes and planes, the high-fidelity mesh exposes complete vertex and triangle data where each face carries a semantic label (floor, ceiling, walls, doors, and windows).

What you will learn

  • Subscribe to the SceneLoadedEvent to access room mesh data when it becomes available
  • Retrieve and process RoomMeshData from the current MRUK room, including vertices and indexed triangle faces
  • Create Unity meshes with multiple submeshes to enable per-face material assignment based on semantic labels
  • Apply color-coded materials to visualize different semantic surface types (floors, ceilings, walls, door frames, and window frames)
  • Avoid rendering artifacts when overlaying meshes on existing room geometry

Requirements

Device: Meta Quest headset with Space Setup completed
Development environment: Unity 2022.3 or later with the MRUK SDK installed
For detailed SDK installation and project configuration, see the MRUK setup guide.

Get started

The sample is available in the Unity MR Utility Kit Sample repository. To run it, open the project in Unity, navigate to the HiFi Scene folder, open HiFiScene.unity, and build to your Quest headset. For detailed build instructions, platform requirements, and troubleshooting, see the sample’s README.

Explore the sample

The sample consists of a single scene with minimal custom code, demonstrating the core workflow for high-fidelity mesh retrieval and rendering.
| File / Scene | What it demonstrates | Key concepts |
| --- | --- | --- |
| HiFiScene.unity | Scene configuration with the MRUK prefab configured for high-fidelity mesh loading from JSON | MRUK DataSource configuration, EnableHighFidelityScene flag, RoomMesh GameObject structure |
| Scripts/RoomMesh.cs | Complete workflow for retrieving, processing, and rendering the room mesh with semantic-based materials | Event subscription, mesh construction, submesh-per-face architecture, semantic label mapping |
| Rooms/InnerWalls.json, MultiCeiling.json, MultiFloor.json, SlantedCeiling.json | Four test room configurations with varied geometry (inner walls, multiple floor/ceiling planes, and slanted surfaces) | JSON scene data format with roomMeshMETA section containing base64 vertex data and indexed faces |

Runtime behavior

When you run the scene, MRUK loads room data from one of the configured JSON files with high-fidelity mesh enabled. The RoomMesh script subscribes to the scene load event, retrieves the mesh data, and constructs a Unity mesh where each face becomes a separate submesh. The renderer applies color-coded materials based on semantic labels: floors render in green, ceilings in light gray, standard walls in blue, inner walls in darker blue, invisible walls in purple, door frames in brown, and window frames in light blue. To prevent z-fighting with other room geometry, the mesh is scaled slightly (1.001x) outward from its centroid.

Key concepts

Event-driven mesh access

The sample uses MRUK’s SceneLoadedEvent to retrieve room data asynchronously. RoomMesh.cs subscribes in Start() and accesses MRUK.Instance.GetCurrentRoom().RoomMeshData when the event fires. The RoomMeshData property is nullable; the sample checks for null before accessing .Value to retrieve the vertex and face collections.
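The subscription and null check described above can be sketched as follows. This is a minimal outline, not the sample's exact code; the listener registration call and the `RoomMeshData` member names are assumed from the description above and may differ slightly in your MRUK version.

```csharp
using Meta.XR.MRUtilityKit;
using UnityEngine;

public class RoomMeshLoader : MonoBehaviour
{
    void Start()
    {
        // Register for the scene load event; MRUK invokes the handler
        // once room data has finished loading.
        MRUK.Instance.SceneLoadedEvent.AddListener(OnSceneLoaded);
    }

    void OnSceneLoaded()
    {
        // RoomMeshData is nullable: it is only populated when the
        // high-fidelity scene mesh is available for the current room.
        var roomMeshData = MRUK.Instance.GetCurrentRoom().RoomMeshData;
        if (roomMeshData == null)
        {
            return; // No high-fidelity mesh for this room.
        }

        var roomMesh = roomMeshData.Value;
        // roomMesh now exposes the vertex and face collections.
    }
}
```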

Submesh-per-face architecture

Each semantic face in the room mesh becomes a separate Unity submesh, enabling independent material assignment. The sample sets mesh.subMeshCount = roomMesh.Faces.Count, then iterates through each face to populate its triangle indices via mesh.SetTriangles(face.Indices, submeshIndex). This architecture allows the renderer to apply distinct materials based on face.SemanticLabel while keeping a single shared vertex buffer across all faces. See the CreateMeshFromRoomMeshData method in RoomMesh.cs for the full implementation.
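The submesh-per-face construction can be sketched like this; it assumes the room mesh exposes a `Vertices` list and a `Faces` collection where each face carries an `Indices` array, as described above.

```csharp
using UnityEngine;

// Sketch: build a Unity mesh with one submesh per semantic face,
// sharing a single vertex buffer across all faces.
Mesh CreateMeshFromRoomMesh(System.Collections.Generic.List<Vector3> vertices,
                            System.Collections.Generic.List<int[]> faceIndices)
{
    var mesh = new Mesh();
    mesh.SetVertices(vertices);              // shared vertex buffer
    mesh.subMeshCount = faceIndices.Count;   // one submesh per face

    for (int i = 0; i < faceIndices.Count; i++)
    {
        // Each face's triangle indices go into its own submesh,
        // so the renderer can assign it a distinct material.
        mesh.SetTriangles(faceIndices[i], i);
    }

    mesh.RecalculateNormals();
    return mesh;
}
```

Because all submeshes index into the same vertex array, duplicating vertex data per face is avoided; only the index buffers are split.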

Semantic label visualization

The sample maps each MRUKAnchor.SceneLabels enum value to a specific color and creates a material using the Universal Render Pipeline’s Lit shader. The SceneLoadedEvent handler builds an array of materials matching the face order, where floor faces receive green (0.2, 0.6, 0.2), wall faces receive blue (0.6, 0.6, 0.8), and so on. This color coding reveals the structure of the semantic mesh, showing how MRUK distinguishes between outer walls, inner partition walls, and architectural features like door and window frames. See the switch statement in RoomMesh.cs for the complete color mapping.
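A condensed sketch of that mapping is shown below. Only the floor and wall colors are taken from the text above; the remaining cases and the exact enum member names are assumptions, so check RoomMesh.cs for the authoritative mapping.

```csharp
using Meta.XR.MRUtilityKit;
using UnityEngine;

// Sketch: create a URP/Lit material colored by semantic label.
Material MaterialForLabel(MRUKAnchor.SceneLabels label)
{
    var material = new Material(Shader.Find("Universal Render Pipeline/Lit"));
    switch (label)
    {
        case MRUKAnchor.SceneLabels.FLOOR:
            material.color = new Color(0.2f, 0.6f, 0.2f); // green
            break;
        case MRUKAnchor.SceneLabels.WALL_FACE:
            material.color = new Color(0.6f, 0.6f, 0.8f); // blue
            break;
        default:
            material.color = Color.gray; // fallback for other labels
            break;
    }
    return material;
}
```

The renderer then receives one material per face, in face order, so the materials array lines up with the submeshes created earlier.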

Z-fighting prevention

The sample applies a 1.001x scale to all vertices from their computed centroid before creating the Unity mesh. This slight offset prevents z-fighting when the room mesh renders alongside other MRUK-generated geometry (such as planes or anchor visualizations) that occupy the same spatial positions. The centroid calculation and scaling logic appear in the CreateMeshFromRoomMeshData method in RoomMesh.cs.
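The centroid computation and scaling can be sketched as follows; this is an illustration of the technique, not the sample's verbatim code.

```csharp
using UnityEngine;

// Sketch: inflate the mesh slightly from its centroid to avoid
// z-fighting with coincident room geometry.
void ScaleFromCentroid(System.Collections.Generic.List<Vector3> vertices)
{
    // Compute the average position of all vertices.
    Vector3 centroid = Vector3.zero;
    foreach (var v in vertices)
    {
        centroid += v;
    }
    centroid /= vertices.Count;

    // Push every vertex outward from the centroid by a small factor.
    const float scale = 1.001f;
    for (int i = 0; i < vertices.Count; i++)
    {
        vertices[i] = centroid + (vertices[i] - centroid) * scale;
    }
}
```

Scaling from the centroid (rather than the origin) keeps the offset uniform around the room regardless of where the room sits in world space.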

Extend the sample

  • Implement custom shaders: Replace the color-coded URP/Lit materials with custom shaders that apply textures, patterns, or effects based on semantic labels. For example, apply a wood texture to door frames or tile patterns to floors.
  • Integrate physics colliders: Add a MeshCollider component to the generated mesh to enable physics interactions with the room geometry. Use the semantic labels to configure per-face physics materials (e.g., bouncy floors, sticky walls).
  • Combine with prefab spawning: Use the semantic labels to drive the Virtual Home sample’s prefab spawner, placing furniture prefabs only on floor faces or artwork only on outer wall faces.