MR Utility Kit sample

Updated: May 7, 2026

Overview

This sample demonstrates the core features of the Mixed Reality Utility Kit (MRUK) plugin for Unreal Engine through nine self-contained scenes covering scene understanding, object placement, and mixed reality rendering. If you are an Unreal Engine developer building mixed reality applications for Meta Quest, these scenes provide working examples of every major MRUK API.

Learning objectives

Complete this guide to learn how to:
  • access real-world room geometry (walls, floors, ceilings, furniture) from the Scene Model using MRUK APIs
  • spawn and position virtual objects on physical surfaces using anchor-based placement
  • generate procedural meshes, navigation meshes, and distance maps from MRUK scene data
  • integrate MRUK with Unreal Engine’s Procedural Content Generation (PCG) framework for scene decoration
  • use specialized MR features like passthrough room lighting, guardian customization, and environment raycasting

Requirements

  • Meta Quest 2, Quest 3, or Quest 3S running Horizon OS
  • Unreal Engine 5.3 or later with the OculusXR plugin installed
For setup instructions, see the Meta Quest Developer Hub documentation.

Get started

Clone the repository from GitHub, open the project in Unreal Engine, and build for Android. The default startup map is Core.umap, which provides a visual overview of MRUK anchors and raycasting. For detailed build and configuration steps, see the project README.

Explore the sample

Each scene is a separate map in the project:
  • Core.umap: core MRUK anchor visualization, raycasting, and API exploration. Key concepts: UMRUKSubsystem, AMRUKRoom, semantic labels, raycasting.
  • DestructibleMesh.umap: destructible mesh interaction with room geometry. Key concepts: room mesh as collision geometry, physics interaction.
  • DistanceMap.umap: distance map generation and visualization. Key concepts: MRUK distance map API, proximity-based effects.
  • EnvironmentRaycast.umap: raycasting against real-world scene geometry. Key concepts: MRUK environment raycasting, hit results.
  • Guardian.umap: guardian boundary visualization and customization. Key concepts: MRUKGuardianSpawner, boundary visibility.
  • Navmesh.umap: navigation mesh generation with AI character pathfinding. Key concepts: dynamic NavMesh from MRUK room geometry.
  • PTRL.umap: passthrough room lighting. Key concepts: passthrough rendering with dynamic lighting.
  • RandomPlacement.umap: random object placement on MRUK surfaces. Key concepts: anchor-based spawning, surface detection.
  • SceneDecoration.umap: PCG-based procedural scene decoration using MRUK scene data. Key concepts: PCG framework integration, custom PCG nodes.

Runtime behavior

When you run the Core scene on a Meta Quest device, it displays your real-world room geometry with color-coded semantic labels: green for floors, blue for walls, white for ceilings, purple for invisible wall faces, light blue for window frames, and brown for door frames. You interact with anchors using motion controller raycasting.
Other scenes add interactive elements: destructible meshes that respond to physics, AI characters that navigate using dynamically generated navmeshes, and procedurally spawned decorations that adapt to your room layout.

Key concepts

Subsystem access

The MRUK subsystem is the entry point for all MRUK functionality:
const auto Subsystem = GetGameInstance()->GetSubsystem<UMRUKSubsystem>();

Async room loading

The sample handles asynchronous room creation by checking the subsystem’s load status and registering a delegate. If the scene is already loaded, the sample processes it immediately; otherwise, it waits for the OnRoomCreated event.
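The pattern described above can be sketched as follows. This is a minimal illustration, not the sample's exact code: the SceneLoadStatus property, EMRUKInitStatus enum, and OnRoomCreated delegate follow the naming used in this document, but exact signatures may differ between MRUK releases, and HandleRoomCreated is a hypothetical handler.

```cpp
// Sketch of the async room-loading pattern (member names may vary by
// MRUK version; HandleRoomCreated is a hypothetical UFUNCTION handler).
void AMyActor::BeginPlay()
{
    Super::BeginPlay();

    UMRUKSubsystem* Subsystem = GetGameInstance()->GetSubsystem<UMRUKSubsystem>();
    if (Subsystem->SceneLoadStatus == EMRUKInitStatus::Complete)
    {
        // Scene already loaded: process the room immediately.
        HandleRoomCreated(Subsystem->GetCurrentRoom());
    }
    else
    {
        // Otherwise, register for the OnRoomCreated event and wait.
        Subsystem->OnRoomCreated.AddDynamic(this, &AMyActor::HandleRoomCreated);
    }
}
```

Registering the delegate only when the scene is not yet loaded avoids processing the same room twice.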

Room mesh generation

The RoomMesh class demonstrates how to extract geometric data from MRUK’s scene representation. The sample iterates through FMRUKRoomFace structures, each containing vertex indices and a semantic classification, to build a procedural mesh with colors based on semantic labels like FMRUKLabels::Floor, FMRUKLabels::WallFace, and FMRUKLabels::Ceiling.
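A condensed sketch of this approach is shown below, using Unreal's standard UProceduralMeshComponent. The FMRUKRoomFace member names (VertexIndices, Label) are hypothetical stand-ins for whatever the plugin version exposes; the color scheme mirrors the sample's semantic labeling.

```cpp
// Sketch: build a vertex-colored procedural mesh from MRUK room faces.
// Face member names below are assumptions, not confirmed MRUK API.
void ARoomMeshActor::BuildRoomMesh(const TArray<FVector>& Vertices,
                                   const TArray<FMRUKRoomFace>& Faces)
{
    TArray<int32> Triangles;
    TArray<FColor> VertexColors;
    VertexColors.Init(FColor::White, Vertices.Num());

    for (const FMRUKRoomFace& Face : Faces)
    {
        Triangles.Append(Face.VertexIndices); // hypothetical member

        // Color vertices by semantic label, following the sample's scheme.
        FColor Color = FColor::White;
        if (Face.Label == FMRUKLabels::Floor)
        {
            Color = FColor::Green;
        }
        else if (Face.Label == FMRUKLabels::WallFace)
        {
            Color = FColor::Blue;
        }
        for (const int32 Index : Face.VertexIndices)
        {
            VertexColors[Index] = Color;
        }
    }

    // Create one mesh section with per-vertex colors and collision enabled.
    ProcMesh->CreateMeshSection(0, Vertices, Triangles,
        TArray<FVector>(), TArray<FVector2D>(), VertexColors,
        TArray<FProcMeshTangent>(), /*bCreateCollision=*/true);
}
```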

PCG integration for scene decoration

The SceneDecoration sample feeds MRUK scene data into Unreal’s Procedural Content Generation framework using a custom PCG node (BP_SceneRayAnchor) that encodes raycast parameters into PCG point attributes. The sample combines this with distance map data to filter spawn points by proximity to walls.
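The proximity-filtering step can be illustrated engine-agnostically. The sketch below assumes each candidate spawn point has already been annotated with its sampled distance to the nearest wall; in the sample itself these values come from the MRUK distance map API inside the PCG graph, and the struct and function names here are hypothetical.

```cpp
#include <vector>

// Hypothetical candidate spawn point, annotated with a value sampled
// from the distance map (distance to the nearest wall, in meters).
struct SpawnCandidate
{
    float X = 0.f;
    float Y = 0.f;
    float WallDistance = 0.f;
};

// Keep only candidates at least MinWallDistance away from any wall,
// mirroring the proximity filter the SceneDecoration graph applies.
std::vector<SpawnCandidate> FilterByWallDistance(
    const std::vector<SpawnCandidate>& Candidates, float MinWallDistance)
{
    std::vector<SpawnCandidate> Result;
    for (const SpawnCandidate& C : Candidates)
    {
        if (C.WallDistance >= MinWallDistance)
        {
            Result.push_back(C);
        }
    }
    return Result;
}
```

The same predicate could equally run as a point filter inside a custom PCG node, operating on the attribute that BP_SceneRayAnchor writes.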

Extend the sample

  • Modify the semantic label colors in RoomMesh.cpp to create different visual themes.
  • Add new positioning methods to the RandomPlacement sample to explore surface placement algorithms.
  • For more advanced PCG-based scene decoration, see the Unreal PCG documentation.