MRUK sample overview

Updated: May 8, 2026

Overview

This sample demonstrates how to use the Mixed Reality Utility Kit (MRUK) to build spatial experiences that interact with physical surroundings through four experiences: anchor mesh spawning, keyboard tracking, QR code scanning, and raycasting. It is designed for developers who want to learn MRUK fundamentals including scene loading, procedural mesh generation, object tracking, and ray-based interaction.
Note: The keyboard and QR code tracking features use configureTrackers(), which is currently marked as experimental and may change in future SDK versions.

Learning objectives

Complete this guide to learn how to:
  • Load scene data from device storage or JSON files using MRUKFeature
  • Spawn virtual furniture models at detected anchor positions using AnchorMeshSpawner
  • Generate procedural meshes for floors, walls, and ceilings with AnchorProceduralMesh
  • Track physical keyboards and QR codes using configureTrackers() with the experimental tracker API
  • Perform raycasts against scene planes, global mesh, and environment depth using MRUK raycast APIs

Requirements

Device and development environment:
  • Meta Quest 2, Quest 3, or Quest 3S running Horizon OS 69 or later
  • Android Studio with Spatial SDK configured
For detailed setup instructions, see the sample README.

Get started

Clone or download the Meta Spatial SDK Samples repository from GitHub and open the MrukSample project in Android Studio. Build and deploy the app to your device. The sample launches with a 2D menu presenting the four experiences as buttons; select one to explore the corresponding MRUK capability.
For build prerequisites and troubleshooting, consult the README in the sample directory.

Explore the sample

The sample uses a 2D-launcher-plus-immersive-activities architecture: the start menu is a Jetpack Compose UI that launches four separate immersive experiences.
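For reference, a menu button can launch an immersive experience with a plain intent. This is a minimal sketch; the helper function and wiring are hypothetical, so see MrukSampleStartMenuActivity.kt for the actual implementation:
import android.content.Intent
import androidx.activity.ComponentActivity

// Hypothetical helper: launches the anchor mesh experience from the menu
fun ComponentActivity.launchAnchorMeshExperience() {
    startActivity(Intent(this, MrukAnchorMeshSampleActivity::class.java))
}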
Key files and what each demonstrates:
  • MrukSampleStartMenuActivity.kt (2D Compose launcher menu): standard Android ComponentActivity with a SpatialTheme UI; launches the immersive experiences via intents.
  • anchor_mesh/MrukAnchorMeshSampleActivity.kt (replacing physical objects with virtual furniture): AnchorMeshSpawner maps MRUKLabel values to .glb assets, AnchorProceduralMesh generates textured meshes for surfaces, and MRUKSceneEventListener callbacks handle room and anchor events.
  • keyboard_tracker/KeyboardTrackerSampleActivity.kt (tracking physical keyboards): configureTrackers(setOf(Tracker.Keyboard)) starts tracking, the TrackedKeyboard component is added automatically, and a custom system queries tracked entities and spawns passthrough cutout meshes.
  • qr_code_scanner/QrCodeScannerSampleActivity.kt (scanning QR codes and spawning panels): configureTrackers(setOf(Tracker.QrCode)) starts tracking, TrackedQrCode.getPayloadAsString() retrieves QR data, and WebView panels display content at QR code positions.
  • raycast/RaycastSampleActivity.kt (raycasting against scene and depth): four raycast modes (single, all, global mesh, depth); raycastRoom() and raycastRoomAll() handle scene planes, raycastEnvironment() handles depth data, and arrow entities visualize hit points.
  • common/MrukInputSystem.kt (controller input handling): queries AvatarBody to get the controller entity and reads ButtonBits.ButtonMenu state for UI panel toggling.
  • common/MrukSampleUtils.kt (HMD and controller access): the PlayerBodyAttachmentSystem.tryGetLocalPlayerAvatarBody() pattern for accessing head and hand entities.

Runtime behavior

Anchor Mesh experience: The experience requests the USE_SCENE permission on launch and loads scene data from the device, spawning 3D furniture models at detected anchor positions (tables, couches, lamps, plants). Floor, walls, and ceiling receive procedural meshes with carpet and wall textures. A UI panel accessed via the left controller Menu button provides controls for loading different JSON room layouts, toggling global mesh visualization, launching scene capture, and saving scene data to a file.
Keyboard Tracker experience: The experience creates a skybox environment with passthrough enabled and starts the keyboard tracker after requesting scene permission. When a physical keyboard is detected, the system spawns a slightly oversized transparent box mesh at the keyboard’s location using a passthrough cutout shader, allowing the user to see the physical keyboard through the virtual world. The UI panel shows tracker status and a Start/Stop control.
QR Code Scanner experience: The experience enables passthrough mode and starts the QR code tracker. When a QR code is scanned, the sample spawns a small WebView panel near the code that displays the URL content (or performs a Google search if the payload is not a valid URL). The panel smoothly follows the QR code position, adapts its orientation based on whether the code is on a wall or floor surface, and scales up on hover interaction.
Raycast experience: The experience loads scene data and renders wireframe outlines of all detected geometry using a custom outline shader. Point the right controller to cast rays into the scene; arrow models appear at hit points, showing where each ray intersects a surface. A UI panel allows switching between four raycast modes: scene single-hit, scene all-hits, global mesh, and environment depth. The depth mode uses startEnvironmentRaycaster() to enable real-time depth-based raycasting.
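The USE_SCENE request in the Anchor Mesh experience above follows the standard Android runtime-permission flow. A minimal sketch, assuming the usual Horizon OS permission string (the request code is arbitrary):
import android.app.Activity
import android.content.pm.PackageManager

const val PERMISSION_USE_SCENE = "com.oculus.permission.USE_SCENE"
const val REQUEST_CODE_USE_SCENE = 1

// Requests scene access if it has not been granted yet
fun requestScenePermissionIfNeeded(activity: Activity) {
    if (activity.checkSelfPermission(PERMISSION_USE_SCENE) != PackageManager.PERMISSION_GRANTED) {
        activity.requestPermissions(arrayOf(PERMISSION_USE_SCENE), REQUEST_CODE_USE_SCENE)
    }
}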

Key concepts

Loading scene data

MRUK supports loading scene data from the device or from JSON files. The anchor mesh and raycast experiences demonstrate both approaches.
// Returns CompletableFuture<MRUKLoadDeviceResult>
mrukFeature.loadSceneFromDevice()
For testing without a physical space setup, load from a bundled JSON file:
mrukFeature.loadSceneFromJsonString(jsonText)
Both methods are asynchronous and return a CompletableFuture that completes when loading finishes. The anchor mesh experience includes UI controls for selecting scene models (V1, V2, V2 with fallback) and loading from 12 different pre-configured JSON room layouts.
See MrukAnchorMeshSampleActivity.kt for the complete implementation.
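Because both loaders return a CompletableFuture, completion is handled asynchronously. Here is a hedged sketch of falling back to a bundled layout when the device load does not succeed; the asset path is hypothetical, and the MRUKLoadDeviceResult.SUCCESS check is an assumption:
// Inside an Activity, so that the assets property is available
mrukFeature.loadSceneFromDevice().whenComplete { result, throwable ->
    // Assumption: MRUKLoadDeviceResult exposes a SUCCESS value
    if (throwable != null || result != MRUKLoadDeviceResult.SUCCESS) {
        // Hypothetical asset path; the sample bundles several JSON layouts
        val jsonText = assets.open("rooms/example_room.json")
            .bufferedReader().use { it.readText() }
        mrukFeature.loadSceneFromJsonString(jsonText)
    }
}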

Spawning furniture at anchors

AnchorMeshSpawner replaces detected physical objects with 3D models. Pass a map of MRUKLabel values to .glb file paths:
val spawner = AnchorMeshSpawner(mrukFeature, mutableMapOf(
    MRUKLabel.TABLE to AnchorMeshSpawner.AnchorMeshGroup(listOf("Furniture/Table.glb")),
    MRUKLabel.COUCH to AnchorMeshSpawner.AnchorMeshGroup(listOf("Furniture/Couch.glb"))
))
The spawner automatically instantiates models at anchor positions as scene data loads. For a complete list of supported labels and asset mappings, see MrukAnchorMeshSampleActivity.kt.

Generating procedural meshes

AnchorProceduralMesh creates textured geometry for surfaces like floors, walls, and ceilings:
AnchorProceduralMesh(
    mrukFeature,
    mapOf(
        MRUKLabel.FLOOR to AnchorProceduralMeshConfig(carpetMaterial, true),
        MRUKLabel.WALL_FACE to AnchorProceduralMeshConfig(wallMaterial, true)
    )
)
The anchor mesh experience uses this pattern to render carpeted floors and textured walls, with physics colliders enabled for spatial interaction. See MrukAnchorMeshSampleActivity.kt for material setup and configuration.

Tracking keyboards and QR codes

The experimental tracker API enables detection of physical keyboards and QR codes. Both features require scene permission and use the same configuration pattern:
mrukFeature.configureTrackers(setOf(Tracker.Keyboard))
The method returns a CompletableFuture<MRUKStartTrackerResult>. When a tracked object is detected, the SDK automatically creates an entity with a TrackedKeyboard or TrackedQrCode component. The keyboard tracker experience queries for these entities and spawns custom meshes:
Query.where { has(TrackedKeyboard.id) }.eval()
For QR codes, retrieve the payload data:
qrCodeEntity.getComponent<TrackedQrCode>().getPayloadAsString()
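Putting the pieces together, a hedged sketch of a QR code flow using only the calls shown above (the placement inside a custom system and the result handling are assumptions):
// Start the tracker; the future completes with an MRUKStartTrackerResult
mrukFeature.configureTrackers(setOf(Tracker.QrCode)).whenComplete { result, _ ->
    // Inspect result here if start-up feedback is needed
}

// Later, e.g. inside a custom system's update logic, enumerate tracked codes
val qrEntities = Query.where { has(TrackedQrCode.id) }.eval()
for (entity in qrEntities) {
    val payload = entity.getComponent<TrackedQrCode>().getPayloadAsString()
    // Spawn or update a panel for this payload (see the QR code experience)
}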

Raycasting against scene geometry

MRUK provides three raycast methods: raycastRoom() for single hits, raycastRoomAll() for all hits along a ray, and raycastEnvironment() for depth-based raycasts. The raycast experience demonstrates all three modes plus global mesh raycasting.
val hit = mrukFeature.raycastRoom(
    roomUuid, origin, direction, maxDistance, SurfaceType.PLANE_VOLUME)
For depth raycasts, start the environment raycaster first:
mrukFeature.startEnvironmentRaycaster()
val result = mrukFeature.raycastEnvironment(origin, direction)
The raycast experience uses UpdateRaycastSystem to compute rays from the right controller pose and visualize hit points with arrow entities. See UpdateRaycastSystem.kt for the hit visualization logic.
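As a rough sketch of how the modes map to these calls: RaycastMode and visualizeHits() are hypothetical names, raycastRoomAll() is assumed to share raycastRoom()'s signature, the hit results are assumed nullable, and the global mesh mode is omitted here:
when (mode) {
    // Closest hit against scene planes and volumes
    RaycastMode.SCENE_SINGLE -> {
        val hit = mrukFeature.raycastRoom(
            roomUuid, origin, direction, maxDistance, SurfaceType.PLANE_VOLUME)
        visualizeHits(listOfNotNull(hit))
    }
    // Every hit along the ray
    RaycastMode.SCENE_ALL -> {
        val hits = mrukFeature.raycastRoomAll(
            roomUuid, origin, direction, maxDistance, SurfaceType.PLANE_VOLUME)
        visualizeHits(hits)
    }
    // Real-time depth; requires startEnvironmentRaycaster() beforehand
    RaycastMode.ENVIRONMENT_DEPTH -> {
        val hit = mrukFeature.raycastEnvironment(origin, direction)
        visualizeHits(listOfNotNull(hit))
    }
}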

Extend the sample

  • Combine multiple MRUK features: Integrate furniture spawning with raycasting to allow users to place objects at raycast hit points. Use the anchor mesh experience as a base and add controller-driven placement logic from the raycast experience.
  • Add custom tracker behaviors: Extend the keyboard tracker to spawn interactive UI panels above the keyboard, or enhance the QR code scanner to trigger specific actions based on payload patterns (e.g., launch URLs, spawn prefabs, adjust scene lighting).
  • Visualize scene event callbacks: Implement the MRUKSceneEventListener interface to display real-time notifications when rooms or anchors are added, removed, or updated. Use the anchor mesh experience’s listener implementation as a reference.
For more advanced spatial interaction patterns, explore the Interaction SDK samples.