
Mixed Reality sample overview

Updated: May 7, 2026

Overview

This sample demonstrates how to build a mixed reality (MR) experience where virtual objects interact with your physical room using the Meta Spatial SDK. It shows how to use the Mixed Reality Utility Kit (MRUK) to load scene data, create invisible physics meshes on real-world surfaces, and spawn dynamic objects that collide with your walls, floor, and furniture.

What you will learn

  • Enable passthrough rendering and load scene data from the device using MRUK
  • Create invisible collision meshes on detected room surfaces
  • Spawn physics-enabled objects that bounce off real-world geometry
  • Build custom systems that respond to controller input
  • Position UI panels relative to the user’s headset in a mixed reality environment

Requirements

  • Meta Quest device running Horizon OS with scene model permission enabled
  • Development environment configured for Spatial SDK
For detailed setup instructions, see Getting started. For build prerequisites, see the sample README.

Get started

Clone the Meta Spatial SDK Samples repository from GitHub and open the MixedRealitySample project in Android Studio. Build and deploy the sample to your Meta Quest device.
For detailed build instructions and dependency setup, see the sample README.

Explore the sample

The sample demonstrates a complete mixed reality physics interaction loop through the following core components:

  • MixedRealitySampleActivity.kt: Entry point that configures MRUK, Physics, and passthrough features. Key concepts: feature registration, scene permission handling, procedural mesh spawning, passthrough enablement.
  • BallShooter.kt: Custom system that spawns basketballs on controller trigger press. Key concepts: controller input detection, dynamic entity creation, physics configuration.
  • UiPanelUpdateSystem.kt: Custom system that positions the UI panel in front of the user. Key concepts: PlayerBodyAttachmentSystem integration, HMD-relative positioning.
  • Compose UI panel: Jetpack Compose UI with room setup and debug controls. Key concepts: Compose integration, scene.requestSceneCapture() invocation.
  • Composition/Main.scene: Static scene entities (default floor, basketball mesh template, three target objects, UI panel). Key concepts: Meta Spatial Editor scene composition, static vs. dynamic entity pattern.

Runtime behavior

When you run the sample, you see your real room through passthrough rendering with a floating UI panel positioned 2.5 meters in front of you at 1.1 meters height. Three target objects appear positioned around the room. Pressing either controller trigger shoots a basketball from the controller position. The balls have sphere physics with very high restitution (0.99), so they bounce energetically off invisible collision meshes generated from your room’s walls, floor, ceiling, table surfaces, and other detected geometry. Each ball spins as it moves and auto-destroys after 8 seconds. The “Set up room” button on the UI panel launches the system room capture flow to reload room geometry, and the “Toggle debug” button renders physics collision wireframes so you can see the invisible meshes.

Key concepts

Loading and using scene data

The sample uses MRUKFeature to load your room’s spatial data from the device:
mrukFeature.loadSceneFromDevice().whenComplete { result, ex ->
  // Room data is now available; ex is non-null if loading failed
}
The activity also attaches a MRUKSceneEventListener that destroys the default floor entity when a real room is detected. See MixedRealitySampleActivity.kt for the complete implementation. For more on MRUK scene management, see Scene understanding.

Creating invisible physics collision meshes

The sample creates invisible collision geometry on all detected room surfaces using AnchorProceduralMesh:
val procMeshSpawner = AnchorProceduralMesh(mrukFeature, mapOf(
  // null material = invisible mesh (passthrough shows through); true = add physics collision
  MRUKLabel.FLOOR to AnchorProceduralMeshConfig(null, true),
  MRUKLabel.WALL_FACE to AnchorProceduralMeshConfig(null, true),
  // ... additional surface types
))
The first parameter specifies the material (null for invisible meshes so passthrough shows through). The second parameter enables physics collision. See MixedRealitySampleActivity.kt for the complete surface type mapping. For more on procedural mesh generation, see MRUK reference.
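As an illustration of what a fuller mapping might cover, the sketch below adds a few more MRUKLabel values. The label names beyond FLOOR and WALL_FACE are assumptions for illustration, not the sample's actual list; the real mapping is in MixedRealitySampleActivity.kt:
// Illustrative only: cover more detected surfaces with invisible colliders.
// Label names other than FLOOR and WALL_FACE are assumed; check the MRUKLabel enum.
val surfaceConfigs = mapOf(
  MRUKLabel.FLOOR to AnchorProceduralMeshConfig(null, true),
  MRUKLabel.CEILING to AnchorProceduralMeshConfig(null, true),
  MRUKLabel.WALL_FACE to AnchorProceduralMeshConfig(null, true),
  MRUKLabel.TABLE to AnchorProceduralMeshConfig(null, true),
)
val procMeshSpawner = AnchorProceduralMesh(mrukFeature, surfaceConfigs)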

Spawning dynamic physics objects

The BallShooter system demonstrates how to create physics-enabled entities at runtime:
val physics = Physics().apply {
  shape = "sphere"
  dimensions = Vector3(0.12f, 0.12f, 0.12f)  // ball size in meters
  state = PhysicsState.DYNAMIC               // simulated by the physics engine
  restitution = 0.99f                        // near-perfect bounce
}
The high restitution value (0.99) makes the balls very bouncy. The system also applies linear and angular velocity to make the balls move and spin. See BallShooter.kt for the complete spawning logic. For more on physics configuration, see Physics.
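Putting the pieces together, a runtime spawn might look roughly like the sketch below. This is a simplified guess rather than the sample's exact code: the sample clones a basketball mesh template from the composition, and the mesh URI, velocity field names, and speed values here are illustrative assumptions.
// Hypothetical sketch: spawn a bouncy ball at a controller pose and let physics take over.
// Mesh URI, velocity fields, and constants are assumptions, not the sample's actual values.
fun spawnBall(spawnPose: Pose, forward: Vector3) {
  Entity.create(
    listOf(
      Mesh(Uri.parse("mesh://sphere")),              // assumed primitive mesh; the sample uses a basketball template
      Transform(spawnPose),                          // start at the controller's pose
      Physics().apply {
        shape = "sphere"
        dimensions = Vector3(0.12f, 0.12f, 0.12f)
        state = PhysicsState.DYNAMIC
        restitution = 0.99f
        // Assumed fields: give the ball an initial throw and some spin
        linearVelocity = Vector3(forward.x * 5f, forward.y * 5f, forward.z * 5f)
        angularVelocity = Vector3(0f, 0f, 10f)
      },
    )
  )
}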

Positioning UI relative to the user

The UiPanelUpdateSystem shows how to position a UI panel in front of the user’s headset:
val head = systemManager.tryFindSystem<PlayerBodyAttachmentSystem>()
  ?.tryGetLocalPlayerAvatarBody()?.head
The system places the panel 2.5 meters forward at a fixed height of 1.1 meters, then rotates it to face the user. See UiPanelUpdateSystem.kt for the complete positioning logic.
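A rough sketch of the placement math, as it might appear inside the system's execute(), is shown below. It assumes Pose exposes its translation and rotation as t and q, that a Quaternion can rotate a Vector3 with the * operator, and that panelEntity is a handle to the UI panel entity; these are illustrative assumptions, and the actual logic lives in UiPanelUpdateSystem.kt.
// Hypothetical sketch: place the panel 2.5 m ahead of the head at a fixed 1.1 m height.
val headPose = head?.tryGetComponent<Transform>()?.transform ?: return
val forward = headPose.q * Vector3(0f, 0f, 1f)   // assumed: rotate the local forward axis by the head rotation
val target = Vector3(
  headPose.t.x + forward.x * 2.5f,
  1.1f,                                          // fixed height, ignoring head pitch
  headPose.t.z + forward.z * 2.5f,
)
// Reuse the head orientation as a rough facing; the sample computes a rotation that faces the user
panelEntity.setComponent(Transform(Pose(target, headPose.q)))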

Enabling passthrough and handling permissions

The sample enables passthrough rendering with scene.enablePassthrough(true) and requests scene permission using the standard Android permission flow for com.oculus.permission.USE_SCENE. See MixedRealitySampleActivity.kt for the complete permission handling sequence. For more on passthrough configuration, see Passthrough reference.
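The permission flow is the standard Android runtime-permission sequence. Below is a minimal sketch, assuming an Activity context; the helper name and request-code constant are illustrative, and the sample's actual sequencing is in MixedRealitySampleActivity.kt.
// Sketch: request scene permission before loading the scene model.
// The method name and request code are illustrative, not the sample's exact code.
private val scenePermissionRequestCode = 1

fun requestScenePermissionIfNeeded() {
  val permission = "com.oculus.permission.USE_SCENE"
  if (checkSelfPermission(permission) == PackageManager.PERMISSION_GRANTED) {
    mrukFeature.loadSceneFromDevice()            // permission already granted: load the room
  } else {
    requestPermissions(arrayOf(permission), scenePermissionRequestCode)
    // Handle the user's choice in onRequestPermissionsResult and load the scene on grant
  }
}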

Extend the sample

  • Tune basketball physics: Modify the physics parameters in BallShooter.kt to change bounce behavior, ball size, or velocity. Try lower restitution values for less bounce or larger dimensions for a bigger ball; see the sketch after this list.
  • Visualize collision meshes: Provide a material instead of null in AnchorProceduralMeshConfig to make collision surfaces visible. This helps when debugging room alignment.
  • Add collision sounds: Listen for physics collision events and play spatial audio when a ball hits a surface or target. See the SpatialVideoSample for spatial audio patterns.
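For example, a larger, gentler ball might be configured like this (the values are illustrative, reusing the Physics fields shown earlier):
// Illustrative tuning: a bigger, much less bouncy ball
val physics = Physics().apply {
  shape = "sphere"
  dimensions = Vector3(0.25f, 0.25f, 0.25f)  // larger ball
  state = PhysicsState.DYNAMIC
  restitution = 0.4f                          // far less bounce than the sample's 0.99
}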