This sample uses the Mixed Reality Utility Kit (MRUK) to build spatial experiences that interact with physical surroundings through four experiences: anchor mesh spawning, keyboard tracking, QR code scanning, and raycasting. It is designed for developers who want to learn MRUK fundamentals, including scene loading, procedural mesh generation, object tracking, and ray-based interaction.

Note: keyboard and QR code tracking rely on `configureTrackers()`, which is currently marked as experimental and may change in future SDK versions.

Key APIs demonstrated:

- `MRUKFeature`
- `AnchorMeshSpawner`
- `AnchorProceduralMesh`
- `configureTrackers()` with the experimental tracker API
- MRUK raycast APIs

## Running the sample

Open the MrukSample project in Android Studio. Build and deploy the app to your device. The sample launches with a 2D menu presenting four experiences as buttons. Select an experience from the start menu to explore a different MRUK capability.

## Project structure

| File / Scene | What it demonstrates | Implementation notes |
|---|---|---|
MrukSampleStartMenuActivity.kt | 2D Compose launcher menu | Standard Android ComponentActivity with SpatialTheme UI, launches immersive experiences via intent |
anchor_mesh/MrukAnchorMeshSampleActivity.kt | Replacing physical objects with virtual furniture | AnchorMeshSpawner maps MRUKLabel values to .glb assets, AnchorProceduralMesh generates textured meshes for surfaces, MRUKSceneEventListener callbacks handle room and anchor events |
keyboard_tracker/KeyboardTrackerSampleActivity.kt | Tracking physical keyboards | configureTrackers(setOf(Tracker.Keyboard)) starts tracking, TrackedKeyboard component added automatically, custom system queries tracked entities and spawns passthrough cutout meshes |
qr_code_scanner/QrCodeScannerSampleActivity.kt | Scanning QR codes and spawning panels | configureTrackers(setOf(Tracker.QrCode)) starts tracking, TrackedQrCode.getPayloadAsString() retrieves QR data, WebView panels display content at QR code positions |
raycast/RaycastSampleActivity.kt | Raycasting against scene and depth | Four raycast modes (single, all, global mesh, depth), raycastRoom() and raycastRoomAll() for scene planes, raycastEnvironment() for depth data, arrow entities visualize hit points |
common/MrukInputSystem.kt | Controller input handling | Queries AvatarBody to get controller entity, reads ButtonBits.ButtonMenu state for UI panel toggling |
common/MrukSampleUtils.kt | HMD and controller access | PlayerBodyAttachmentSystem.tryGetLocalPlayerAvatarBody() pattern for accessing head and hand entities |
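The input handling described above toggles a UI panel from the controller's Menu button. A minimal sketch of the underlying rising-edge toggle pattern, using a hypothetical `PanelToggle` class with plain booleans (the real sample reads `ButtonBits.ButtonMenu` from the controller entity each frame):

```kotlin
// Sketch of a rising-edge button toggle: the panel flips only on the
// frame the button transitions from released to pressed, not while held.
// PanelToggle is illustrative; it is not part of the MRUK sample code.
class PanelToggle {
    private var wasPressed = false
    var panelVisible = false
        private set

    // Call once per frame with the current button state.
    fun update(isPressed: Boolean) {
        if (isPressed && !wasPressed) panelVisible = !panelVisible
        wasPressed = isPressed
    }
}

fun main() {
    val toggle = PanelToggle()
    toggle.update(true)   // press   -> panel shown
    toggle.update(true)   // held    -> no change
    toggle.update(false)  // release
    toggle.update(true)   // press   -> panel hidden again
    println(toggle.panelVisible)  // false
}
```

Tracking the previous frame's state is what prevents the panel from flickering while the button is held down.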
## Anchor mesh experience

The anchor mesh experience requests the `USE_SCENE` permission on launch and loads scene data from the device, spawning 3D furniture models at detected anchor positions (tables, couches, lamps, plants). Floor, walls, and ceiling receive procedural meshes with carpet and wall textures. A UI panel accessed via the left controller Menu button provides controls for loading different JSON room layouts, toggling global mesh visualization, launching scene capture, and saving scene data to a file.

The raycast experience additionally calls `startEnvironmentRaycaster()` to enable real-time depth-based raycasting.

## Loading scene data

MRUK supports loading scene data from the device or from JSON files. The anchor mesh and raycast experiences demonstrate both approaches.

```kotlin
// Returns CompletableFuture<MRUKLoadDeviceResult>
mrukFeature.loadSceneFromDevice()

// Load a room layout from a JSON string
mrukFeature.loadSceneFromJsonString(jsonText)
```
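Because the loaders are asynchronous, the caller reacts to the future's completion rather than blocking. A minimal sketch of that consumption pattern, using a stand-in `fakeLoadScene()` and a hypothetical `LoadResult` enum in place of the real `mrukFeature.loadSceneFromDevice()` and `MRUKLoadDeviceResult`:

```kotlin
import java.util.concurrent.CompletableFuture

// Hypothetical stand-ins for the MRUK types; only the future-handling
// pattern is meant to mirror the sample.
enum class LoadResult { SUCCESS, FAILURE }

fun fakeLoadScene(): CompletableFuture<LoadResult> =
    CompletableFuture.supplyAsync { LoadResult.SUCCESS }

fun main() {
    fakeLoadScene().whenComplete { result, error ->
        when {
            error != null -> println("Scene load failed: $error")
            result == LoadResult.SUCCESS -> println("Scene ready, spawning meshes")
            else -> println("Scene load returned $result")
        }
    }.join()  // block only for this demo; a real app reacts asynchronously
}
```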
Each loader returns a `CompletableFuture` that completes when loading finishes. The anchor mesh experience includes UI controls for selecting scene models (V1, V2, V2 with fallback) and loading from 12 different pre-configured JSON room layouts.

## Spawning meshes at anchors

`AnchorMeshSpawner` replaces detected physical objects with 3D models. Pass a map of `MRUKLabel` values to `.glb` file paths:

```kotlin
val spawner = AnchorMeshSpawner(
    mrukFeature,
    mutableMapOf(
        MRUKLabel.TABLE to AnchorMeshSpawner.AnchorMeshGroup(listOf("Furniture/Table.glb")),
        MRUKLabel.COUCH to AnchorMeshSpawner.AnchorMeshGroup(listOf("Furniture/Couch.glb")),
    ))
```
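The map above drives a simple lookup: given an anchor's semantic label, pick a mesh from the configured group, or nothing if the label has no entry. A sketch of that resolution step with hypothetical `Label`, `MeshGroup`, and `assetFor` stand-ins (only the map shape mirrors the sample):

```kotlin
// Illustrative label-to-asset resolution, not the AnchorMeshSpawner API.
enum class Label { TABLE, COUCH, LAMP }

data class MeshGroup(val assets: List<String>)

fun assetFor(label: Label, mapping: Map<Label, MeshGroup>, index: Int = 0): String? =
    mapping[label]?.assets?.getOrNull(index)

fun main() {
    val mapping = mapOf(
        Label.TABLE to MeshGroup(listOf("Furniture/Table.glb")),
        Label.COUCH to MeshGroup(listOf("Furniture/Couch.glb")),
    )
    println(assetFor(Label.TABLE, mapping))  // Furniture/Table.glb
    println(assetFor(Label.LAMP, mapping))   // null: no mesh configured
}
```

Grouping several assets per label is what lets the spawner vary which model replaces a given kind of object.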
`AnchorProceduralMesh` creates textured geometry for surfaces such as floors, walls, and ceilings:

```kotlin
AnchorProceduralMesh(
    mrukFeature,
    mapOf(
        MRUKLabel.FLOOR to AnchorProceduralMeshConfig(carpetMaterial, true),
        MRUKLabel.WALL_FACE to AnchorProceduralMeshConfig(wallMaterial, true),
    ))
```
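To make the idea of procedural surface geometry concrete, here is a sketch of the simplest possible case: a flat floor anchor turned into a single quad (four positions, two triangles). The `Quad` and `floorQuad` names are illustrative; the real `AnchorProceduralMesh` also handles UVs, materials, and non-rectangular boundaries:

```kotlin
// Illustrative procedural geometry for a flat width x depth floor anchor,
// centered at the origin in the XZ plane (y = 0).
data class Quad(val positions: List<FloatArray>, val indices: List<Int>)

fun floorQuad(width: Float, depth: Float): Quad {
    val hw = width / 2f
    val hd = depth / 2f
    val positions = listOf(
        floatArrayOf(-hw, 0f, -hd),  // back-left
        floatArrayOf( hw, 0f, -hd),  // back-right
        floatArrayOf( hw, 0f,  hd),  // front-right
        floatArrayOf(-hw, 0f,  hd),  // front-left
    )
    // Two triangles covering the quad.
    val indices = listOf(0, 2, 1, 0, 3, 2)
    return Quad(positions, indices)
}

fun main() {
    val quad = floorQuad(4f, 3f)
    println(quad.positions.size)  // 4
    println(quad.indices.size)    // 6
}
```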
## Tracking keyboards and QR codes

The keyboard and QR code experiences start tracking with the experimental `configureTrackers()` API:

```kotlin
mrukFeature.configureTrackers(setOf(Tracker.Keyboard))
```

The call returns a `CompletableFuture<MRUKStartTrackerResult>`. When a tracked object is detected, the SDK automatically creates an entity with a `TrackedKeyboard` or `TrackedQrCode` component. The keyboard tracker experience queries for these entities and spawns custom meshes:

```kotlin
Query.where { has(TrackedKeyboard.id) }.eval()
```
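A system that spawns meshes for tracked entities needs per-frame bookkeeping: diff the ids seen this frame against the previous frame to know which cutout meshes to create or destroy. A sketch with a hypothetical `TrackedSetDiffer` and plain integer ids (the real system works with Spatial SDK entities):

```kotlin
// Illustrative frame-to-frame set diff, not MRUK API.
class TrackedSetDiffer {
    private var previous: Set<Int> = emptySet()

    // Returns (newly appeared ids, newly disappeared ids).
    fun update(current: Set<Int>): Pair<Set<Int>, Set<Int>> {
        val added = current - previous
        val removed = previous - current
        previous = current
        return added to removed
    }
}

fun main() {
    val differ = TrackedSetDiffer()
    println(differ.update(setOf(1, 2)))  // ([1, 2], [])
    println(differ.update(setOf(2, 3)))  // ([3], [1])
}
```

The "appeared" set drives mesh spawning; the "disappeared" set drives cleanup when a tracked object leaves view.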
The QR code experience reads the scanned payload from the tracked entity:

```kotlin
qrCodeEntity.getComponent<TrackedQrCode>().getPayloadAsString()
```
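Since the experience shows the payload in a WebView panel, the app has to decide how to interpret the raw string. A minimal, heuristic sketch of such a classifier (the `QrPayload` type and `classifyPayload` function are illustrative, not part of the sample):

```kotlin
// Illustrative payload classification: web URLs get a WebView panel,
// everything else is treated as plain text.
sealed class QrPayload {
    data class Url(val url: String) : QrPayload()
    data class Text(val text: String) : QrPayload()
}

fun classifyPayload(payload: String): QrPayload =
    if (payload.startsWith("http://") || payload.startsWith("https://"))
        QrPayload.Url(payload)
    else
        QrPayload.Text(payload)

fun main() {
    println(classifyPayload("https://example.com"))  // Url(url=https://example.com)
    println(classifyPayload("hello"))                // Text(text=hello)
}
```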
## Raycasting

MRUK provides three raycast methods: `raycastRoom()` for single hits, `raycastRoomAll()` for all hits along a ray, and `raycastEnvironment()` for depth-based raycasts. The raycast experience demonstrates all three modes plus global mesh raycasting.

```kotlin
val hit = mrukFeature.raycastRoom(
    roomUuid, origin, direction, maxDistance, SurfaceType.PLANE_VOLUME)
```

Depth-based raycasting requires starting the environment raycaster first:

```kotlin
mrukFeature.startEnvironmentRaycaster()
val result = mrukFeature.raycastEnvironment(origin, direction)
```
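Under the hood, raycasting against a scene plane reduces to ray-plane intersection. A self-contained sketch of that math, using plain `FloatArray` vectors (MRUK uses its own math types, and `rayPlaneDistance` is an illustrative helper, not the SDK API):

```kotlin
// Intersect a ray (origin + t * direction) with a plane given by a point
// and normal; return the hit distance t, or null for a miss.
fun rayPlaneDistance(
    origin: FloatArray, direction: FloatArray,
    planePoint: FloatArray, planeNormal: FloatArray,
): Float? {
    fun dot(a: FloatArray, b: FloatArray) = a[0] * b[0] + a[1] * b[1] + a[2] * b[2]
    val denom = dot(direction, planeNormal)
    if (kotlin.math.abs(denom) < 1e-6f) return null  // ray parallel to plane
    val diff = floatArrayOf(
        planePoint[0] - origin[0], planePoint[1] - origin[1], planePoint[2] - origin[2])
    val t = dot(diff, planeNormal) / denom
    return if (t >= 0f) t else null  // hits behind the origin do not count
}

fun main() {
    // Ray from eye height straight down onto the floor plane y = 0.
    val t = rayPlaneDistance(
        floatArrayOf(0f, 1.6f, 0f), floatArrayOf(0f, -1f, 0f),
        floatArrayOf(0f, 0f, 0f), floatArrayOf(0f, 1f, 0f))
    println(t)  // 1.6
}
```

`raycastRoomAll()` conceptually repeats this test against every surface in the room and collects the hits in distance order.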
The raycast experience uses a custom `UpdateRaycastSystem` to compute rays from the right controller pose and visualize hit points with arrow entities. See `UpdateRaycastSystem.kt` for the hit visualization logic.

## Extending the sample

- Combine MRUK features: integrate furniture spawning with raycasting to allow users to place objects at raycast hit points. Use the anchor mesh experience as a base and add controller-driven placement logic from the raycast experience.
- Implement the `MRUKSceneEventListener` interface to display real-time notifications when rooms or anchors are added, removed, or updated. Use the anchor mesh experience's listener implementation as a reference.
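The listener idea above can be sketched as a small event-to-notification adapter. The `SceneEvents` interface here only mirrors the general shape of a scene event listener (the real `MRUKSceneEventListener` callbacks and their parameter types differ), and `NotificationLog` is a hypothetical implementation:

```kotlin
// Illustrative scene-event listener turning events into notification strings.
interface SceneEvents {
    fun onRoomAdded(roomName: String)
    fun onAnchorRemoved(anchorLabel: String)
}

class NotificationLog : SceneEvents {
    val messages = mutableListOf<String>()
    override fun onRoomAdded(roomName: String) {
        messages += "Room added: $roomName"
    }
    override fun onAnchorRemoved(anchorLabel: String) {
        messages += "Anchor removed: $anchorLabel"
    }
}

fun main() {
    val log = NotificationLog()
    log.onRoomAdded("Living Room")
    log.onAnchorRemoved("TABLE")
    println(log.messages)  // [Room added: Living Room, Anchor removed: TABLE]
}
```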