This sample demonstrates how to build a complete mixed reality game that integrates the player’s physical room into gameplay using Meta’s Presence Platform features. Project Phanto is a ghost-themed game where virtual enemies navigate, collide with, and hide behind real-world furniture scanned by the Quest headset. Unity developers can use this sample to learn scene mesh loading, runtime NavMesh generation, semantic scene queries, depth-based occlusion, and Haptics SDK integration.
Note: Some features (scene mesh, high-resolution color passthrough, depth occlusion) are available only on Quest 3 and Quest 3S. The sample provides fallback implementations for other devices.
Load and configure scene mesh data from the Scene API using MR Utility Kit (MRUK)
Generate runtime NavMesh surfaces from physical room geometry for ground-based AI navigation
Query furniture semantically by label (TABLE, COUCH, SCREEN) for spawn points and gameplay targets
Implement depth-based occlusion so virtual objects hide behind real-world surfaces
Integrate Haptics SDK for dynamic controller feedback with runtime amplitude and frequency modulation
Requirements
Hardware: Meta Quest 3, Quest 3S, or Quest 2
Development environment: Unity 6 (6000.0.59f2) with Meta XR SDK v83.0.0
For complete build prerequisites, SDK installation, and project setup, see the sample README.
Get started
Clone the repository from github.com/oculus-samples/Unity-Phanto and open the project in Unity 6. The entry point is Assets/Phanto/Scenes/LobbyScene.unity. Build and deploy to your Quest device using Unity’s build settings for Android. The lobby scene requests Scene API permissions, loads your room data, and displays a mesh preview before launching the tutorial or game.
Explore the sample
Project Phanto is organized into three main scenes and ten standalone example scenes. The main scenes demonstrate integrated gameplay, while the example scenes isolate individual technical concepts.
Main scenes
| Scene | File path | What it demonstrates | Key concepts |
| --- | --- | --- | --- |
| LobbyScene | Assets/Phanto/Scenes/LobbyScene.unity | Permission handling, scene data loading, scene preview UI | |
When you run LobbyScene, the app requests Scene API permissions, loads your room’s mesh data via MRUK, and displays a wireframe preview of the scanned space. Select Tutorial to learn the mechanics or Game to start the wave-based challenge.
In GameScene, the gameplay alternates between Phanto waves and Phantom waves. During Phanto waves, the flying ghost boss (Phanto) navigates around your room in 3D space, spits goo balls at walls, and casts area-of-effect goo novas. Phantoms spawn from goo puddles left by goo ball impacts and novas. You spray Phanto with the Polterblast 3000 handheld weapon to reduce its health. During Phantom waves, crystals spawn on furniture surfaces (tables, couches). Ground-based Phantom ghosts navigate using NavMesh and attack the crystal. You place Ecto Blaster turrets on surfaces to defend the crystal.
In the standalone HapticsDemo example scene, Phanto floats stationary in the center of your room. Squeeze the grip button to trigger a continuous haptic effect. The trigger button controls amplitude (intensity), and the thumbstick controls frequency (pitch).
Key concepts
Scene mesh loading with MRUK
The sample uses MR Utility Kit (MRUK) to load scene data from the Scene API. SceneDataLoader instantiates the MRUK prefab and calls LoadSceneFromDevice() to retrieve the room mesh and furniture anchors. After loading, the mesh layer is set to “GlobalMesh” for selective raycasting and collision detection. The sample supports three data source modes: SceneApi (production), StaticMeshDataPrefab (development), and StaticMeshDataJson (development).
See SceneDataLoader.cs and PhantoSceneMesh.cs for the complete implementation. For MRUK API details, see the MR Utility Kit documentation.
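The loading flow described above can be sketched as follows. This is a minimal illustration, not the sample's actual SceneDataLoader: the "GlobalMesh" layer name comes from the description above, and the callback wiring assumes MRUK's scene-loaded callback.

```csharp
using Meta.XR.MRUtilityKit;
using UnityEngine;

// Sketch: load scene data via MRUK, then move each room's global mesh
// to a dedicated layer for selective raycasting and collision checks.
public class SceneMeshLoaderSketch : MonoBehaviour
{
    void Start()
    {
        // Fires once all rooms (mesh + furniture anchors) have loaded.
        MRUK.Instance.RegisterSceneLoadedCallback(OnSceneLoaded);
    }

    void OnSceneLoaded()
    {
        foreach (var room in MRUK.Instance.Rooms)
        {
            var globalMesh = room.GlobalMeshAnchor; // room-scale triangle mesh
            if (globalMesh != null)
                globalMesh.gameObject.layer = LayerMask.NameToLayer("GlobalMesh");
        }
    }
}
```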
Two-tier NavMesh generation
The sample generates two separate NavMesh surfaces at runtime. NavMeshGenerator bakes a floor NavMesh from the room boundary with area ID FloorArea (3). FurnitureNavMeshGenerator bakes per-furniture NavMeshSurfaces for walkable furniture (TABLE, COUCH, OTHER, STORAGE, BED) with area ID FurnitureArea (4). NavMeshLinks connect furniture edges to floor points, allowing Phantom enemies to hop between surfaces. Doorway links connect adjacent rooms for multi-room navigation.
See NavMeshGenerator.cs and FurnitureNavMeshGenerator.cs for the baking logic. For Unity NavMesh API, see the AI Navigation package documentation.
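A per-furniture bake in the style described above can be sketched with the AI Navigation package's NavMeshSurface component. The class and the area-ID constant are illustrative; only the area numbers (3 for floor, 4 for furniture) come from the description above.

```csharp
using Unity.AI.Navigation;
using UnityEngine;

// Sketch: bake a runtime NavMesh for one walkable furniture object,
// tagged with a distinct area ID so agents can be costed per surface.
public class FurnitureNavMeshSketch : MonoBehaviour
{
    public const int FurnitureArea = 4; // floor uses area 3

    public NavMeshSurface BakeFurniture(GameObject furniture)
    {
        var surface = furniture.AddComponent<NavMeshSurface>();
        surface.collectObjects = CollectObjects.Children; // only this furniture's geometry
        surface.defaultArea = FurnitureArea;
        surface.BuildNavMesh(); // synchronous runtime bake
        return surface;
    }
}
```

NavMeshLink components would then be added between furniture edge points and the nearest floor NavMesh point to allow surface-to-floor hops.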
Air navigation without NavMesh
Phanto (the flying ghost boss) uses Rigidbody-based hovering instead of NavMesh. Enemy.HoverAround() and Enemy.MoveAlong() adjust velocity to maintain altitude and move through 3D space. OnCollisionStay with the scene mesh redirects movement when Phanto collides with walls or furniture. PhantoBehaviour implements an 8-state machine (Roam, SpitGooBall, Nova, Pain, Dodge, GoEthereal, Die, DemoRoam) for combat behavior.
See PhantoBehaviour.cs for the state machine implementation.
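The hover-and-redirect idea can be sketched as a small Rigidbody controller. The gain and speed values are illustrative placeholders, not the sample's tuning:

```csharp
using UnityEngine;

// Sketch of NavMesh-free flight: steer a Rigidbody toward a target
// altitude while clamping overall speed, and push away from surfaces
// the body is resting against.
[RequireComponent(typeof(Rigidbody))]
public class HoverSketch : MonoBehaviour
{
    public float targetHeight = 1.5f; // meters above the floor
    public float heightGain = 4f;
    public float maxSpeed = 1.2f;

    Rigidbody _body;

    void Awake() => _body = GetComponent<Rigidbody>();

    void FixedUpdate()
    {
        var velocity = _body.linearVelocity;
        // Push up or down toward the target altitude.
        velocity.y = (targetHeight - transform.position.y) * heightGain;
        _body.linearVelocity = Vector3.ClampMagnitude(velocity, maxSpeed);
    }

    void OnCollisionStay(Collision collision)
    {
        // Redirect movement away from walls and furniture via the contact normal.
        _body.AddForce(collision.GetContact(0).normal * 2f, ForceMode.Acceleration);
    }
}
```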
Semantic scene queries
SceneQuery provides furniture lookup by semantic label using MRUKAnchor.SceneLabels. TryGetClosestSemanticClassification(point, normal, out MRUKAnchor) finds the nearest furniture of a given type. RandomPointOnFloor() and RandomPointOnFurniture() generate spawn positions. Phantom enemies use semantic queries to select target furniture (tables for crystals, lamps for ranged attacks) and display thought bubble icons matching the furniture type.
See SceneQuery.cs and ThoughtBubbleController.cs for query patterns. For semantic label reference, see the Scene API documentation.
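A label-filtered lookup in the spirit of SceneQuery might look like this. The wrapper class is hypothetical; the label flags and `HasAnyLabel` check follow MRUK's API, and the center-to-point distance is a deliberate simplification:

```csharp
using Meta.XR.MRUtilityKit;
using UnityEngine;

// Sketch: find the nearest anchor carrying a given semantic label.
public static class SemanticQuerySketch
{
    public static MRUKAnchor FindClosest(Vector3 point, MRUKAnchor.SceneLabels label)
    {
        MRUKAnchor closest = null;
        var best = float.MaxValue;
        foreach (var anchor in MRUK.Instance.GetCurrentRoom().Anchors)
        {
            if (!anchor.HasAnyLabel(label)) continue;
            // Anchor-center distance for simplicity; a real query would
            // measure against the anchor's surface.
            var distance = Vector3.Distance(point, anchor.transform.position);
            if (distance < best) { best = distance; closest = anchor; }
        }
        return closest;
    }
}
```

Usage: `SemanticQuerySketch.FindClosest(playerPos, MRUKAnchor.SceneLabels.TABLE)` returns the nearest table anchor, or null if the room has none.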
Content placement via raycasting
EctoBlasterSpawner raycasts against the GlobalMesh layer to place turrets on scene surfaces. The spawn position uses hit.point and aligns the object to the surface normal with transform.up = hit.normal. A preview indicator displays at the raycast hit point before placement, with a range ring showing the turret’s coverage area.
See EctoBlasterSpawner.cs for the placement logic.
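The placement pattern can be sketched as below. The ray origin, button mapping, and prefab field are simplified placeholders; the layer mask and normal alignment follow the description above.

```csharp
using UnityEngine;

// Sketch: place an object on the scanned room mesh by raycasting
// against the "GlobalMesh" layer and aligning it to the hit normal.
public class PlacementSketch : MonoBehaviour
{
    public GameObject turretPrefab;

    void Update()
    {
        if (!Input.GetButtonDown("Fire1")) return;

        var mask = LayerMask.GetMask("GlobalMesh"); // ignore everything but the room mesh
        var ray = new Ray(transform.position, transform.forward);
        if (Physics.Raycast(ray, out var hit, 10f, mask))
        {
            var turret = Instantiate(turretPrefab, hit.point, Quaternion.identity);
            turret.transform.up = hit.normal; // stand upright on floors, tables, even walls
        }
    }
}
```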
Depth-based occlusion
OcclusionController uses EnvironmentDepthManager from the Depth API to render virtual objects behind real-world surfaces. EnvironmentDepthManager.IsSupported gates the feature at runtime, so depth occlusion is only active on Quest 3 and Quest 3S. The sample supports hard occlusion (binary visibility) and soft occlusion (smooth edge blending) via shader keywords. OcclusionController applies the appropriate occlusion shader per renderer at runtime, and the sample mixes both modes across different game elements to balance visual quality against rendering cost.
See OcclusionDemoController.cs and OcclusionController.cs for shader configuration. For Depth API details, see the Depth API documentation.
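The runtime gating and keyword switch can be sketched as follows. The `HARD_OCCLUSION`/`SOFT_OCCLUSION` keyword names are those used by the Depth API's occlusion shaders; the component itself is illustrative, not the sample's OcclusionController.

```csharp
using Meta.XR.EnvironmentDepth;
using UnityEngine;

// Sketch: enable depth occlusion only on supported hardware and pick
// hard vs. soft occlusion per renderer via shader keywords.
public class OcclusionSketch : MonoBehaviour
{
    public EnvironmentDepthManager depthManager;
    public Renderer target;
    public bool soft = true;

    void Start()
    {
        if (!EnvironmentDepthManager.IsSupported) return; // Quest 3 / Quest 3S only
        depthManager.enabled = true;

        var material = target.material; // per-renderer material instance
        material.EnableKeyword(soft ? "SOFT_OCCLUSION" : "HARD_OCCLUSION");
    }
}
```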
Haptics SDK integration
The sample demonstrates three haptics integration patterns. HapticsDemo uses HapticClipPlayer with isLooping = true and modulates amplitude and frequencyShift at runtime based on controller input. PolterblastTrigger uses HapticCollection to select named clips for gameplay events (spray start, spray loop, spray end) with amplitude scaling based on spray intensity. LobbyManager uses HapticCollection.TryGetPlayer(clipName) for UI button feedback.
See HapticsDemoController.cs and PolterblastTrigger.cs for integration patterns. For Haptics SDK API, see the Haptics SDK documentation.
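The looping-clip pattern with runtime modulation can be sketched like this. The clip asset reference and button mapping are assumptions; the `HapticClipPlayer` properties (`isLooping`, `amplitude`, `frequencyShift`) are the Haptics SDK's modulation API.

```csharp
using Oculus.Haptics;
using UnityEngine;

// Sketch: loop a haptic clip while the grip is held, with the trigger
// driving amplitude and the thumbstick driving frequency shift.
public class HapticsSketch : MonoBehaviour
{
    public HapticClip hapticClip; // authored in Meta Haptics Studio

    HapticClipPlayer _player;

    void Start()
    {
        _player = new HapticClipPlayer(hapticClip) { isLooping = true };
    }

    void Update()
    {
        if (OVRInput.GetDown(OVRInput.Button.PrimaryHandTrigger))
            _player.Play(Controller.Right);
        if (OVRInput.GetUp(OVRInput.Button.PrimaryHandTrigger))
            _player.Stop();

        // Trigger controls intensity; thumbstick X bends the pitch.
        _player.amplitude = OVRInput.Get(OVRInput.Axis1D.PrimaryIndexTrigger);
        _player.frequencyShift = OVRInput.Get(OVRInput.Axis2D.PrimaryThumbstick).x;
    }
}
```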
Extend the sample
Add new enemy behaviors: Modify PhantomBehaviour to introduce new movement patterns (wall climbing, ceiling patrol) using semantic queries to identify WALL_ART or other vertical surfaces. Use the existing state machine pattern as a template.
Implement room-scale spawn strategies: Extend SceneQuery with area-based spawn logic that distributes enemies across multiple rooms proportionally to room size. Use MRUKRoom.FloorArea to weight spawn probabilities.
Create custom wave configurations: Define new WaveSettings ScriptableObjects in GameplaySettings with mixed enemy types, custom win conditions, and time limits. Use the existing wave system in GameplayManager as the framework.
For related samples demonstrating similar concepts, see the Unity Discover sample (semantic placement), Unity DepthAPI sample (occlusion patterns), and Unity LocalMultiplayerMR sample (shared space coordination).