Discover sample overview

Updated: May 11, 2026

Overview

This sample demonstrates how to build a complete networked mixed reality experience that combines Scene API, Interaction SDK, passthrough, spatial anchors, shared spatial anchors, and Meta Avatars into a single cohesive application. The Discover project serves as a reference implementation, showing you how to create a multi-user MR launcher where players can place, share, and launch networked applications that interact with their physical room.

What you will learn

  • Combine Scene API room loading with passthrough rendering to create MR experiences anchored to real-world geometry
  • Implement shared spatial anchors for colocation, allowing multiple players in the same physical room to see aligned content
  • Architect a networked application launcher using Photon Fusion that manages app lifecycle, placement, and persistence
  • Integrate Meta Avatars with Photon Voice for social presence in networked MR experiences
  • Use Interaction SDK’s ray, poke, and hand-grab interactors to support both controller and hand-tracking input

Requirements

  • Meta Quest device
  • Unity 6000.0.59f2 or newer
  • Photon account (Fusion + Voice, free tier available)
  • Meta Developer Dashboard app configured for platform services

Important: Enable Cloud Storage in the Meta Developer Dashboard for your app. Spatial anchor persistence and colocation require Cloud Storage to save anchors and share them across sessions.

For detailed SDK versions, build dependencies, and setup steps, see the Unity Discover README.

Get started

Clone the repository using Git LFS, then open the project in Unity 6000.0.59f2 or newer. Before building to a device, configure your Photon App ID and Photon Voice App ID in the project settings, and set up your Meta app in the Developer Dashboard with platform services enabled. Upload the build to a release channel to enable shared spatial anchor colocation. For complete build instructions and configuration steps, see the Unity Discover README.

Explore the sample

The project includes a main Discover scene that demonstrates the full application, four example scenes that isolate individual features, and two built-in mini-applications. Three custom packages provide reusable utilities for input, avatars, and common Unity patterns.
| File / Scene | What it demonstrates | Key concepts |
| --- | --- | --- |
| Assets/Discover/Scenes/Discover.unity | Complete networked MR launcher with room loading, icon placement, app management, and colocation | Single entry-point architecture, network lifecycle, spatial anchor persistence |
| Assets/Discover/Scenes/Examples/Colocation.unity | Isolated colocation flow between players in the same room | Shared spatial anchors, camera realignment, network messaging |
| Assets/Discover/Scenes/Examples/RoomMapping.unity | Scene API room loading and element generation from mapped room data | MRUK room loading, anchor labels, scene element registry |
| Assets/Discover/Scenes/Examples/SimpleMRScene.unity | Minimal setup for passthrough and MR application launch | Passthrough rendering, basic MR scene structure |
| Assets/Discover/Scenes/Examples/StartupExample.unity | Platform initialization and user data retrieval | Oculus Platform SDK, entitlement validation, user profiles |
| Assets/Discover/Scripts/AppStartup.cs | Entry point with platform initialization and validation | OculusPlatformUtils.InitializeAndValidate() |
| Assets/Discover/Scripts/DiscoverAppController.cs | Central controller managing full app lifecycle from NUX to room setup | INetworkRunnerCallbacks, NetworkRunner, GameMode.Shared |
| Assets/Discover/Scripts/MRSceneLoader.cs | Room loading via Scene API on device or fake room in editor | MRUK.Instance.LoadSceneFromDevice(), MRUKRoom |
| Assets/Discover/Scripts/AppsManager.cs | Icon placement, movement, spatial anchor persistence, app launching | OVRSpatialAnchor, NetworkRunner.Spawn() |
| Assets/Discover/Scripts/Colocation/ColocationDriverNetObj.cs | Colocation flow driver for shared anchor creation and joining | SharedAnchorManager, AutomaticColocationLauncher |
| Assets/Discover/Scripts/SpatialAnchors/SpatialAnchorManager.cs | Generic manager for local spatial anchor save/load/erase with file persistence | OVRSpatialAnchor.SaveAnchorAsync(), LoadUnboundAnchorsAsync() |
| Assets/Discover/Scripts/DiscoverPlayer.cs | Networked player representation with avatar, voice, and profile | AvatarEntity, VoiceNetworkObject |
| Assets/Discover/DroneRage/ | Wave-based MR drone shooter using room geometry | Networked game loop, enemy spawning, room material swapping |
| Assets/MRBike/ | Guided bicycle assembly experience with task tracking | Networked part visibility, assembly logic, task sync via RPCs |
| Packages/com.meta.utilities/ | General utilities including [AutoSet], Singleton&lt;T&gt;, extension methods | Reusable Unity patterns |
| Packages/com.meta.utilities.input/ | Input system utilities for XR devices and editor simulation | XRInputManager, XRDeviceFpsSimulator, Interaction SDK data sources |
| Packages/com.meta.utilities.avatars/ | Meta Avatars SDK integration with body tracking and joint events | AvatarEntity, avatar mesh utilities |

Runtime behavior

When you run the main Discover scene, you first see platform initialization and entitlement validation. First-time users see a multi-page network tutorial explaining how the experience works. You then see a modal with four connection options: Host, Join (colocated), Join Remote, or Single Player.
If you choose Host, the sample loads your physical room via Scene API and starts a Photon Fusion session in shared mode. The room’s walls, floor, and furniture appear as networked scene elements, and the sample creates a shared spatial anchor space for colocation. The main menu appears when you press the menu button, showing a grid of application tiles. Clicking an unplaced tile activates placement mode, where a ray follows your room’s surfaces and validates placement against surface types. After confirming placement, the sample spawns a 3D icon in the room and saves its position to a spatial anchor with JSON persistence.
If another player chooses Join (colocated) in the same physical room, they connect to your session and automatically align their view to your shared spatial anchor. Both players see the same 3D icons in the same physical locations. Any player can click a placed icon or menu tile to launch an application. The sample spawns the networked application container, and only one app runs at a time. In DroneRage, waves of enemy drones spawn around the room and you wield dual weapons to fight them off, with the room’s physical geometry determining spawn locations and navigation bounds. In MRBike, you follow voice-over instructions to grab and assemble bike parts, with task completion synced across all players.
Remote players who choose Join Remote appear as avatars with voice chat but do not share the same spatial alignment as colocated players. The main menu provides tabs for app management, settings (reset tutorials, clear icons, leave session), scene element highlighting, and system info (room name, time, battery).

Key concepts

Single-scene architecture with network lifecycle

The Discover sample uses a single entry-point scene that handles the full application lifecycle from startup through network connection to room setup. Notice how DiscoverAppController implements INetworkRunnerCallbacks to manage state transitions, with distinct phases for platform initialization, network selection, connection, and room loading. This pattern avoids complex scene loading and keeps all network state in a single runner instance.
// DiscoverAppController.cs (abridged)
public class DiscoverAppController : Singleton<DiscoverAppController>, INetworkRunnerCallbacks
{
    private async Task Connect(GameMode mode)
    {
        // Starts a Photon Fusion session in Shared Mode
        var joined = await Runner.StartGame(new StartGameArgs { GameMode = mode, /* ... */ });
        // ...
    }
}
See Assets/Discover/Scripts/DiscoverAppController.cs for the complete network lifecycle implementation.

Colocation with shared spatial anchors

The sample demonstrates automatic colocation between players in the same physical room using shared spatial anchors. The host creates a shared anchor space via ColocationDriverNetObj, which wraps the colocation package’s SharedAnchorManager. Clients receive the spawned driver and call ColocateAutomatically(), which handles anchor sharing, localization, and camera realignment without requiring manual configuration. Notice how AlignCameraToAnchorManager re-aligns the camera on recenter and HMD mount events to maintain alignment as players move.
See Assets/Discover/Scripts/Colocation/ColocationDriverNetObj.cs and the Shared Spatial Anchors documentation.
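The host/client branching above can be sketched as follows. This is a condensed, illustrative outline, not the sample's actual class: the real ColocationDriverNetObj also handles retries, player registration, and alignment events, and the helper method bodies shown as comments stand in for calls into the colocation package.

```csharp
// Sketch of the colocation hand-off (simplified; see ColocationDriverNetObj.cs).
public class ColocationDriverSketch : NetworkBehaviour
{
    public override void Spawned()
    {
        if (HasStateAuthority)
        {
            // Host: create a shared spatial anchor and advertise it to clients.
            SetUpForColocation();   // wraps SharedAnchorManager
        }
        else
        {
            // Client: fetch the host's shared anchor, localize it, and
            // realign the camera rig so both players share one frame of reference.
            ColocateAutomatically(); // wraps AutomaticColocationLauncher
        }
    }

    private void SetUpForColocation() { /* SharedAnchorManager: create + share anchor */ }
    private void ColocateAutomatically() { /* AutomaticColocationLauncher: localize + realign */ }
}
```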

Spatial anchor persistence with generic manager

The sample uses a generic SpatialAnchorManager<T> to save, load, and erase spatial anchors with JSON file persistence. This pattern decouples anchor logic from specific data types, allowing you to persist different kinds of anchored objects (app icons, scene markers, game objects) using the same manager. Notice how AnchorJsonFileManager<T> handles serialization and OVRSpatialAnchor.SaveAnchorAsync() and LoadUnboundAnchorsAsync() provide the core anchor lifecycle.
See Assets/Discover/Scripts/SpatialAnchors/SpatialAnchorManager.cs and the Spatial Anchors documentation.
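The core save/load path can be sketched as below. Treat this as illustrative: the exact return types of the async anchor APIs vary across Meta XR SDK versions, and PersistUuid is a hypothetical helper standing in for the sample's AnchorJsonFileManager&lt;T&gt; serialization.

```csharp
// Illustrative anchor persistence sketch (signatures vary by SDK version).
public async Task SaveIconAnchor(OVRSpatialAnchor anchor)
{
    var result = await anchor.SaveAnchorAsync();
    if (result.Success)
    {
        // Persist the UUID (e.g. to a JSON file) so the anchor can be reloaded later.
        PersistUuid(anchor.Uuid); // hypothetical helper; the sample uses AnchorJsonFileManager<T>
    }
}

public async Task LoadIconAnchors(IEnumerable<Guid> uuids)
{
    var unbound = new List<OVRSpatialAnchor.UnboundAnchor>();
    await OVRSpatialAnchor.LoadUnboundAnchorsAsync(uuids, unbound);
    foreach (var u in unbound)
    {
        // Localize each anchor, then bind it to a new OVRSpatialAnchor instance.
        await u.LocalizeAsync();
    }
}
```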

Networked application management

The sample demonstrates a networked application launcher where only one app runs at a time across all connected players. NetworkApplicationManager is a network singleton that spawns and despawns NetworkApplicationContainer instances via Runner.Spawn(). Each container tracks its spawned objects and handles cleanup on close. This architecture ensures consistent app state across all clients and prevents conflicts when multiple players try to launch apps simultaneously.
See Assets/Discover/Scripts/NetworkApplicationManager.cs and the Photon Fusion documentation.
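The single-running-app rule reduces to a small amount of state on the network singleton. A minimal sketch, assuming Fusion's Spawn/Despawn API; names other than Runner.Spawn() are illustrative rather than the sample's exact members:

```csharp
// Sketch of the "one app at a time" rule on the network singleton.
public class NetworkApplicationManagerSketch : NetworkBehaviour
{
    private NetworkObject m_currentApp;

    public void LaunchApp(NetworkObject appPrefab, Vector3 position)
    {
        // Close the running app first so exactly one container exists
        // across all connected clients.
        if (m_currentApp != null)
        {
            Runner.Despawn(m_currentApp);
        }
        m_currentApp = Runner.Spawn(appPrefab, position, Quaternion.identity);
    }
}
```

Because the spawn and despawn both go through the network runner, every client observes the same container lifecycle, which is what prevents two players from racing to launch different apps.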

Avatar streaming with Photon Fusion

The sample shows how to stream Meta Avatar data over Photon Fusion using RPC-based byte array transfer. PhotonFusionAvatarNetworking calls OvrAvatarEntity.RecordStreamData_AutoBuffer() to capture avatar state, sends it via RPC, and applies it on remote clients using ApplyStreamData(). This pattern works around Photon Fusion’s lack of native byte array support while maintaining efficient avatar synchronization.
See Assets/Discover/Scripts/Networking/PhotonFusionAvatarNetworking.cs and the Meta Avatars SDK documentation.
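The record-send-apply loop can be sketched as follows. The method names RecordStreamData_AutoBuffer() and ApplyStreamData() come from the Avatars SDK and the RPC attribute shape from Fusion, but the class, field names, and LOD choice here are illustrative; the sample also throttles the send rate rather than transmitting every network tick.

```csharp
// Illustrative avatar streaming sketch (see PhotonFusionAvatarNetworking.cs).
public class AvatarStreamingSketch : NetworkBehaviour
{
    [SerializeField] private OvrAvatarEntity m_avatar;

    public override void FixedUpdateNetwork()
    {
        if (HasStateAuthority)
        {
            // Capture the local avatar's pose as a byte buffer (LOD is illustrative)...
            var data = m_avatar.RecordStreamData_AutoBuffer(OvrAvatarEntity.StreamLOD.Medium);
            // ...and broadcast it; Fusion RPCs can carry byte arrays even though
            // [Networked] properties cannot.
            RPC_ReceiveStreamData(data);
        }
    }

    [Rpc(RpcSources.StateAuthority, RpcTargets.Proxies)]
    private void RPC_ReceiveStreamData(byte[] data)
    {
        // Remote clients replay the captured pose on their copy of the avatar.
        m_avatar.ApplyStreamData(data);
    }
}
```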

Extend the sample

  • Modify the DroneRage game to spawn enemies based on specific room labels (e.g., only spawn from walls labeled WALL_FACE) to demonstrate Scene API label-based queries
  • Add a third mini-application that uses the NetworkGrabbableObject pattern to create a collaborative object manipulation experience
  • Explore the custom packages to see how [AutoSet] and Singleton<T> reduce boilerplate in your own projects
For more advanced networking patterns, see the Photon Fusion samples. For Scene API room manipulation, see the Scene API documentation.