Spatial Video sample

Updated: May 8, 2026

Overview

This sample demonstrates how to build an immersive stereo video player with directional spatial audio using the Meta Spatial SDK. It covers VR and mixed reality mode switching, multiple panel registration approaches, and custom mesh rendering with ECS-based audio spatialization.

Learning objectives

Complete this guide to learn how to:
  • Create a stereo video player with custom mesh rendering and shaders
  • Implement directional spatial audio using the Entity Component System
  • Coordinate VR and mixed reality mode transitions across multiple subsystems
  • Register panels using Compose activities, XML layouts, and custom PanelSceneObject instances
  • Inject custom audio processors into ExoPlayer for real-time channel mixing

Requirements

Device: Meta Quest device running Horizon OS
Development environment: Android Studio with Meta Spatial SDK
For complete setup instructions, see Getting started with Spatial SDK.

Get started

Clone or download the Meta Spatial SDK Samples repository from GitHub. Open the SpatialVideoSample project in Android Studio. Build the APK and deploy to your Meta Quest device using ADB or Meta Quest Developer Hub.
For detailed build and deployment steps, refer to the sample’s README.

Explore the sample

The sample project structure demonstrates different panel types, spatial audio implementation, and MR/VR mode coordination.
  • SpatialVideoSampleActivity.kt: Main VR activity managing video playback, panel creation, and mode switching. Key concepts: AppSystemActivity, ExoPlayer setup, custom mesh/shader video panel, MR/VR state coordination.
  • SpatialAudioSystem.kt: Frame-by-frame spatial audio processing. Key concepts: custom SystemBase, ECS queries for head tracking, stereo panning via channel mixing, distance-based attenuation.
  • MoviePanel.kt: Video library browser with thumbnails. Key concepts: Jetpack Compose ActivityPanelRegistration, stereo thumbnail extraction via a left-half crop (sketched just after this list), cross-activity communication.
  • MRPanel.kt: Passthrough mode toggle interface. Key concepts: Jetpack Compose IntentPanelRegistration, MR mode switching via activity callback.
  • res/layout/controls.xml: Playback controls with buttons and a seek bar. Key concepts: XML-based LayoutXMLPanelRegistration, standard Android views in spatial UI.
  • spatial_video_sample_components.xml: Marker component for spatialized panels. Key concepts: custom component definition, ECS tagging pattern.
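
The left-half crop mentioned for MoviePanel.kt is worth seeing in isolation. The following is an illustrative standalone sketch, not the sample's exact code: it uses Android's MediaMetadataRetriever and Bitmap APIs to pull one frame from a side-by-side stereo file and keep only the left eye's image, so the 2D library grid shows an undistorted thumbnail.

```kotlin
import android.graphics.Bitmap
import android.media.MediaMetadataRetriever

// Illustrative sketch of left-half thumbnail extraction for
// side-by-side stereo video; the sample's own helper may differ.
fun extractLeftEyeThumbnail(videoPath: String): Bitmap? {
    val retriever = MediaMetadataRetriever()
    return try {
        retriever.setDataSource(videoPath)
        val frame = retriever.getFrameAtTime(0L) ?: return null
        // In a LeftRight stereo frame, the left half is the left
        // eye's complete, undistorted view.
        Bitmap.createBitmap(frame, 0, 0, frame.width / 2, frame.height)
    } finally {
        retriever.release()
    }
}
```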

Runtime behavior

When you run this sample, you see a large video panel displaying stereo content in a 3D environment with a skydome and furnished room. A video library panel floats nearby, showing thumbnails of four bundled videos in a grid layout. Click a thumbnail to switch videos on the main panel. Playback controls appear below the video panel with Back, Play/Pause, and Forward buttons plus a seek bar. Turn your head left or right and the audio pans to match the panel’s direction relative to your head position. Press the MR toggle to switch into mixed reality mode: the environment and skydome disappear, passthrough activates, and you can grab the video panel to reposition it in your physical space.

Key concepts

Stereo video display with custom rendering

The sample bypasses standard panel registration and manually constructs a PanelSceneObject with custom mesh geometry. Notice how MediaPanelRenderOptions specifies StereoMode.LeftRight with a 3840x1080 resolution for side-by-side eye images. ExoPlayer renders directly to the panel surface via panelSceneObject.getSurface(). See createVideoPanel() in SpatialVideoSampleActivity.kt for the mesh construction and shader assignment.
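
In rough outline, the wiring looks like the sketch below. The MediaPanelRenderOptions parameter names and the PanelSceneObject constructor arguments are assumptions for illustration, and the custom mesh and shader setup is omitted; createVideoPanel() has the real construction.

```kotlin
// Illustrative only: parameter names on MediaPanelRenderOptions and the
// PanelSceneObject constructor are assumptions, not exact SDK signatures.
val panelSceneObject = PanelSceneObject(
    scene,
    videoPanelEntity,
    MediaPanelRenderOptions(
        width = 3840,  // side-by-side stereo: 1920 pixels per eye
        height = 1080,
        stereoMode = StereoMode.LeftRight,
    ),
)

// ExoPlayer draws decoded frames directly onto the panel surface, so no
// Android View hierarchy sits between the decoder and the panel.
exoPlayer.setVideoSurface(panelSceneObject.getSurface())
```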

Spatial audio through the Entity Component System

The sample implements directional audio using a custom SpatialAudioSystem that runs every frame. The system queries for the head entity via AvatarAttachment, calculates the direction to each SpatializedAudioPanel entity, and transforms it into head-local space to compute a panning angle. Audio levels follow a cosine-squared panning law that smoothly distributes sound between left and right channels based on angle. These gain values feed into a ChannelMixingAudioProcessor injected through a CustomRenderersFactory. See SpatialAudioSystem.kt for the computation logic.
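
The panning math is easiest to follow outside the ECS plumbing. Below is a minimal standalone sketch of the per-frame gain computation under one assumption: the direction to the panel has already been transformed into head-local space, with x pointing right and z pointing forward. The clamped linear angle-to-pan mapping is also an assumption; the exact mapping in SpatialAudioSystem.kt may differ.

```kotlin
import kotlin.math.PI
import kotlin.math.atan2
import kotlin.math.cos

// Standalone sketch of the cosine-squared panning law. Inputs are the
// head-local direction to the panel (x: right, z: forward).
fun stereoGains(localX: Float, localZ: Float): Pair<Float, Float> {
    // Signed azimuth of the panel: 0 = straight ahead, +pi/2 = hard right.
    val azimuth = atan2(localX, localZ)

    // Map azimuth onto a pan position in [0, 1], where 0 is full left
    // and 1 is full right (assumed mapping for this sketch).
    val pan = (azimuth / PI.toFloat() + 0.5f).coerceIn(0f, 1f)

    // Cosine-squared law: the two gains always sum to 1, so the overall
    // level stays steady as sound sweeps between channels.
    val theta = pan * (PI.toFloat() / 2f)
    val leftGain = cos(theta) * cos(theta)
    val rightGain = 1f - leftGain // == sin^2(theta)
    return leftGain to rightGain
}
```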

Coordinated VR and mixed reality mode switching

Toggling between VR and MR modes requires synchronized changes across multiple scene elements. The sample coordinates passthrough activation, environment visibility, grabbable state, locomotion, and controller visibility in a single setMrMode() method. It also adjusts video panel position, scale, and control placement for each mode. See setMrMode() in SpatialVideoSampleActivity.kt for the complete transition logic.
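
An outline of that coordination might look like the sketch below. scene.enablePassthrough() and the Visible and Grabbable components are standard Spatial SDK names, but the entity references here are assumptions, and the real method also repositions and rescales the panels.

```kotlin
// Illustrative outline; entity references are assumptions, and the
// sample's setMrMode() additionally handles locomotion, controller
// visibility, and panel position/scale for each mode.
private fun setMrMode(enabled: Boolean) {
    // Passthrough and the virtual environment are mutually exclusive:
    // the real room should only show once the skydome is hidden.
    scene.enablePassthrough(enabled)
    environmentEntity.setComponent(Visible(!enabled))
    skydomeEntity.setComponent(Visible(!enabled))

    // In MR the user can grab the video panel and place it in the room;
    // in VR it stays anchored in the virtual environment.
    videoPanelEntity.setComponent(Grabbable(enabled = enabled))
}
```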

Panel registration patterns

The sample demonstrates multiple approaches to panel creation. The video library uses ActivityPanelRegistration with a Compose activity and ViewModel. The MR toggle uses IntentPanelRegistration with a simpler Compose activity. The playback controls use LayoutXMLPanelRegistration with a standard Android XML layout. Each approach suits a different level of UI complexity. See registerPanels() in SpatialVideoSampleActivity.kt for all registration configurations.
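
Side by side, the three registration styles have roughly the shape below. The class names come from the sample, but the constructor arguments are placeholders rather than exact SDK signatures.

```kotlin
// Placeholder arguments throughout; only the class names and overall
// shape are taken from the sample.
override fun registerPanels(): List<PanelRegistration> =
    listOf(
        // Full Compose activity with a ViewModel: the richest option,
        // used for the video library.
        ActivityPanelRegistration(R.id.movie_panel, MoviePanel::class.java),
        // Intent-launched Compose activity: lighter weight, used for
        // the MR toggle.
        IntentPanelRegistration(R.id.mr_panel, Intent(this, MRPanel::class.java)),
        // Plain XML layout inflated straight into a panel: no extra
        // activity needed, used for the playback controls.
        LayoutXMLPanelRegistration(R.id.controls_panel, R.layout.controls),
    )
```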

Cross-activity communication

Panels run in separate activity instances and communicate with the main VR activity through SpatialActivityManager. Notice how MoviePanel and MRPanel execute callbacks on the main activity using SpatialActivityManager.executeOnVrActivity<SpatialVideoSampleActivity> { activity -> ... }. This pattern safely bridges the process boundary between panel activities and the VR session, allowing video selection and mode toggles to trigger state changes in the main scene. See MoviePanel.kt and MRPanel.kt for usage examples.
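
A minimal sketch of the pattern from the panel side, assuming a hypothetical playVideo() callback on the main activity:

```kotlin
import android.net.Uri

// Runs in the panel's activity (e.g. the movie library). The lambda
// executes against the live VR activity instance; playVideo() is a
// hypothetical stand-in for the sample's actual callback.
fun onThumbnailSelected(videoUri: Uri) {
    SpatialActivityManager.executeOnVrActivity<SpatialVideoSampleActivity> { activity ->
        activity.playVideo(videoUri)
    }
}
```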

Extend the sample

  • Add audio effects: Modify the spatial audio system to include reverb or occlusion based on the panel’s distance or orientation.
  • Try different stereo formats: Change StereoMode to top-bottom layout or adjust the resolution for different video aspect ratios.
  • Add gesture controls: Use Interaction SDK hand tracking to add volume or playback speed gestures, building on the existing IsdkGrabbable and IsdkPanelGrabHandle components.
For related patterns, explore the MediaPlayerSample for additional video playback and panel techniques, and the MixedRealitySample for more passthrough and scene anchoring examples.