Media Player sample overview

Updated: May 7, 2026

Overview

This sample demonstrates immersive video playback in a spatial app using both flat YouTube streaming and stereoscopic 360-degree video. It showcases four panel registration types, cross-activity communication patterns, passthrough integration with environment depth, and custom shader effects for scene transitions.

What you will learn

  • Register multiple panel types in one application: XML layout panels, activity-based panels, inline Compose panels, and video surface panels
  • Implement dual video playback pathways using WebView for flat content and ExoPlayer for 360 stereoscopic video
  • Communicate between 2D panel activities and the main VR activity using SpatialActivityManager.executeOnVrActivity
  • Integrate passthrough with environment depth and manage VR/MR state transitions
  • Create custom shaders for environment projection effects and animated transitions
  • Load GLXF scene compositions and extract named nodes for programmatic manipulation

Requirements

  • Meta Quest 3 or Quest 3S (for passthrough and environment depth)
  • Development environment configured for Spatial SDK (see Getting started)
  • Internet connection for streaming YouTube content
See the sample README for build prerequisites and SDK version requirements.

Get started

Clone the Meta Spatial SDK Samples repository and open the MediaPlayerSample project in Android Studio. Connect your Meta Quest device via USB, enable developer mode, and run the sample. When the sample launches, you see a custom 3D room environment with a video catalog panel on the left and a large video screen on the wall.
For detailed build and deployment instructions, see the sample README.

Explore the sample

The sample consists of one main VR activity, one separate panel activity, and a GLXF scene composition editable in Meta Spatial Editor:
  • MediaPlayerSampleActivity.kt: Main VR activity managing scene, panels, video playback, and state transitions. Key concepts: AppSystemActivity lifecycle, panel registration, GLXF loading, custom shader override, declarative state management.
  • ListPanel.kt: Separate ComponentActivity for the video catalog with Jetpack Navigation and ViewModel. Key concepts: ActivityPanelRegistration, Compose navigation (grid/detail views), cross-activity communication via SpatialActivityManager.
  • MRPanel.kt: Inline Compose panel for the passthrough toggle. Key concepts: ComposeViewPanelRegistration, passthrough and environment depth APIs.
  • registerPanels() method: Four different panel registration types in one application. Key concepts: LayoutXMLPanelRegistration, ActivityPanelRegistration, ComposeViewPanelRegistration, VideoSurfacePanelRegistration.
  • app/scenes/Composition/: GLXF scene composition with named nodes. Key concepts: Meta Spatial Editor workflow, named node extraction via composition.getNodeByName().
  • app/src/shaders/: Custom fragment and vertex shaders for transitions and 360 rendering. Key concepts: material shader override, environment projection effect, skybox alpha wipe.

Runtime behavior

When you run this sample, you see a custom 3D room environment called MediaRoom with a large flat video panel mounted on the wall. To the left, a catalog panel displays 10 video thumbnails in a grid (9 YouTube videos and 1 local 360 video). Below the catalog, a small panel shows a passthrough toggle switch. When you select a YouTube video from the catalog, it loads in the flat panel via WebView, and the video content projects onto the room walls as emissive lighting. When you select the “Soloist 3D” video, the room animates out with a transition shader effect and a 360-degree stereoscopic sphere appears around you. ExoPlayer plays the immersive video in top/bottom stereoscopic format. Toggle the passthrough switch to hide the room and skybox, enable camera passthrough with environment depth, and make the video panel grabbable so you can reposition it in your physical space.

Key concepts

Four panel registration types

All four Spatial SDK panel registration types appear in a single registerPanels() method. The video panel uses LayoutXMLPanelRegistration with a WebView defined in XML. The catalog panel uses ActivityPanelRegistration, launching a separate ComponentActivity with Jetpack Compose UI. The passthrough toggle uses ComposeViewPanelRegistration, rendering inline Compose content without a separate activity. The 360 video sphere uses VideoSurfacePanelRegistration with Equirect360ShapeOptions and StereoMode.UpDown for top/bottom stereoscopic rendering.
See the registerPanels() method in MediaPlayerSampleActivity.kt.
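The shape of that method can be sketched with simplified stand-in types. These classes are illustrative only: the real Spatial SDK registration classes take layout resource IDs, intents, composables, and surface configuration, not just a panel ID.

```kotlin
// Stand-ins for the four Spatial SDK registration types (real signatures differ).
sealed class PanelRegistration(val panelId: String)
class LayoutXMLPanelRegistration(id: String) : PanelRegistration(id)    // XML layout + WebView
class ActivityPanelRegistration(id: String) : PanelRegistration(id)     // separate ComponentActivity
class ComposeViewPanelRegistration(id: String) : PanelRegistration(id)  // inline Compose content
class VideoSurfacePanelRegistration(id: String) : PanelRegistration(id) // ExoPlayer video surface

// Mirrors the sample's pattern: all four registration types declared in one method.
// Panel IDs here are hypothetical names, not the sample's actual identifiers.
fun registerPanels(): List<PanelRegistration> = listOf(
    LayoutXMLPanelRegistration("video_panel"),
    ActivityPanelRegistration("catalog_panel"),
    ComposeViewPanelRegistration("passthrough_toggle"),
    VideoSurfacePanelRegistration("sphere_360"),
)
```

Collecting all registrations in one method keeps the panel inventory of the app visible at a glance, which is the point the sample makes.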

Cross-activity communication via SpatialActivityManager

The catalog panel (ListPanel) is a separate ComponentActivity, not embedded in the main VR activity. When a user selects a video in the detail view, the panel communicates with the VR activity using:
SpatialActivityManager.executeOnVrActivity<MediaPlayerSampleActivity> { activity ->
    activity.playVideo(selectedMovie.url)
}
This pattern provides type-safe access to the VR activity instance from any panel activity, so panels can call methods directly without casting. See the MovieDetailScreen composable in ListPanel.kt.
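The mechanics of that type-safe dispatch can be illustrated with a minimal, self-contained analogue: a registry holding the live activity instance, plus a reified-generic lookup so callers get a typed reference without casting. The names below (ActivityRegistry, executeOn) are inventions for illustration, not Spatial SDK API.

```kotlin
// Illustrative analogue of SpatialActivityManager.executeOnVrActivity.
object ActivityRegistry {
    val activities = mutableMapOf<Class<*>, Any>()

    fun register(activity: Any) {
        activities[activity::class.java] = activity
    }

    // Reified type parameter: the caller names the activity type once and
    // receives it already typed inside the lambda, with no manual cast.
    inline fun <reified T : Any> executeOn(block: (T) -> Unit) {
        val activity = activities[T::class.java] as? T
            ?: error("No ${T::class.simpleName} registered")
        block(activity)
    }
}

// Stand-in for the sample's VR activity.
class MediaPlayerSampleActivity {
    var lastUrl: String? = null
    fun playVideo(url: String) { lastUrl = url }
}
```

A panel-side call then looks just like the snippet above: `ActivityRegistry.executeOn<MediaPlayerSampleActivity> { it.playVideo(url) }`.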

Dual video playback pathways

Video routes to different players based on content type. YouTube URLs load in a WebView panel as embedded iframes, suitable for flat web video. Local video URIs load in ExoPlayer and render to a VideoSurfacePanelRegistration configured with Equirect360ShapeOptions for 360-degree playback. The playVideo() method inspects the URL string and sets movieState accordingly.
See the playVideo() method and the movieState setter in MediaPlayerSampleActivity.kt.
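The routing decision can be sketched as a pure function. The exact string checks in the sample may differ; this only shows the pattern of inspecting the URL to pick a playback pathway.

```kotlin
// Hedged sketch of the routing inside playVideo(): YouTube URLs go to the
// WebView panel, anything else (e.g. a bundled 360 file) goes to ExoPlayer.
enum class PlayerTarget { WEBVIEW_FLAT, EXOPLAYER_360 }

fun routeVideo(url: String): PlayerTarget =
    if ("youtube.com" in url || "youtu.be" in url) PlayerTarget.WEBVIEW_FLAT
    else PlayerTarget.EXOPLAYER_360
```

Keeping the decision in one small function means the rest of the activity only reacts to the resulting state, which matches the declarative movieState approach the sample uses.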

Passthrough with environment depth

The passthrough toggle calls scene.enablePassthrough(state) and scene.enableEnvironmentDepth(state) when the switch changes. Environment depth enables spatial occlusion, so virtual content appears behind real-world objects. The mrState setter hides the room and skybox entities, disables locomotion, and makes the video panel grabbable:
activity.scene.enablePassthrough(state)
activity.scene.enableEnvironmentDepth(state)
activity.mrState = state
See the MRApp() composable in MRPanel.kt.
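The declarative setter pattern behind mrState can be sketched in isolation: assigning one property applies all the related side effects. The field names below are assumptions based on the description above, not the sample's actual members.

```kotlin
// Illustrative mrState setter: one assignment hides the room and skybox,
// disables locomotion, and makes the video panel grabbable.
class MrStateHolder {
    var roomVisible = true
        private set
    var skyboxVisible = true
        private set
    var locomotionEnabled = true
        private set
    var panelGrabbable = false
        private set

    var mrState: Boolean = false
        set(value) {
            field = value
            roomVisible = !value       // hide virtual room in MR
            skyboxVisible = !value     // hide skybox so passthrough shows
            locomotionEnabled = !value // no artificial locomotion in MR
            panelGrabbable = value     // let the user reposition the panel
        }
}
```

Funneling every transition through one setter guarantees the VR and MR configurations can never drift out of sync, which is the benefit of the declarative state management the sample advertises.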

Video texture projection onto environment

The video panel’s texture is applied as an emissive texture to the room and plant materials, creating a cinema projection effect. The updateTextures() method obtains the panel’s SceneTexture and sets it on the environment materials along with an occlusion mask that controls where the projection appears.
See the updateTextures() method in MediaPlayerSampleActivity.kt and app/src/shaders/transition.frag.
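A simplified analogue shows the shape of that update: the panel's current texture becomes the emissive slot on each environment material, with an occlusion mask limiting where the projection appears. Texture and Material here are stand-in types, not the SDK's SceneTexture or SceneMaterial.

```kotlin
// Stand-ins for the SDK's texture and material types.
data class Texture(val name: String)

class Material(val id: String) {
    var emissiveTexture: Texture? = null
    var occlusionMask: Texture? = null
}

// Mirrors updateTextures(): push the video frame into each environment material.
fun updateTextures(panelTexture: Texture, mask: Texture, materials: List<Material>) {
    for (material in materials) {
        material.emissiveTexture = panelTexture // video frame lights the room
        material.occlusionMask = mask           // restricts where projection shows
    }
}
```

Because the emissive slot is refreshed from the live panel texture, the room lighting tracks the video content frame by frame without any per-pixel work on the CPU.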

GLXF scene loading with named node extraction

The GLXF composition loads from the APK and extracts specific entities by scene graph name:
environment = composition.getNodeByName("Environment").entity
videoPanel = composition.getNodeByName("VideoPanel").entity
This allows the scene to be designed in Meta Spatial Editor with named placeholders that the code references programmatically. See the loadGLXF() method in MediaPlayerSampleActivity.kt.
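The lookup itself can be modeled with a minimal analogue: a composition mapping scene-graph names to entities, so code can pull out the placeholders authored in the editor. The classes below are illustrative stand-ins for the SDK's GLXF types.

```kotlin
// Stand-ins for the SDK's entity and GLXF node types.
class Entity(val label: String)
class Node(val name: String, val entity: Entity)

// Minimal analogue of a loaded GLXF composition with name-based lookup.
class Composition(nodes: List<Node>) {
    private val byName = nodes.associateBy { it.name }

    fun getNodeByName(name: String): Node =
        byName[name] ?: error("No node named '$name' in composition")
}
```

With this, the extraction reads exactly like the snippet above: `composition.getNodeByName("Environment").entity`. Failing fast on a missing name catches a renamed placeholder in the editor at load time rather than as a null crash later.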

Extend the sample

  • Add a playlist feature: Queue multiple videos and advance automatically when one finishes, building on the MovieViewModel state management in ListPanel.kt.
  • Integrate spatial audio: Make the flat video panel’s sound emanate from its 3D position — see the SpatialVideoSample for spatial audio patterns.
  • Create a theater mode: Dim the room lighting when video plays, using the transition shader’s customParams to control the reveal mask.