Premium Media sample overview

Updated: May 7, 2026

Overview

The PremiumMediaSample showcases advanced media streaming and spatial integration techniques for Meta Quest headsets. It demonstrates DRM-protected video playback, custom shaders that cast screen light onto room surfaces, multi-process Jetpack Compose panels, and MRUK-based wall snapping.

What you will learn

  • Stream Widevine DRM-protected content using ExoPlayer with direct-to-surface rendering
  • Create custom GLSL shaders that sample panel textures and project ambient lighting onto MRUK surfaces
  • Snap entities to walls, floors, and ceilings detected by MRUK
  • Display 180-degree equirectangular video in immersive hemispherical panels
  • Run Jetpack Compose panels in separate processes to reduce UI thread contention

Requirements

  • Meta Quest 2, Quest 3, or Quest 3S running Horizon OS
  • Android development environment with Android Studio
For detailed SDK version requirements and build prerequisites, see the sample README.

Get started

Clone or download the Meta Spatial SDK Samples repository from GitHub. Open the PremiumMediaSample project in Android Studio, connect your Quest device via USB or Wi-Fi, and run the app. When launched, the app requests Scene permission to load your room layout, then displays a home screen where you can select media to play.
For detailed build instructions, see the sample README.

Explore the sample

The sample uses a runtime-constructed scene with no GLXF files. All entities, panels, and lighting meshes are created programmatically:
| File / Scene | What it demonstrates | Key concepts |
| --- | --- | --- |
| — | Main entry point; registers all systems, components, and features | Activity lifecycle, feature registration, IPC message handling |
| — | MRUK initialization and scene loading | MRUKFeature setup, scene permission handling, loadSceneFromDevice() |
| — | Orchestrates ExoPlayer, panels, and cinema states | Central state management, entity lifecycle coordination |
| — | Creates video panels with spatial audio integration | Readable vs. direct-to-surface rendering modes, VideoSurfacePanelRegistration |
| ExoPlayerExt.kt | ExoPlayer factory with DRM support | OculusMediaCodecVideoRenderer, Widevine DRM session manager, DASH sources |
| AnchorSnappingSystem.kt | MRUK-based wall snapping | Gaze ray-casting, grab repositioning, plane intersection |
| HeroLightingSystem.kt | Extracts the panel texture and distributes it to shaders | SceneTexture detection, shader uniform distribution |
| — | Creates lighting meshes aligned to MRUK surfaces | MRUK plane quad generation, custom material assignment |
| heroLighting.glsl | Core lighting shader math | Rect-space transforms, light falloff, floor reflections, SDF shadows |
| HomePanelActivity | Content selection UI (separate process) | Multi-process Compose, IPC messaging, ActivityPanelRegistration |
| — | Media controls UI (separate process) | Cross-process state updates, custom Compose components |

Runtime behavior

When you run this sample, you see a home screen panel displaying a carousel of media thumbnails. Select an item to start playback. For rectilinear content (standard flat video), a control panel appears below the video with play/pause, seek controls, passthrough and lighting sliders, and Cinema/TV mode buttons. Cinema mode creates a large virtual theater screen with dimmed passthrough, while TV mode displays a smaller screen with visible room lighting effects. The video screen casts ambient light onto nearby walls, floors, and ceilings, creating soft reflections on floor surfaces. For 180-degree content, the video wraps around you in a hemispherical view with detached controls.

Key concepts

DRM-protected streaming

The sample uses a custom OculusMediaCodecVideoRenderer to enable Widevine DRM playback. When a DRM license URL is provided, or when using non-rectilinear video shapes like 180° equirect, the sample forces direct-to-surface rendering:
val panelRenderingStyle =
    if (drmEnabled || mediaSource.videoShape != MediaSource.VideoShape.Rectilinear)
        PanelRenderingStyle.DIRECT_TO_SURFACE
    else PanelRenderingStyle.READABLE
See ExoPlayerExt.kt for the complete DRM session manager configuration. For API reference, see DRM content.

Hero lighting shaders

The hero lighting system samples the video panel texture and projects it onto surrounding room surfaces as ambient lighting. The sample distributes screen data through shader uniforms rather than direct texture binding:
val emissiveFactor = Vector4(screenPos.x, screenPos.y, screenPos.z, width)
val albedoFactor = Vector4(rotEuler.x, rotEuler.y, rotEuler.z, height)
The fragment shader transforms world-space positions into “rect space” relative to the panel, samples the texture at high mip levels for soft blurring, and applies distance-based falloff. See HeroLightingSystem.kt and heroLighting.glsl for implementation details. For custom shader documentation, see Custom shaders.
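The rect-space idea can be sketched in plain Kotlin. The types and the linear falloff below are illustrative assumptions, not the sample's actual shader math (which also applies mip-based blurring and SDF shadows):

```kotlin
import kotlin.math.abs
import kotlin.math.cos
import kotlin.math.max
import kotlin.math.sin

// Hypothetical sketch: express a world-space point relative to the panel's
// center and orientation ("rect space"), then attenuate by distance.
data class Vec3(val x: Float, val y: Float, val z: Float)

// Panel described by its center and yaw (radians) around the Y axis.
data class PanelRect(val center: Vec3, val yaw: Float)

fun toRectSpace(world: Vec3, panel: PanelRect): Vec3 {
    // Translate into the panel's frame...
    val dx = world.x - panel.center.x
    val dy = world.y - panel.center.y
    val dz = world.z - panel.center.z
    // ...then undo the panel's yaw so the rect lies in the local XY plane.
    val c = cos(-panel.yaw)
    val s = sin(-panel.yaw)
    return Vec3(dx * c - dz * s, dy, dx * s + dz * c)
}

// Simple linear falloff: full brightness at the panel plane (local z = 0),
// fading to zero at maxDistance.
fun lightFalloff(rectSpace: Vec3, maxDistance: Float): Float =
    max(0f, 1f - abs(rectSpace.z) / maxDistance)
```

In this form, a surface point directly on the panel plane receives full intensity, and anything beyond the falloff distance receives none; the real shader performs the equivalent transform per fragment.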

MRUK wall snapping

The sample implements two snapping modes: initial placement via gaze ray-casting and runtime repositioning during grab. The initial snap uses a custom ray-plane intersection utility:
val headToAnchorableDirection = (anchorablePose.t - headPose.t).normalize()
val hitInfo = doesRayIntersectPlane(headPose.t, headToAnchorableDirection, plane)
During grab, the system continuously updates rotation to face outward from the wall using slerp for smooth animation. See AnchorSnappingSystem.kt for the complete snapping logic. For MRUK API reference, see MRUK.
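A minimal ray-plane intersection in the spirit of the sample's doesRayIntersectPlane utility might look like the following; the Vec3 and Plane types here are illustrative stand-ins, not the SDK's:

```kotlin
import kotlin.math.abs

data class Vec3(val x: Float, val y: Float, val z: Float) {
    operator fun plus(o: Vec3) = Vec3(x + o.x, y + o.y, z + o.z)
    operator fun minus(o: Vec3) = Vec3(x - o.x, y - o.y, z - o.z)
    operator fun times(s: Float) = Vec3(x * s, y * s, z * s)
    fun dot(o: Vec3) = x * o.x + y * o.y + z * o.z
}

// A plane defined by any point on it and its (unit) normal.
data class Plane(val point: Vec3, val normal: Vec3)

// Returns the hit point, or null if the ray is parallel to the plane
// or the plane lies behind the ray origin.
fun rayPlaneIntersection(origin: Vec3, dir: Vec3, plane: Plane): Vec3? {
    val denom = dir.dot(plane.normal)
    if (abs(denom) < 1e-6f) return null
    val t = (plane.point - origin).dot(plane.normal) / denom
    return if (t < 0f) null else origin + dir * t
}
```

Casting the head-to-anchorable direction from the head position against each MRUK wall plane and keeping the nearest valid hit is the essence of the initial snap.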

Multi-process Compose panels

The sample runs each Compose panel activity in a separate Android process for better performance:
<activity android:name=".panels.homePanel.HomePanelActivity"
    android:process=":home_panel" />
The panels communicate with the immersive activity through a bound IPC service using the Android Messenger pattern. This approach reduces UI thread contention but adds IPC overhead. See IPCService.kt for the message routing implementation. For panel documentation, see Direct-to-surface rendering.
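The routing idea can be sketched without Android dependencies. The message codes and MessageRouter type below are hypothetical stand-ins for the sample's Messenger-based IPCService, which dispatches on the `what` code of each incoming android.os.Message:

```kotlin
// Illustrative message codes; the sample's actual codes may differ.
object MsgCodes {
    const val PLAY = 1
    const val PAUSE = 2
    const val SEEK = 3
}

// Platform-free sketch of `what`-code routing in the style of Handler.handleMessage.
class MessageRouter {
    private val handlers = mutableMapOf<Int, (Any?) -> Unit>()

    fun register(what: Int, handler: (Any?) -> Unit) {
        handlers[what] = handler
    }

    // Returns true if a registered handler consumed the message.
    fun dispatch(what: Int, payload: Any? = null): Boolean {
        val handler = handlers[what] ?: return false
        handler(payload)
        return true
    }
}
```

In the real service, the panel process obtains a Messenger from the bound service and sends Messages across the process boundary; the routing table above is the part that stays the same in spirit.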

Extend the sample

  • Replace media catalog: The media catalog in HomePanelViewModel.kt defines the DASH URLs — replace these with your own streaming sources or add new entries with different stereo modes.
  • Adjust lighting: The hero lighting shader in heroLighting.glsl exposes parameters for light falloff distance and floor reflection intensity — adjust these to match different room sizes and content types.
  • Add cinema states: The CinemaStateHandler.kt state machine supports adding new cinema states for different viewing modes, such as floor seating or dome projection.
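As a starting point for a new cinema state, a minimal transition table might look like this; the states and rules here are assumptions for illustration, not the actual CinemaStateHandler.kt logic:

```kotlin
// Illustrative states matching the viewing modes described above; a new mode
// (e.g. DomeProjection) would get its own enum entry and transition rules.
enum class CinemaState { Home, Tv, Cinema, Equirect180 }

fun allowedTransitions(from: CinemaState): Set<CinemaState> = when (from) {
    CinemaState.Home -> setOf(CinemaState.Tv, CinemaState.Cinema, CinemaState.Equirect180)
    CinemaState.Tv -> setOf(CinemaState.Cinema, CinemaState.Home)
    CinemaState.Cinema -> setOf(CinemaState.Tv, CinemaState.Home)
    CinemaState.Equirect180 -> setOf(CinemaState.Home)
}

// Ignores illegal transitions and stays in the current state.
fun transition(current: CinemaState, target: CinemaState): CinemaState =
    if (target in allowedTransitions(current)) target else current
```

Keeping transitions in one exhaustive `when` means the compiler flags every place a newly added state must be handled.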