Develop

Virtual Camera Publisher sample overview

Updated: May 7, 2026

Overview

This sample demonstrates how to publish custom virtual cameras from a Unity app to Horizon OS, enabling users to record, cast, and livestream from your app’s cameras through the system camera interface. The sample shows how to register multiple cameras, manage capture state, provide thumbnails, and optimize rendering for different resolutions and frame rates. This feature is experimental (package version 0.0.1) and is provided on an as-is basis.

Learning objectives

Complete this guide to learn how to:
  • register and configure virtual cameras with the Horizon OS camera system
  • implement the observer pattern to respond to capture state changes
  • use a single physical Unity Camera to support multiple virtual cameras at different resolutions and frame rates
  • manage camera rendering with frame-rate throttling for performance optimization
  • configure Android manifest permissions and Horizon OS SDK version requirements

Requirements

  • Meta Quest device running Horizon OS v81 or later
  • Unity 6
For complete build and configuration requirements, see the sample README.

Get started

Clone or download the sample from the repository. Open the project in Unity 6, navigate to VirtualCameraPublisherSample/Assets/Scenes/SampleScene.unity, and build the APK for your Meta Quest device. This sample does not support Link or Editor testing. You must deploy a full APK to test virtual camera functionality.

Explore the sample

The sample's key scripts demonstrate the following concepts:
  • Public static API for camera registration and lifecycle management: centralized manager pattern, frame-by-frame update model
  • Observer interface for capture state callbacks: decoupled event notification, capture lifecycle
  • Internal per-camera state management: double-buffered RenderTextures, frame-rate throttling
  • Sample MonoBehaviour showing registration, updates, and cleanup: multiple virtual cameras from one physical Camera, observer implementation

Runtime behavior

When you run this scene, you see a static environment with floating objects. The sample registers three virtual cameras: two at different resolutions and frame rates sharing a far-view Camera, and one close-up Camera at 1080p. To access these cameras, press the Meta button, open the camera app, navigate to Camera Settings → Camera View, and select one of the registered cameras.

Key concepts

Registering multiple cameras from a single Unity Camera

The sample shows how one physical Unity Camera can back multiple virtual cameras at different resolutions and frame rates:
VirtualCameraManager.RegisterVirtualCamera(
    "1440p Far @ 45fps", cameraId, 2560, 1440, 45, droneCamera, far);
VirtualCameraManager.RegisterVirtualCamera(
    "4k Far @ 60fps", cameraId + 1, 3840, 2160, 60, droneCamera, far);
This pattern allows developers to offer quality/performance trade-offs without duplicating camera GameObjects.

Frame-based update model

Virtual camera rendering happens inside VirtualCameraManager.Update(), which must be called every frame from a MonoBehaviour. The manager handles frame-rate throttling internally, rendering each camera only when its target frame interval has elapsed. This decouples your app’s frame rate from the virtual camera feed rates, allowing 30fps, 45fps, and 60fps cameras to coexist.
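A minimal driver for this update model might look like the following sketch. The `VirtualCameraManager` name comes from the sample; the namespace is an assumption and may differ in the actual package.

```csharp
using UnityEngine;

// Sketch: a MonoBehaviour that drives the manager once per app frame.
// The manager decides internally which virtual cameras actually render
// this frame, based on each camera's target frame interval.
public class VirtualCameraDriver : MonoBehaviour
{
    void Update()
    {
        // Called every app frame. A 60fps virtual camera renders roughly
        // every 1/60s of elapsed time; a 30fps camera roughly every 1/30s,
        // regardless of the app's own frame rate.
        VirtualCameraManager.Update();
    }
}
```

Because throttling lives inside the manager, the driver stays trivial and the app never needs to track per-camera timing itself.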

Observer pattern for capture state

The sample implements IVirtualCameraObserver to receive callbacks when capture starts or stops:
public void OnCaptureStarted(int id) {
    Debug.Log($"Capture started for camera {id}");
}
This pattern allows you to pause expensive app logic, reposition cameras, or adjust quality settings when a camera feed is actively consumed.
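A fuller observer might pair the start callback with a stop callback, as in this sketch. `IVirtualCameraObserver` and `OnCaptureStarted` come from the sample; the `OnCaptureStopped` name and signature are assumptions mirroring the start callback.

```csharp
using UnityEngine;

// Sketch: reacting to capture lifecycle events to toggle expensive work.
public class CaptureStateObserver : MonoBehaviour, IVirtualCameraObserver
{
    public void OnCaptureStarted(int id)
    {
        Debug.Log($"Capture started for camera {id}");
        // e.g. raise render quality or reposition the camera rig here.
    }

    // Assumed counterpart callback; check the package's interface definition.
    public void OnCaptureStopped(int id)
    {
        Debug.Log($"Capture stopped for camera {id}");
        // e.g. pause expensive per-frame work while no feed is consumed.
    }
}
```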

Android manifest configuration

The sample requires three manifest additions: the Horizon OS namespace on the root element, the horizonos.permission.CREATE_VIRTUAL_CAMERA permission, and the uses-horizonos-sdk element declaring minimum and target SDK versions.
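Assembled, the manifest additions might look like the fragment below. The permission name and the `uses-horizonos-sdk` element come from the article; the `horizonos` namespace URI, attribute names, and SDK version values are assumptions — confirm them against the sample's own AndroidManifest.xml.

```xml
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
          xmlns:horizonos="http://schemas.horizonos/sdk">

    <!-- Required to publish virtual cameras to the system camera interface. -->
    <uses-permission android:name="horizonos.permission.CREATE_VIRTUAL_CAMERA" />

    <!-- Declares the minimum and target Horizon OS SDK versions.
         Values shown here assume the v81 requirement from this article. -->
    <horizonos:uses-horizonos-sdk
        horizonos:minSdkVersion="81"
        horizonos:targetSdkVersion="81" />

</manifest>
```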

Extend the sample

  • Add interactive camera controls to let users switch between cameras or adjust field of view during capture.
  • Implement dynamic thumbnail generation by capturing a frame and encoding it with Texture2D.EncodeToJPG().
  • Use the capture state observer to toggle expensive rendering features when a camera feed is active.
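For the thumbnail idea above, a hypothetical helper could render one frame into a temporary RenderTexture, read it back, and encode it with `Texture2D.EncodeToJPG()`. All APIs used here are standard Unity; the class name and how the bytes are handed to the virtual camera system are assumptions.

```csharp
using UnityEngine;

// Hypothetical thumbnail helper: captures a single frame from a Camera
// and returns it as JPG-encoded bytes.
public static class ThumbnailCapture
{
    public static byte[] CaptureJpg(Camera camera, int width, int height, int quality = 75)
    {
        // Render the camera into a temporary target.
        var rt = RenderTexture.GetTemporary(width, height, 24);
        var previousTarget = camera.targetTexture;
        camera.targetTexture = rt;
        camera.Render();
        camera.targetTexture = previousTarget;

        // Read the pixels back to the CPU.
        var previousActive = RenderTexture.active;
        RenderTexture.active = rt;
        var tex = new Texture2D(width, height, TextureFormat.RGB24, false);
        tex.ReadPixels(new Rect(0, 0, width, height), 0, 0);
        tex.Apply();
        RenderTexture.active = previousActive;
        RenderTexture.ReleaseTemporary(rt);

        byte[] jpg = tex.EncodeToJPG(quality);
        Object.Destroy(tex);
        return jpg;
    }
}
```

`ReadPixels` stalls the GPU, so call this sparingly (for example, once when a camera is registered) rather than every frame.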