
Explore Meta OpenXR SDK samples

Updated: May 7, 2026

Overview

The Meta OpenXR SDK provides 19 native C/C++ sample applications demonstrating Meta-specific and experimental OpenXR extensions for Quest headsets. Each sample shows how to enable and use extensions for hand tracking, passthrough, scene understanding, spatial anchors, body/face/eye tracking, dynamic object detection, and advanced rendering. All samples share an application framework that handles OpenXR session management and rendering so you can focus on extension-specific features.

What you will learn

  • How to build OpenXR applications using the OVRFW::XrApp base class for OpenXR lifecycle, render loop, and input management
  • How to enable Meta-specific OpenXR extensions by overriding GetExtensions(), loading function pointers, and handling extension events
  • Three architectural patterns for OpenXR development: framework-based XrApp subclass (12 samples), standalone lifecycle management (five samples), and single-file C (two samples)
  • How to combine multiple extensions in a single application, as demonstrated by samples like XrDynamicObjects, which integrates passthrough, spatial anchors, and dynamic object tracking
  • What preview and experimental extensions Meta provides for features like wide-motion hand tracking, body tracking fidelity, dynamic object detection, and room mesh access

Requirements

  • Hardware: Meta Quest 3 or Quest 3S
  • Development environment: Android Studio with NDK, CMake, and the Android SDK configured
  • Experimental features flag: Enable experimental features on your device with adb shell setprop debug.oculus.experimentalEnabled 1 (resets on reboot)
For complete build system requirements, toolchain versions, and Quest Link setup for Windows development, see the repository README.

Get started

Clone or download the repository from https://github.com/meta-quest/Meta-OpenXR-SDK. You can open the entire sample collection by loading Samples/build.gradle in Android Studio, or work with individual samples by opening a specific sample’s build.gradle file (e.g., Samples/XrSamples/XrInput/Projects/Android/build.gradle). Each sample includes a Gradle wrapper and can be built independently. Connect your Quest device via USB, ensure developer mode is enabled, and build/run directly from Android Studio to deploy to your headset.
For detailed build instructions, dependency versions, and Quest Link configuration for Windows builds, see the repository README.

Explore the samples

The SDK organizes samples thematically to help you navigate by feature area. Each sample demonstrates one or more OpenXR extensions and includes a README with controls and implementation notes.
Input & Interaction

  • XrInput: OpenXR action system with controller bindings. Key concepts: actions, action sets, interaction profiles, suggested bindings.
  • XrControllers: Controller input values and haptic feedback. Key concepts: haptic amplitude envelope, PCM haptics.
  • XrMicrogestures: Thumb tap and swipe microgestures. Key concepts: hand-based D-pad input without controllers.

Hand Tracking

  • XrHandsFB: Hand joint positions with mesh and capsule visualization. Key concepts: XR_EXT_hand_tracking, XR_FB_hand_tracking_mesh, XR_FB_hand_tracking_capsules.
  • XrHandDataSource: Hybrid hand tracking using both camera and controller data. Key concepts: XR_EXT_hand_tracking_data_source for fusing data sources.
  • XrHandsAndControllers: Simultaneous hand and controller tracking. Key concepts: XR_META_simultaneous_hands_and_controllers, XR_META_detached_controllers.
  • XrHandTrackingWideMotionMode: Extended hand tracking range beyond camera FOV. Key concepts: XR_META_hand_tracking_wide_motion_mode.

Body, Face, & Eye Tracking

  • XrBodyFaceEyeSocial: Combined body, face, and eye tracking for avatars. Key concepts: XR_FB_body_tracking, XR_FB_eye_tracking_social, XR_FB_face_tracking2.

Passthrough & Mixed Reality

  • XrPassthrough: Passthrough compositing with styles, selective and projected modes. Key concepts: XR_FB_passthrough compositor layers.
  • XrPassthroughOcclusion: Depth-based occlusion of virtual objects behind real ones. Key concepts: XR_META_environment_depth for MR rendering.

Scene Understanding

  • XrSceneModel: Room setup, querying scene entities (floors, walls, furniture, meshes). Key concepts: XR_FB_scene_capture, XR_FB_scene, XR_META_spatial_entity_mesh.
  • XrSceneSharing: Share captured scene data between host and guest devices. Key concepts: XR_META_spatial_entity_group_sharing, XR_META_spatial_entity_sharing.

Spatial Anchors & Persistence

  • XrSpatialAnchor: Create, persist, discover, query, and share spatial anchors. Key concepts: XR_FB_spatial_entity.
  • XrColocationDiscovery: Discover colocated devices and share anchor groups. Key concepts: XR_META_colocation_discovery, anchor group sharing.

Dynamic Object Tracking

  • XrDynamicObjects: Detect and track physical keyboards via passthrough window. Key concepts: XR_META_dynamic_object_tracker, XR_META_dynamic_object_keyboard.

Keyboard Input

  • XrVirtualKeyboard: Virtual keyboard with hand/controller input and render model. Key concepts: XR_META_virtual_keyboard, XR_FB_render_model, swipe typing.

Rendering & Performance

  • XrCompositor_NativeActivity: Compositor layer types (cube, cylinder, equirect, quad). Key concepts: single-file C sample demonstrating all compositor layers.
  • XrSpaceWarp: Application space warp for half-framerate rendering. Key concepts: XR_FB_space_warp for performance optimization.
  • XrColorSpaceFB: Color space enumeration and selection. Key concepts: XR_FB_color_space for color management.

Runtime behavior

Samples launch in VR mode on your Quest device. Most samples display a 3D environment with interactive UI panels that let you toggle extension features, adjust parameters, and visualize tracking data in real time. For example, XrHandsFB renders your hands as skeletal meshes or capsule visualizations, XrSceneModel overlays detected room boundaries and furniture from the scene model, and XrVirtualKeyboard displays a floating keyboard you interact with using hand or controller input. Standalone samples like XrPassthrough and XrSpatialAnchor focus on a single feature and provide simpler controls via controller buttons.

Key concepts

The XrApp base class pattern

Most samples derive from OVRFW::XrApp, a base class in the SampleXrFramework library that handles OpenXR instance creation, session management, the render loop, swapchain setup, and default controller input. Each sample overrides virtual methods to add extension-specific behavior without reimplementing the OpenXR lifecycle:
std::vector<const char*> GetExtensions() override {
    auto exts = XrApp::GetExtensions();
    exts.push_back(XR_FB_COLOR_SPACE_EXTENSION_NAME);
    return exts;
}
The framework aggregates extensions from GetExtensions() and passes them to xrCreateInstance(), then handles xrWaitFrame(), xrBeginFrame(), and xrEndFrame() automatically. Samples override AppInit() for setup, AppRenderFrame() for rendering, and AppHandleEvent() for custom event processing.
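The skeleton of a framework-based sample looks roughly like the following sketch. The class name and extension here are only an example, and the exact virtual method signatures are declared in XrApp.h, so the remaining overrides are indicated as comments rather than spelled out.
// Illustrative skeleton of an XrApp subclass (class name and chosen extension
// are examples, not taken from a specific sample).
class MyExtensionApp : public OVRFW::XrApp {
   public:
    // Request the extra extensions this sample needs; the framework passes
    // the combined list to xrCreateInstance().
    std::vector<const char*> GetExtensions() override {
        auto exts = XrApp::GetExtensions();
        exts.push_back(XR_FB_COLOR_SPACE_EXTENSION_NAME);
        return exts;
    }

    // Override AppInit() to load extension function pointers and create
    // resources, AppRenderFrame() for per-frame logic and rendering, and
    // AppHandleEvent() for extension-specific events (signatures in XrApp.h).
};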
For a complete working implementation, see SampleXrFramework/Src/XrApp.h.

Extension registration and function loading

OpenXR extensions follow a standard pattern: add the extension name at instance creation, then load function pointers via xrGetInstanceProcAddr(). The framework handles this for common extensions, but samples that use newer or experimental extensions load additional function pointers in AppInit():
xrGetInstanceProcAddr(
    instance,
    "xrCreateDynamicObjectTrackerMETA",
    reinterpret_cast<PFN_xrVoidFunction*>(&xrCreateDynamicObjectTrackerMETA_));
Once loaded, the function pointers remain valid for the lifetime of the instance. This pattern appears in every sample that uses Meta-specific extensions.
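A slightly more defensive version of the same call is sketched below; the member pointer name and failure handling are illustrative. Checking the result lets a sample disable a feature when the runtime does not expose the extension.
// Sketch with error handling (member name and fallback are illustrative).
PFN_xrCreateDynamicObjectTrackerMETA xrCreateDynamicObjectTrackerMETA_ = nullptr;
XrResult result = xrGetInstanceProcAddr(
    instance,
    "xrCreateDynamicObjectTrackerMETA",
    reinterpret_cast<PFN_xrVoidFunction*>(&xrCreateDynamicObjectTrackerMETA_));
if (XR_FAILED(result) || xrCreateDynamicObjectTrackerMETA_ == nullptr) {
    // The runtime does not expose XR_META_dynamic_object_tracker; disable the feature.
    return false;
}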
For a complete working implementation, see XrDynamicObjects/Src/main.cpp.

Three architectural patterns

The SDK demonstrates three approaches to OpenXR development, each suited to different needs:
  • XrApp subclass pattern (12 samples): Derive from OVRFW::XrApp and override virtual methods. The framework handles the lifecycle, rendering, and input. Use this pattern when you want to focus on extension logic rather than OpenXR lifecycle management.
  • Standalone pattern (five samples): Manage the full OpenXR lifecycle yourself by calling xrCreateInstance() and xrCreateSession() directly and implementing your own render loop (see the condensed sketch after this list). Samples like XrPassthrough, XrSpatialAnchor, and XrSceneModel use this pattern to demonstrate complete OpenXR usage without framework dependencies.
  • Single-file C pattern (two samples): XrCompositor_NativeActivity and XrSpaceWarp are written in pure C using the native_activity_framework, providing minimal examples without C++ abstractions. These samples show OpenXR usage at its simplest and serve as reference implementations for developers working in C.
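To make the contrast with the framework pattern concrete, here is a heavily condensed sketch of the standalone lifecycle. It assumes the OpenXR loader is already initialized and omits the Android specifics, graphics binding, event polling, swapchains, frame loop, and error handling that the real samples implement.
#include <openxr/openxr.h>
#include <cstring>

// Condensed standalone lifecycle: create an instance with the extensions you
// need, query the system, then create a session. A real app chains a graphics
// binding (e.g. OpenGL ES) into XrSessionCreateInfo::next and then runs an
// xrPollEvent/xrWaitFrame loop; both are omitted here.
void CreateInstanceAndSession() {
    const char* extensions[] = {XR_FB_PASSTHROUGH_EXTENSION_NAME};

    XrInstanceCreateInfo instanceInfo{XR_TYPE_INSTANCE_CREATE_INFO};
    std::strncpy(instanceInfo.applicationInfo.applicationName, "StandaloneSample",
                 XR_MAX_APPLICATION_NAME_SIZE - 1);
    instanceInfo.applicationInfo.apiVersion = XR_CURRENT_API_VERSION;
    instanceInfo.enabledExtensionCount = 1;
    instanceInfo.enabledExtensionNames = extensions;

    XrInstance instance = XR_NULL_HANDLE;
    xrCreateInstance(&instanceInfo, &instance);

    XrSystemGetInfo systemInfo{XR_TYPE_SYSTEM_GET_INFO};
    systemInfo.formFactor = XR_FORM_FACTOR_HEAD_MOUNTED_DISPLAY;
    XrSystemId systemId = XR_NULL_SYSTEM_ID;
    xrGetSystem(instance, &systemInfo, &systemId);

    XrSessionCreateInfo sessionInfo{XR_TYPE_SESSION_CREATE_INFO};
    sessionInfo.next = nullptr;  // graphics binding goes here in a real app
    sessionInfo.systemId = systemId;
    XrSession session = XR_NULL_HANDLE;
    xrCreateSession(instance, &sessionInfo, &session);
}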
For a complete working implementation of the standalone pattern, see XrPassthrough/Src/XrPassthrough.cpp. For the single-file C pattern, see XrCompositor_NativeActivity/Src/XrCompositor_NativeActivity.c.

Compositor layer injection

The framework supports multiple compositor layer types beyond the standard projection layer: quad, cylinder, cube, equirect, and passthrough. Samples inject layers before or after the default projection layer by overriding PreProjectionAddLayer() and PostProjectionAddLayer(). This allows passthrough samples to add passthrough layers behind rendered content, or UI samples to add quad layers in front.
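For example, a passthrough sample can append an XR_FB_passthrough layer ahead of the projection layer. The sketch below only shows how the layer struct is filled in; the layer handle member is hypothetical, and the exact PreProjectionAddLayer() signature and layer container are defined in XrApp.h.
// Inside PreProjectionAddLayer(): fill in an XR_FB_passthrough compositor
// layer and append it before the projection layer. passthroughLayer_ is a
// hypothetical member created earlier with xrCreatePassthroughLayerFB().
XrCompositionLayerPassthroughFB passthroughLayer{XR_TYPE_COMPOSITION_LAYER_PASSTHROUGH_FB};
passthroughLayer.flags = XR_COMPOSITION_LAYER_BLEND_TEXTURE_SOURCE_ALPHA_BIT;
passthroughLayer.space = XR_NULL_HANDLE;
passthroughLayer.layerHandle = passthroughLayer_;
// Append &passthroughLayer to the frame's layer list ahead of the projection layer.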
For a complete working implementation, see XrApp.h layer methods.

Extend the sample

  • Combine extensions: Use patterns from multiple samples together; the sketch after this list shows the starting point. For example, add passthrough from XrPassthrough to a hand tracking sample, or combine scene understanding with spatial anchors for a scene-aware experience.
  • Customize the framework: Override additional XrApp virtual methods like PreProjectionAddLayer() for custom compositor layers, GetSuggestedBindings() for custom input bindings, or AppHandleEvent() for extension-specific event handling.
  • Port to Windows via Quest Link: Several samples include WIN32 branches in their CMakeLists.txt files for Quest Link development. Follow these patterns to enable Windows builds for additional samples, allowing desktop development and testing.
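As a concrete starting point for combining features, request each extension you need from GetExtensions() and then load each extension's functions in AppInit() as shown earlier. The extension mix below is an arbitrary example.
// Example only: request passthrough, hand tracking, and spatial entity
// extensions together; handle any extension the runtime does not support.
std::vector<const char*> GetExtensions() override {
    auto exts = XrApp::GetExtensions();
    exts.push_back(XR_FB_PASSTHROUGH_EXTENSION_NAME);
    exts.push_back(XR_EXT_HAND_TRACKING_EXTENSION_NAME);
    exts.push_back(XR_FB_SPATIAL_ENTITY_EXTENSION_NAME);
    return exts;
}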