
OpenXR Support for Meta Quest Headsets

Updated: Jan 23, 2025
OpenXR is a royalty-free open standard from the Khronos Group for developing high-performance VR applications that run on multiple platforms. By letting developers reuse code across platforms and devices from multiple vendors, OpenXR simplifies VR development and makes apps portable. To read more about OpenXR, see the Khronos OpenXR webpage.
The OpenXR Mobile SDK includes essential resources to use the OpenXR API for native development of VR apps for Meta Quest and Quest 2. Meta Quest headsets are OpenXR 1.0 adopters.
Download the Oculus OpenXR Mobile SDK from our Downloads page.

OpenXR Developer Documentation

Important: Start with the fundamentals of OpenXR by reviewing Core Concepts and its related topics.
You can also study the OpenXR 1.0 Specification at the Khronos Group site. The site offers API reference documentation and a PDF reference guide that provides a detailed overview of the API.

Meta Quest OpenXR Mobile Development

In addition to the documentation from the Khronos Group, this section contains information necessary to develop OpenXR apps for Meta Quest and Quest 2.

Android Manifest Specification

When setting up an OpenXR project for Meta Quest and Quest 2, add the following activity intent filter to AndroidManifest.xml:
<intent-filter>
    <action android:name="android.intent.action.MAIN" />
    <category android:name="org.khronos.openxr.intent.category.IMMERSIVE_HMD" />
    <category android:name="android.intent.category.LAUNCHER" />
</intent-filter>

Android OpenXR Loader

Meta supports the Khronos OpenXR Android Loader. No proprietary loader is needed. All sample apps have already been integrated with the Khronos OpenXR Android Loader.
Note
There are still limitations:
  • Meta only supports Khronos OpenXR Android Loader 1.0.34 and above. Using a Khronos OpenXR Android Loader below this version will cause your app to crash.
  • This support covers Quest 1 users and users on OS version v62 or later.
  • Apps using the new loader will crash on OS versions prior to v62.
  • Due to compatibility issues, if your app uses the Khronos OpenXR Android Loader, users with a non-Quest 1 device and an OS below v62 will not be able to see your update on the Meta Store.
To add the standard (Khronos) OpenXR Android Loader to your own project, add the following code to your build.gradle:
android {
  ...

  buildFeatures {
    prefab true
  }
}
and
dependencies {
  ...

  implementation 'org.khronos.openxr:openxr_loader_for_android:1.0.34'
}
For other platforms, see the OpenXR-SDK official README.

KHR Android Loader Extension

Before loading can occur, the loader must be initialized with platform-specific parameters. These parameters are specified with the KHR loader extensions, XR_KHR_loader_init and XR_KHR_loader_init_android, added in the 1.0.11 OpenXR specification. Apps are required to first get the function pointer for xrInitializeLoaderKHR via xrGetInstanceProcAddr with a null instance handle, and then call xrInitializeLoaderKHR with the XrLoaderInitInfoAndroidKHR struct defined in XR_KHR_loader_init_android.
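As a minimal sketch, the initialization sequence could look like the following. The `vm` and `activity` parameters are assumed to come from your NativeActivity or android_app glue layer; the function name `InitOpenXrLoader` is illustrative.

```c
// Sketch of Android loader initialization; must run before any other OpenXR call.
#define XR_USE_PLATFORM_ANDROID
#include <jni.h>
#include <openxr/openxr.h>
#include <openxr/openxr_platform.h>

void InitOpenXrLoader(JavaVM* vm, jobject activity) {
    // Look up xrInitializeLoaderKHR with a null instance handle.
    PFN_xrInitializeLoaderKHR xrInitializeLoaderKHR = NULL;
    xrGetInstanceProcAddr(XR_NULL_HANDLE, "xrInitializeLoaderKHR",
                          (PFN_xrVoidFunction*)&xrInitializeLoaderKHR);
    if (xrInitializeLoaderKHR != NULL) {
        XrLoaderInitInfoAndroidKHR loaderInitInfo = {
            .type = XR_TYPE_LOADER_INIT_INFO_ANDROID_KHR,
            .next = NULL,
            .applicationVM = vm,           // JavaVM* from the android_app glue
            .applicationContext = activity, // jobject activity reference
        };
        xrInitializeLoaderKHR((const XrLoaderInitInfoBaseHeaderKHR*)&loaderInitInfo);
    }
}
```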

KHR Android Instance Creation Extension

Apps are required to specify platform-specific parameters to xrCreateInstance through the XR_KHR_android_create_instance extension.
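A minimal sketch of chaining XrInstanceCreateInfoAndroidKHR into xrCreateInstance follows; the application name and the `vm`/`activity` parameters are illustrative assumptions, not values from this document.

```c
// Sketch: create an XrInstance with the Android-specific create-info chained
// via the next pointer, as required by XR_KHR_android_create_instance.
#define XR_USE_PLATFORM_ANDROID
#include <jni.h>
#include <string.h>
#include <openxr/openxr.h>
#include <openxr/openxr_platform.h>

XrInstance CreateAndroidInstance(JavaVM* vm, jobject activity) {
    XrInstanceCreateInfoAndroidKHR androidInfo = {
        .type = XR_TYPE_INSTANCE_CREATE_INFO_ANDROID_KHR,
        .next = NULL,
        .applicationVM = vm,
        .applicationActivity = activity,
    };
    const char* extensions[] = {XR_KHR_ANDROID_CREATE_INSTANCE_EXTENSION_NAME};
    XrInstanceCreateInfo createInfo = {
        .type = XR_TYPE_INSTANCE_CREATE_INFO,
        .next = &androidInfo,  // chain the Android-specific struct
        .enabledExtensionCount = 1,
        .enabledExtensionNames = extensions,
    };
    // "MyQuestApp" is a placeholder application name.
    strncpy(createInfo.applicationInfo.applicationName, "MyQuestApp",
            XR_MAX_APPLICATION_NAME_SIZE);
    createInfo.applicationInfo.apiVersion = XR_CURRENT_API_VERSION;

    XrInstance instance = XR_NULL_HANDLE;
    xrCreateInstance(&createInfo, &instance);
    return instance;
}
```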

Getting Started with OpenXR Mobile SDK

To start learning about OpenXR development on Meta headsets, you can go through Build and Run hello_xr Sample App. The hello_xr app is a simple cross-platform OpenXR sample in the Khronos Group’s OpenXR-SDK-Source GitHub repository at https://github.com/KhronosGroup/OpenXR-SDK-Source/tree/master/src/tests/hello_xr.

Samples

  • XrCompositor_NativeActivity - A simple scene using the Android NativeActivity class to illustrate the use of the different layer types available by way of OpenXR.
  • XrHandsFB - A showcase of the three Meta-specific hand extensions that let you render a skinned hand mesh, use collision capsules, and a ray-cast-plus-pinch UI interaction model. These extensions bring OpenXR hand support to parity with the corresponding VrApi hands API.
  • XrHandsAndControllers - A simple sample showing how to set up simultaneous hand and controller tracking and detect when controllers are held, to build a multimodal interaction experience.
  • XrPassthrough - A sample app demonstrating basic Passthrough functionality. Uses controller input to switch between different features and styles, such as basic passthrough, still and animated styles, and masked (selective) and projected (onto a mesh) passthrough.
  • XrSpatialAnchor - A sample app that demonstrates the capabilities of our Spatial Anchor system and provides example code for handling, maintaining, and sharing Spatial Anchors that you can use in your own project.
  • XrColorSpaceFB - An educational sample about color spaces that demonstrates how to use XR_FB_color_space to specify what color space an app is authored for.
  • XrControllers - A sample that shows how to access each of the input actions on the Meta Quest Touch Pro controller through OpenXR. It provides examples of using TruTouch advanced Haptics APIs with Meta Quest Touch Pro controllers.
  • XrDynamicObjects - Provides example usage of the Dynamic Object Tracker API, showing how keyboards can be tracked and presented via passthrough on Meta Quest 3 and beyond.
  • XrSpaceWarp - A sample that demonstrates Application SpaceWarp, a feature that achieves a step function improvement in both performance and latency.
  • XrBodyTrackingFB - A sample that demonstrates body tracking functionality to draw the skeleton joints on your body, arms, and hands with the corresponding joint coordinate frames overlayed at that joint.
  • XrEyeTrackingSocialFB - A demo of eye tracking features that displays eye gaze data for both eyes in a social context.
  • XrFaceTrackingFB - A face tracking sample app that displays the weights of the blendshapes triggered when you move regions of your face, such as the mouth, cheeks, and eyes.
  • XrSceneModel - An app that demonstrates a scene-aware experience that enables rich interaction with the user’s environment, such as floor, walls, and furniture.
  • XrVirtualKeyboard - A sample that demonstrates how to use Virtual Keyboard. The application has control over the positioning, interaction, and rendering of the keyboard.
  • XrMicrogestures - A sample that demonstrates how to use Microgestures. The application shows when the user has correctly performed a microgesture thumb swipe (left / right / backward / forward) or a thumb tap.

Resources
