This sample demonstrates how to use the Meta XR Core SDK compositor layer APIs to render UI, video, and other content at higher visual quality than standard Unity rendering. Five scenes each showcase a different aspect of compositor layers: visual quality comparison, performance stress testing, independent update cadence, available mesh shapes, and filtering options.
Clone the repository from GitHub and open the project in Unity. Load the default scene at Assets/CompositorLayers/Scenes/Intro.unity, then build and deploy to your Meta Quest device. Once running, press the Meta Quest menu button on your controller to open the scene selector and switch between the five demo scenes. For complete build and deployment instructions, see the sample README.
## Explore the sample
| Scene | What it demonstrates | Key concepts |
| --- | --- | --- |
| Intro | Visual quality comparison between standard canvas rendering and compositor layer rendering | OVROverlayCanvas, sharpness improvement |
| Performance | Stress testing with dynamic compositor layer instantiation | Dynamic instantiation, texture resolution configuration |
| Independence | A compositor layer animated from a Java background thread that stays smooth while the main app drops frames | External surface rendering, JNI |
| LayerShapes | The available overlay mesh shapes, from flat quads to equirect spheres | Overlay shapes |
| Filtering | Filtering options that improve text and video quality | Bicubic filtering, sharpening, supersampling |
When you run the Intro scene, you see a UI canvas that alternates between standard Unity rendering and compositor layer rendering every 2 seconds. The compositor layer version appears noticeably sharper, especially for text.
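A minimal sketch of how such an alternation could be driven. This is not the sample's exact script; it assumes that enabling or disabling the OVROverlayCanvas component switches the canvas between the compositor layer path and standard Unity rendering:

```csharp
using UnityEngine;

// Sketch: flip a canvas between compositor layer and standard rendering
// every 2 seconds, as the Intro scene does. Assumes toggling the
// OVROverlayCanvas component switches the rendering path.
public class RenderModeToggler : MonoBehaviour
{
    [SerializeField] private OVROverlayCanvas m_overlayCanvas; // from the Meta XR Core SDK
    private const float ToggleInterval = 2f;
    private float m_timer;

    private void Update()
    {
        m_timer += Time.deltaTime;
        if (m_timer >= ToggleInterval)
        {
            m_timer = 0f;
            m_overlayCanvas.enabled = !m_overlayCanvas.enabled;
        }
    }
}
```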
The Independence scene displays a spinning arc animation rendered by a Java background thread at ~60fps. When you toggle “Drop Frames,” the Unity app stutters visibly, but the compositor layer continues animating smoothly, demonstrating that compositor layers can maintain fluidity even when the main application drops frames.
In LayerShapes, you can toggle between seven different overlay shapes. Each shape is useful for different scenarios: quads for flat UI, cylinders for curved displays, cubemaps for skyboxes, and equirect for 360-degree video.
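Switching shapes at runtime can be sketched as follows. OVROverlay exposes a `currentOverlayShape` field; the subset of shapes shown here matches the scenarios above, and the indexing scheme is illustrative rather than the sample's actual UI wiring:

```csharp
// Sketch: cycle an overlay through a few of the available shapes.
public static class OverlayShapeSwitcher
{
    private static readonly OVROverlay.OverlayShape[] Shapes =
    {
        OVROverlay.OverlayShape.Quad,      // flat UI
        OVROverlay.OverlayShape.Cylinder,  // curved displays
        OVROverlay.OverlayShape.Cubemap,   // skyboxes
        OVROverlay.OverlayShape.Equirect,  // 360-degree video
    };

    public static void SetShape(OVROverlay overlay, int index)
    {
        overlay.currentOverlayShape = Shapes[index % Shapes.Length];
    }
}
```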
## Key concepts
### OVROverlayCanvas for UI compositor layers
The simplest way to render Unity UI canvases as compositor layers is to add an OVROverlayCanvas component to your Canvas GameObject. The Performance scene dynamically instantiates prefabs with OVROverlayCanvas and configures their texture resolution:

```csharp
var newObject = Instantiate(CompositorLayerPrefab, CompositorLayersParent);
UpdateSettings(newObject.GetComponent<OVROverlayCanvas>());
```
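A hypothetical `UpdateSettings` helper might look like the sketch below. The `MaxTextureSize` field name is an assumption about OVROverlayCanvas and may differ between SDK versions; check the component's source for the exact fields:

```csharp
// Sketch of an UpdateSettings helper: apply the texture resolution
// selected in the stress-test UI to a newly instantiated canvas.
// MaxTextureSize is an assumed field name, not verified against the sample.
private int m_selectedResolution = 1024;

private void UpdateSettings(OVROverlayCanvas canvas)
{
    canvas.MaxTextureSize = m_selectedResolution; // e.g. 512, 1024, 2048
}
```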
### Poke-a-Hole technique for depth integration
The sample uses a custom ZeroAlpha shader to create a “hole” in the scene geometry where the compositor layer appears. The shader writes to the depth buffer but outputs zero alpha, allowing the compositor layer to show through at the correct depth position:
```csharp
var overlay = GetComponentInParent<OVROverlay>();
GetComponent<OVROverlayMeshGenerator>().SetOverlay(overlay);
```
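The core of the technique is the shader itself. The sketch below is not the sample's ZeroAlpha shader, but illustrates the idea: write depth normally so the mesh occludes and is occluded correctly, while outputting zero color and zero alpha so the compositor blends the underlay layer through that region:

```shaderlab
Shader "Sketch/ZeroAlpha"
{
    SubShader
    {
        Tags { "Queue" = "Geometry" }
        Pass
        {
            ZWrite On   // participate in depth testing like normal geometry
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            float4 vert (float4 vertex : POSITION) : SV_POSITION
            {
                return UnityObjectToClipPos(vertex);
            }

            fixed4 frag () : SV_Target
            {
                // Zero alpha "punches a hole" in the eye buffer, letting
                // the compositor layer show through at this depth.
                return fixed4(0, 0, 0, 0);
            }
            ENDCG
        }
    }
}
```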
### External surface rendering via JNI
The Independence scene demonstrates rendering directly to an Android Surface from a Java background thread, bypassing Unity’s render loop entirely. The sample waits for OVROverlay.isExternalSurface and externalSurfaceObject to become valid, then calls LoadingScreen.java via JNI to render a spinning arc at ~60fps. This technique enables loading screens that remain smooth even when the app is performing expensive work.
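The Unity side of this handoff could be sketched as below. The wait loop uses OVROverlay's `isExternalSurface` and `externalSurfaceObject` as described above; the Java class and method names are assumptions for illustration, not the sample's exact API:

```csharp
using System;
using System.Collections;
using UnityEngine;

// Sketch: wait for the overlay's external Android Surface, then hand the
// jobject pointer to a Java renderer via low-level JNI.
public class ExternalSurfaceStarter : MonoBehaviour
{
    [SerializeField] private OVROverlay m_overlay;

    private IEnumerator Start()
    {
        // Wait until the compositor has created the external Surface.
        while (!m_overlay.isExternalSurface ||
               m_overlay.externalSurfaceObject == IntPtr.Zero)
            yield return null;

#if UNITY_ANDROID && !UNITY_EDITOR
        // Hypothetical class/method names; the Java side renders a spinning
        // arc to the Surface on its own background thread.
        var cls = AndroidJNI.FindClass("com/example/LoadingScreen");
        var mid = AndroidJNI.GetStaticMethodID(
            cls, "startRendering", "(Landroid/view/Surface;)V");
        AndroidJNI.CallStaticVoidMethod(
            cls, mid, new[] { new jvalue { l = m_overlay.externalSurfaceObject } });
#endif
    }
}
```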
### Filtering for text and video quality
Compositor layers support three filtering properties that improve visual quality. The Filtering scene exposes each as a separate toggle:
```csharp
public void SetBicubicFiltering(bool val)
{
    m_assocOverlay.useBicubicFiltering = val;
}
```
Bicubic filtering reduces pixelation during scaling, sharpening improves text edges, and supersampling reduces video shimmer.
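The other two toggles would follow the same pattern as the bicubic example. The sharpening and supersampling property names below are assumptions and vary across SDK versions; check OVROverlay for the exact fields:

```csharp
// Companion toggles mirroring SetBicubicFiltering. Property names are
// assumed, not verified against the sample's SDK version.
public void SetSharpening(bool val)
{
    m_assocOverlay.useExpensiveSharpen = val;
}

public void SetSuperSampling(bool val)
{
    m_assocOverlay.useExpensiveSuperSample = val;
}
```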
## Extend the sample
- Modify the Performance scene to test different texture resolutions and measure frame rate impact using the Meta XR Simulator.
- Build on the Independence scene to create a custom loading screen with your own Java rendering code.
- Combine patterns from LayerShapes and Filtering to create a curved video player with optimized filtering settings.