This guide will get you started prototyping with hand-based interactions. It walks you through building a UI with hand interactions and testing your prototype.
Before you start
Complete the Hardware setup and Software setup guides first. The Software setup includes account creation, software installation, headset configuration, and project creation.
Review the design guidance — Read the Hands overview to understand how hand tracking works in Meta Horizon OS, and check Hands Interaction Types to learn when to use poke, ray, and grab.
Follow the best practices — The Hands Best Practices guide covers ergonomics, common pitfalls, and do’s and don’ts for hand-based interactions.
Testing methods
There are two ways to test your prototype, each with a different speed-fidelity trade-off. Meta XR Simulator lets you quickly check if things work; Build and Run gives you higher fidelity with real device performance.
| | Meta XR Simulator (fastest) | Build and Run (slowest) |
| --- | --- | --- |
| Requires headset | No | Yes |
| Build step | None — press Play | Full APK build + deploy |
| Hand tracking | Simulated (keyboard/mouse) | Real (native on device) |
| Fidelity | Layout and interaction logic | Highest — true device performance |
Windows users: You can also use Meta Horizon Link to stream your app to the headset without building an APK. Link gives you real hand tracking with no build step, which is useful for testing interaction feel and ergonomics before a full device build. See Meta Horizon Link setup for details.
Use Meta XR Simulator for rapid iteration, then Build and Run for final validation on the actual device.
Design and build your app
Set up your camera rig, build a UI with buttons, sliders, and toggles, and wire up hand interactions.
Add the Interaction SDK camera rig to your scene and enable hand tracking. This replaces the default Main Camera with a rig that supports hands, controllers, and common interactions out of the box.
01 Add the Interaction Rig
02 Configure the Wizard
03 Create the Rig
04 Enable Hand Tracking
Right-click in the Hierarchy and select Interaction SDK > Add OVR Interaction Rig.
The Comprehensive Rig Wizard opens. If the wizard says there's no OVRCameraRig in the scene, click Fix All; the wizard creates one for you.
Click Create. The wizard adds the OVR Interaction Rig to your scene with everything wired up: hand tracking, controller tracking, and the raycast, poke, grab, and teleport interactions are all included.
OVRCameraRig features in the prefab.
Select OVRCameraRig in the Hierarchy.
In the Inspector, go to OVR Manager > Quest Features > General tab.
Set Hand Tracking Support to Controllers and Hands (recommended for prototyping). This lets you switch between hands and controllers freely.
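With Controllers and Hands enabled, you can check at runtime which input the user is currently using. A minimal sketch, assuming the Meta XR Core SDK's `OVRInput` API; the class name is a placeholder:

```csharp
using UnityEngine;

// Hypothetical helper: logs when hand tracking becomes the active input.
// OVRInput.GetActiveController() is part of the Meta XR Core SDK.
public class InputModeLogger : MonoBehaviour
{
    OVRInput.Controller lastActive;

    void Update()
    {
        OVRInput.Controller active = OVRInput.GetActiveController();
        if (active != lastActive && active == OVRInput.Controller.Hands)
        {
            Debug.Log("Hand tracking is now active");
        }
        lastActive = active;
    }
}
```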
Design your interactions so users can keep their arms close to their body with elbows at hip level; requiring hands above heart level causes rapid fatigue. Place UI panels at a comfortable distance, about 1 to 1.5 meters away. See Hands Best Practices for ergonomic guidelines.
Create a world-space Canvas and add a Panel to serve as the background surface for your UI elements.
01 Create the Canvas
02 Configure the Canvas
03 Add a Panel
In the Hierarchy, right-click and select UI > Canvas. A new Canvas appears in the Hierarchy.
Select the Canvas in the Hierarchy.
In the Inspector, set Canvas > Render Mode to World Space.
Set the Rect Transform values to PosX: 0, PosY: 1.5, PosZ: 1.5, Width: 480, Height: 720, and Scale: 0.0005 on all three axes (X, Y, Z).
This positions the Canvas at eye level about 1.5 meters in front of you and scales it down to a comfortable size for VR. For more approaches to building UI, including the UI Set toolkit, see ISDK UI Overview.
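If you prefer to do this setup from code (for example, to spawn panels at runtime), the same values can be applied with standard Unity APIs. A sketch; the object name is a placeholder:

```csharp
using UnityEngine;

// Sketch of the same Canvas setup done from code rather than the Inspector.
// Values mirror the guide: eye height 1.5 m, 1.5 m in front, 0.0005 scale.
public static class WorldSpaceCanvasSetup
{
    public static Canvas Create()
    {
        var go = new GameObject("HandUICanvas", typeof(Canvas));
        var canvas = go.GetComponent<Canvas>();
        canvas.renderMode = RenderMode.WorldSpace;

        var rect = go.GetComponent<RectTransform>();
        rect.position = new Vector3(0f, 1.5f, 1.5f);
        rect.sizeDelta = new Vector2(480f, 720f);
        rect.localScale = Vector3.one * 0.0005f;
        return canvas;
    }
}
```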
Right-click on the Canvas in the Hierarchy and select UI > Panel. The Panel gives you a background surface for your UI elements. You can change the panel color in the Inspector under Image > Color (check our color guidelines for more).
Avoid pure white (#FFFFFF) and pure black (#000000) in your UI. They cause eye strain in Passthrough or MR. Use a maximum background brightness of #DADADA for light themes. Colors appear more saturated in-headset than on your monitor, so always verify on the actual device. See Display Guidelines for more details.
Add interactive UI components to your panel: a button, a slider, and a toggle, to give you a feel for the different UI patterns available for hand interaction.
For hand tracking, touch targets must be at least 48dp x 48dp (about 22mm x 22mm) and spaced at least 12mm apart to prevent accidental touches. The button size of 300 x 100 in this guide exceeds that minimum, but keep these numbers in mind as you add more elements. For text, use a minimum font size of 14px (18px recommended for comfortable reading) and maintain a 4.5:1 contrast ratio. See Hands Interaction Types and Fonts and Icons for detailed specs.
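At the canvas scale from the previous step (0.0005), the unit-to-millimeter conversion is easy to sanity-check:

```csharp
// Quick arithmetic check: at the canvas scale used in this guide (0.0005),
// one canvas unit is 0.5 mm in world space.
float scale = 0.0005f;                // canvas localScale from the previous step
float unitMm = scale * 1000f;         // 0.5 mm per canvas unit
float buttonWidthMm = 300f * unitMm;  // 150 mm, well above the 22 mm minimum
float buttonHeightMm = 100f * unitMm; // 50 mm
float minTargetUnits = 22f / unitMm;  // 44 canvas units is the 48dp/22mm floor
```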
01 Add a Button
02 Add a Slider
03 Add a Toggle
04 Canvas Validation
Right-click on the Panel and select UI > Button - TextMeshPro.
In the Inspector, set the Rect Transform Width to 300 and Height to 100 to make it a comfortable size for hand interaction.
You can change the button label text by expanding the button in the Hierarchy and selecting the text child object.
Right-click on the Panel and select UI > Slider.
Set the Width to 300 and Pos Y to place it above or below the button.
Sliders work with both poke (drag the handle with your finger) and ray interactions. For design guidance on slider types, see slider design guidelines.
Right-click on the Panel and select UI > Toggle.
Set Width to 300, Height to 80, and Pos Y to place it above or below the other UI components.
Toggles work well for on/off settings. Feel free to add more components (dropdowns, input fields, or additional buttons) to build out your prototype UI.
When done, your canvas should look something like this with all your components evenly spaced.
Validate that your canvas is set up with all its components.
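The three components from this step can be wired to handlers with standard Unity UI events. A minimal sketch; the class and field names are placeholders, and the references are assigned in the Inspector:

```csharp
using UnityEngine;
using UnityEngine.UI;

// Sketch: hooking up the button, slider, and toggle from this step with
// standard Unity UI events. Assign the fields in the Inspector.
public class PrototypeUIHandlers : MonoBehaviour
{
    public Button button;
    public Slider slider;
    public Toggle toggle;

    void Start()
    {
        button.onClick.AddListener(() => Debug.Log("Button pressed"));
        slider.onValueChanged.AddListener(v => Debug.Log($"Slider: {v:F2}"));
        toggle.onValueChanged.AddListener(on => Debug.Log($"Toggle: {on}"));
    }
}
```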
For production-quality UI, consider using the Meta Horizon OS UI Set prefabs instead of standard Unity UI. Find them in Project > Packages > Meta XR Interaction SDK Essentials > Runtime > Sample > Objects > UISet > Prefabs. These include pre-styled buttons, sliders, toggles, and more with built-in interaction support.
Now connect the Interaction SDK to your Canvas so you can poke buttons with your fingertips and use ray interactions from a distance.
Always support both poke and ray on UI surfaces. Don't rely on only one. Some users prefer direct touch, others prefer pointing from a distance, and some may have accessibility needs that favor one over the other. Also, hands have no haptic feedback unlike controllers, so provide clear audio and visual cues (color changes, animations, sounds) on every interaction. See Accessibility Guidelines for inclusive design practices.
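As a sketch of the audio and visual cue idea, a handler like the following could be wired to a Button's OnClick event; the component references and color value here are placeholder assumptions:

```csharp
using UnityEngine;
using UnityEngine.UI;

// Hypothetical feedback sketch: hands give no haptics, so pair every
// interaction with an audio and a visual cue. AudioSource and Image are
// standard Unity components; assign both in the Inspector.
public class ButtonFeedback : MonoBehaviour
{
    public AudioSource clickSound;  // a short click clip
    public Image background;        // the button's Image component

    public void OnPressed()         // wire this to the Button's OnClick event
    {
        clickSound.Play();
        background.color = new Color(0.8f, 0.8f, 0.85f); // brief highlight
    }
}
```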
01 Add Poke Interaction
02 Add Ray Interaction
03 Verify Setup
Right-click on the Canvas in the Hierarchy and select Interaction SDK > Add Poke Interaction to Canvas.
The Poke wizard appears. If it flags a missing PointableCanvasModule, click Fix. This adds the PointableCanvasModule to your EventSystem, which routes touch events to your Canvas.
Click Create. You can now poke buttons, drag sliders, and tap toggles with your fingertips.
Right-click on the Canvas again and select Interaction SDK > Add Ray Interaction to Canvas.
The Ray wizard appears. Fix any missing components if prompted, then click Create.
This lets you interact with your UI from a distance using a ray that projects from your pinched fingers or controller.
Your Canvas should now have PokeInteractable, RayInteractable, and PointableCanvas components attached (you can verify in the Inspector). The Interaction SDK rig you added earlier already has the hand and controller interactors, so everything is wired up. You're ready to test!
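The same verification can be done from a small script attached to the Canvas. This sketch assumes the Interaction SDK's `Oculus.Interaction` runtime namespace:

```csharp
using UnityEngine;
using Oculus.Interaction;

// Sketch of the verification step in code: warn if any of the three
// components the wizards should have added is missing from the Canvas.
public class CanvasSetupValidator : MonoBehaviour
{
    void Start()
    {
        if (GetComponent<PokeInteractable>() == null)
            Debug.LogWarning("Missing PokeInteractable on " + name);
        if (GetComponent<RayInteractable>() == null)
            Debug.LogWarning("Missing RayInteractable on " + name);
        if (GetComponent<PointableCanvas>() == null)
            Debug.LogWarning("Missing PointableCanvas on " + name);
    }
}
```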
You’ve built a hand-tracked UI and now it’s time to test it. Pick the method that fits your setup.
Meta XR Simulator
Build and Run
The fastest way to iterate; no headset required. Meta XR Simulator runs your app directly in the Unity Editor using keyboard and mouse input to approximate head and hand tracking. Works on both Windows and macOS (Apple Silicon only; requires Unity OpenXR Plugin v1.13.0+ and Meta XR SDK v66+).
Limitations: Input is approximated (not real hand tracking), and rendering and performance won't match the actual headset. You must still do a final on-device test before shipping.
01 Activate the Simulator
02 Press Play
03 Switch to Hand Input
04 Test Your UI
Click the XR Simulator icon next to the Play button in Unity's top toolbar, or go to Meta > Meta XR Simulator > Activate. You'll see a console log confirming [Meta XR Simulator is activated].
Activate Meta XR Simulator.
Check to see if Meta XR Simulator is activated.
Press the Play button in the Unity Editor. The Meta XR Simulator window opens, showing your scene from the simulated headset's perspective. Use the mouse to look around and WASD keys to move.
Seeing an empty world with no hands, controllers, or canvas? If you get the error No Render Pipeline Asset was found, your project is missing a Universal Render Pipeline (URP) asset. OpenXR requires URP to apply XR rendering fixes. To fix this:
1. If you don't have a URP Asset yet, right-click in the Project window and select Create > Rendering > URP Asset (with Universal Renderer).
2. Go to Edit > Project Settings > Graphics and assign your URP Asset in the Scriptable Render Pipeline Settings field.
3. Go to Edit > Project Settings > Quality and assign the same URP Asset for each quality level you use.
In the Meta XR Simulator panel, go to Inputs > Global input settings.
In the Left and Right dropdowns, switch from Controller to Hand.
Move your simulated hands toward the Canvas and try poking the button, dragging the slider, and tapping the toggle.
Keys 1 through 4 switch between aim, poke, pinch, and grab poses. Left mouse button triggers a pinch gesture.
When you're done, press Play again to stop.
Go to Meta > Meta XR Simulator > Deactivate before switching to headset testing.
Meta XR Simulator supports additional features beyond what this guide covers, including controller emulation, room configuration, and synthetic environments. See the Meta XR Simulator overview to learn more.
Next steps
Designing for hands
Explore the design guidelines to refine your hand-based experience and understand the principles behind great hand interactions.
Hands Overview — How hand tracking works in Meta Horizon OS and how to think about designing for hands