
Hands Prototype Quickstart Guide

Updated: Mar 23, 2026
This guide gets you started prototyping with hand-based interactions. It walks you through building a UI, wiring up hand interactions, and testing your prototype.

Before you start

Complete the Hardware setup and Software setup guides first. The Software setup includes account creation, software installation, headset configuration, and project creation.
Already completed the Unity Hello World tutorial? Your project is already set up — skip ahead to Design and build your app.

Try hand-based interactions

Spending some time experiencing great hand interactions and reviewing design guidance can help you make better design decisions as you prototype.

Testing methods

There are two ways to test your prototype, each with a different speed-fidelity trade-off. Meta XR Simulator lets you quickly check if things work; Build and Run gives you higher fidelity with real device performance.
| | Meta XR Simulator (fastest) | Build and Run (slowest) |
| --- | --- | --- |
| Requires headset | No | Yes |
| Build step | None — press Play | Full APK build + deploy |
| Hand tracking | Simulated (keyboard/mouse) | Real (native on device) |
| Fidelity | Layout and interaction logic | Highest — true device performance |

Windows users: You can also use Meta Horizon Link to stream your app to the headset without building an APK. Link gives you real hand tracking with no build step, which is useful for testing interaction feel and ergonomics before a full device build. See Meta Horizon Link setup for details.

Use Meta XR Simulator for rapid iteration, then Build and Run for final validation on the actual device.

Design and build your app

Set up your camera rig, build a UI with buttons, sliders, and toggles, and wire up hand interactions.

Add the Interaction SDK camera rig to your scene and enable hand tracking. This replaces the default Main Camera with a rig that supports hands, controllers, and common interactions out of the box.

Right-click in the Hierarchy and select Interaction SDK > Add OVR Interaction Rig.
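If you want to confirm from script that the rig's hand tracking is actually running, a minimal sketch like the following can help. It assumes a Unity scene with the Meta XR SDK installed and the rig's left-hand object (which carries an `OVRHand` component) assigned in the Inspector; the class and field names are illustrative.

```csharp
using UnityEngine;

// Minimal sketch: log whether hand tracking is active.
// Assumes an OVRHand component (from the Meta XR SDK) assigned in the Inspector.
public class HandTrackingStatus : MonoBehaviour
{
    [SerializeField] private OVRHand leftHand; // drag the rig's left hand object here

    void Update()
    {
        if (leftHand != null && leftHand.IsTracked)
        {
            Debug.Log($"Left hand tracked, confidence: {leftHand.HandConfidence}");
        }
    }
}
```

Attach this to any object in the scene; the Console then shows tracking state while you test in the Simulator or on device.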


Design your interactions so users can keep their arms close to their body with elbows at hip level. Requiring hands above heart level causes rapid fatigue. Place interactive content at a comfortable distance, about 1 to 1.5 meters away, for UI panels. See Hands Best Practices for ergonomic guidelines.
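One way to apply the distance guideline above is to place your panel relative to the user's head at startup. This is a hedged sketch, not the SDK's placement API: the `panel` reference and the 1.2 m distance are illustrative values within the 1 to 1.5 meter range.

```csharp
using UnityEngine;

// Hedged sketch: position a UI panel at a comfortable distance in front of
// the user's head when the app starts. Values here are illustrative.
public class ComfortablePanelPlacement : MonoBehaviour
{
    [SerializeField] private Transform panel;      // the UI panel to place
    [SerializeField] private float distance = 1.2f; // within the 1-1.5 m guideline

    void Start()
    {
        Transform head = Camera.main.transform;    // the rig's center-eye camera
        Vector3 forward = head.forward;
        forward.y = 0f;                            // keep the panel level, not tilted
        forward.Normalize();

        panel.position = head.position + forward * distance;
        panel.rotation = Quaternion.LookRotation(forward); // face the user
    }
}
```

Flattening the Y component keeps the panel at head height and upright even if the user is looking up or down when the scene loads.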
Create a world-space Canvas and add a Panel to serve as the background surface for your UI elements.

In the Hierarchy, right-click and select UI > Canvas. A new Canvas appears in the Hierarchy.

Avoid pure white (#FFFFFF) and pure black (#000000) in your UI. They cause eye strain in Passthrough or MR. Use a maximum background brightness of #DADADA for light themes. Colors appear more saturated in-headset than on your monitor, so always verify on the actual device. See Display Guidelines for more details.

Add interactive UI components to your panel: a button, a slider, and a toggle. Together they give you a feel for the different UI patterns available for hand interaction.

For hand tracking, touch targets must be at least 48dp x 48dp (about 22mm x 22mm) and spaced at least 12mm apart to prevent accidental touches. The button size of 300 x 100 in this guide exceeds that minimum, but keep these numbers in mind as you add more elements. For text, use a minimum font size of 14px (18px recommended for comfortable reading) and maintain a 4.5:1 contrast ratio. See Hands Interaction Types and Fonts and Icons for detailed specs.

  1. Right-click on the Panel and select UI > Button - TextMeshPro.
  2. In the Inspector, set the Rect Transform Width to 300 and Height to 100 to make it a comfortable size for hand interaction.
  3. You can change the button label text by expanding the button in the Hierarchy and selecting the text child object.
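The steps above can also be sketched as a script, which helps when you generate buttons procedurally. The `button` field and the "Press Me" label are illustrative assumptions; the sizes echo the guidance above.

```csharp
using TMPro;
using UnityEngine;
using UnityEngine.UI;

// Hedged sketch: steps 2-3 done from script. Assumes the Button created in
// step 1 is assigned in the Inspector.
public class ButtonSetup : MonoBehaviour
{
    [SerializeField] private Button button;

    void Start()
    {
        // Step 2: a 300 x 100 rect comfortably exceeds the 48dp minimum touch target.
        var rt = button.GetComponent<RectTransform>();
        rt.sizeDelta = new Vector2(300, 100);

        // Step 3: the label is a TextMeshPro child of the Button.
        var label = button.GetComponentInChildren<TMP_Text>();
        label.text = "Press Me";                   // illustrative label
        label.fontSize = 18;                       // 18px recommended for comfortable reading
    }
}
```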


For production-quality UI, consider using the Meta Horizon OS UI Set prefabs instead of standard Unity UI. Find them in Project > Packages > Meta XR Interaction SDK Essentials > Runtime > Sample > Objects > UISet > Prefabs. These include pre-styled buttons, sliders, toggles, and more with built-in interaction support.

Now connect the Interaction SDK to your Canvas so you can poke buttons with your fingertips and use ray interactions from a distance.

Always support both poke and ray on UI surfaces. Don't rely on only one. Some users prefer direct touch, others prefer pointing from a distance, and some may have accessibility needs that favor one over the other. Also, hands have no haptic feedback unlike controllers, so provide clear audio and visual cues (color changes, animations, sounds) on every interaction. See Accessibility Guidelines for inclusive design practices.

  1. Right-click on the Canvas in the Hierarchy and select Interaction SDK > Add Poke Interaction to Canvas.
  2. The Poke wizard appears. If it flags a missing PointableCanvasModule, click Fix. This adds the PointableCanvasModule to your EventSystem, which routes touch events to your Canvas.
  3. Click Create. You can now poke buttons, drag sliders, and tap toggles with your fingertips.
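Because hands give no haptic feedback, it helps to wire audio and visual cues onto every interactive element. The following is a minimal sketch, assuming an `AudioSource` and the button's background `Image` are assigned in the Inspector; the pressed color is an arbitrary example.

```csharp
using UnityEngine;
using UnityEngine.UI;

// Hedged sketch: pair each button press with an audio and a visual cue,
// since hands have no haptic feedback. References are assigned in the Inspector.
public class ButtonFeedback : MonoBehaviour
{
    [SerializeField] private Button button;
    [SerializeField] private AudioSource clickSound; // assumption: a click clip is set
    [SerializeField] private Image background;       // the button's background image

    void Awake()
    {
        button.onClick.AddListener(() =>
        {
            if (clickSound != null) clickSound.Play();
            if (background != null)
                background.color = new Color(0.85f, 0.85f, 0.85f); // example pressed tint
        });
    }
}
```

For production UI, the Button component's built-in color transitions plus a sound on `onClick` usually cover both cues without extra code.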


Deploy and test

You’ve built a hand-tracked UI and now it’s time to test it. Pick the method that fits your setup.
Meta XR Simulator is the fastest way to iterate, and no headset is required. It runs your app directly in the Unity Editor, using keyboard and mouse input to approximate head and hand tracking. It works on both Windows and macOS (Apple Silicon only, Unity OpenXR Plugin v1.13.0+, Meta XR SDK v66+).
Best for: Rapid prototyping, layout testing, verifying interactions work, macOS development.
Limitations: Input is approximated (not real hand tracking), rendering and performance won’t match the actual headset. You must still do a final on-device test before shipping.

Click the XR Simulator icon next to the Play button in Unity's top toolbar, or go to Meta > Meta XR Simulator > Activate. You'll see a console log confirming [Meta XR Simulator is activated].

Activate Meta XR Simulator.

Check to see if Meta XR Simulator is activated.


Meta XR Simulator supports additional features beyond what this guide covers, including controller emulation, room configuration, and synthetic environments. See the Meta XR Simulator overview to learn more.

Next steps

Designing for hands

Explore the design guidelines to refine your hand-based experience and understand the principles behind great hand interactions.
  • Hands Overview — How hand tracking works in Meta Horizon OS and how to think about designing for hands
  • Hands Interaction Types — Deep dive into poke, ray, grab, and when to use each
  • Hands Best Practices — Ergonomic guidelines, common pitfalls, and do’s and don’ts
  • Hands Examples — Real app examples of hands-based interactions for inspiration
  • Hand Representation — How to render hands in your app (system hands, custom models, transparency)
  • Panels — Panel sizing, placement, and layout for spatial UI
  • Accessibility — Inclusive design for diverse hand sizes, abilities, and input preferences

Developing with the Interaction SDK

Go deeper with the Interaction SDK to add advanced interactions, custom gestures, and grabbable objects to your prototype.