Input - Controllers and Hand Tracking
Updated: Apr 14, 2026
Meta Quest devices support multiple input methods, from tracked controllers to hand tracking. The following pages cover each input method and related features.
- Controller Input and Tracking
The Meta XR plugin provides a unified input API for controller input and tracking, including Enhanced Input support.
- Touch Controller
Touch controllers are the standard tracked controllers for Meta Quest headsets.
- Hand Tracking
Hand Tracking lets users’ hands act as input devices. It can be enabled on its own or alongside controllers.
- Interaction SDK
The Interaction SDK (ISDK) provides pre-built interaction components for poke, grab, and ray interactions.
- Haptic Feedback
Haptic feedback lets you trigger vibration effects on Touch controllers through Unreal Blueprints.
- Controller Input Mapping
Controller actions, such as thumbstick presses, map to corresponding events that you can handle in Blueprints.
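As a rough illustration of how controller input and haptics come together in Unreal C++, the sketch below binds an Enhanced Input action and plays a haptic effect when it fires. `BindAction`, `ETriggerEvent`, and `PlayHapticEffect` are standard Unreal Engine APIs, but the class, action, and asset names (`AQuestPawn`, `GrabAction`, `GrabHaptic`) are hypothetical placeholders, not names from the Meta XR plugin.

```cpp
// Illustrative sketch only: AQuestPawn, GrabAction, and GrabHaptic are
// placeholder names; wire up your own pawn, UInputAction asset, and
// UHapticFeedbackEffect asset in your project.
#include "EnhancedInputComponent.h"
#include "InputActionValue.h"
#include "GameFramework/Pawn.h"
#include "GameFramework/PlayerController.h"

void AQuestPawn::SetupPlayerInputComponent(UInputComponent* PlayerInputComponent)
{
    Super::SetupPlayerInputComponent(PlayerInputComponent);

    // With Enhanced Input, controller buttons and axes are exposed as
    // UInputAction assets; bind a handler to the Triggered event.
    if (UEnhancedInputComponent* Input = Cast<UEnhancedInputComponent>(PlayerInputComponent))
    {
        Input->BindAction(GrabAction, ETriggerEvent::Triggered, this, &AQuestPawn::OnGrab);
    }
}

void AQuestPawn::OnGrab(const FInputActionValue& Value)
{
    // Respond with a vibration on the right Touch controller.
    if (APlayerController* PC = Cast<APlayerController>(GetController()))
    {
        PC->PlayHapticEffect(GrabHaptic, EControllerHand::Right, /*Scale=*/1.0f);
    }
}
```

The same binding and haptic call can equally be done in Blueprints, as the pages above describe; the C++ form is shown only to make the action-to-event mapping concrete.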
- Design Guidelines
Design guidelines are Meta’s human interface standards and design frameworks that help you create immersive and passthrough experiences that are safe, user-oriented, and keep users coming back.
- Input modalities: An overview of the available input modalities.
- Head: Design and UX best practices for head input.
- Hands: Design and UX best practices for using hands.
- Controllers: Design and UX best practices for using controllers.
- Voice: Design and UX best practices for using voice.
- Peripherals: Design and UX best practices for using peripherals.