
Meta Quest Touch Controller

Updated: Apr 14, 2026

Overview

Meta Quest Touch controllers are the standard input devices for Meta Quest headsets, providing 6DoF tracked input with buttons, triggers, thumbsticks, touch sensors, and haptics. Three generations of Touch controllers exist, each adding advanced sensing capabilities.
The Meta XR Plugin exposes Touch controller inputs as FKey entries with the OculusTouch_ prefix (for example, OculusTouch_Left_Trigger_Axis, OculusTouch_Right_A_Click). Standard inputs available across all controller types include:
  • Trigger axis and grip axis
  • Thumbstick X/Y
  • Face buttons (X/Y on the left controller, A/B on the right)
  • Menu button
  • Capacitive touch sensors for thumbsticks, triggers, face buttons, index pointing, thumb up, and thumb rest
You can detect the active controller type at runtime by calling UOculusXRFunctionLibrary::GetControllerType().
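As a sketch of how that runtime check might be used, the snippet below branches on the detected controller generation. It assumes the Meta XR plugin is enabled and that `GetControllerType` takes an `EControllerHand` parameter and returns an `EOculusXRControllerType`; the exact enum value names and the `AMyPawn` class are illustrative and may differ between plugin versions.

```cpp
// Sketch: branch on the active controller generation at runtime.
// Header and enum names assume the Meta XR plugin; verify against
// your installed plugin version.
#include "OculusXRFunctionLibrary.h"

void AMyPawn::ConfigureForControllerType()
{
    const EOculusXRControllerType Type =
        UOculusXRFunctionLibrary::GetControllerType(EControllerHand::Right);

    switch (Type)
    {
    case EOculusXRControllerType::MetaQuestTouchPro:
        // Enable Touch Pro-only paths, e.g. localized haptics.
        break;
    case EOculusXRControllerType::MetaQuestTouchPlus:
        // Enable trigger force sensing paths.
        break;
    default:
        // Fall back to inputs common to all Touch controllers.
        break;
    }
}
```

Checking the right hand is usually sufficient, since both controllers of a pair belong to the same generation.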

Controller generations

Three generations of Meta Quest Touch controllers are available:
Meta Quest Touch — Shipped with Meta Quest 2. Provides 6DoF tracking, analog triggers and grips, thumbsticks, A/B/X/Y buttons, capacitive touch sensors, and haptic feedback.
Meta Quest Touch Pro — Shipped with Meta Quest Pro. Adds pressure-sensitive thumb rest force sensing, a stylus force sensor, trigger curl and slide sensing, and localized haptics with Thumb and Index zone targeting via EOculusXRHandHapticsLocation.
Meta Quest Touch Plus — Shipped with Meta Quest 3 and Meta Quest 3S. Adds trigger force sensing and trigger curl and slide sensing.

Handling controller input

Map Touch controller inputs using Unreal Engine's Enhanced Input system: create UInputAction and UInputMappingContext assets, then bind the OculusTouch_ keys to actions. For a full list of available input keys and their mappings, see Controller Input Mapping.
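While mappings are typically authored as assets in the editor, the same binding can be sketched in C++ with the standard Enhanced Input API. The `TriggerAction`, `TouchContext`, and `OnTrigger` members below are hypothetical properties on an example pawn; the key name string assumes the Meta XR plugin's OculusTouch_ naming.

```cpp
// Sketch: binding the right trigger axis through Enhanced Input in C++.
// TriggerAction (UInputAction*, Axis1D value type) and TouchContext
// (UInputMappingContext*) are assumed to be UPROPERTY members.
#include "EnhancedInputComponent.h"
#include "EnhancedInputSubsystems.h"
#include "InputAction.h"
#include "InputMappingContext.h"

void AMyPawn::SetupPlayerInputComponent(UInputComponent* PlayerInputComponent)
{
    Super::SetupPlayerInputComponent(PlayerInputComponent);

    // Map the Meta XR trigger axis key to the action.
    TouchContext->MapKey(TriggerAction, FKey(TEXT("OculusTouch_Right_Trigger_Axis")));

    // Register the mapping context with the local player's input subsystem.
    if (APlayerController* PC = Cast<APlayerController>(GetController()))
    {
        if (auto* Subsystem = ULocalPlayer::GetSubsystem<UEnhancedInputLocalPlayerSubsystem>(
                PC->GetLocalPlayer()))
        {
            Subsystem->AddMappingContext(TouchContext, /*Priority=*/0);
        }
    }

    // Route the action to a handler while the trigger is held.
    if (auto* EIC = Cast<UEnhancedInputComponent>(PlayerInputComponent))
    {
        EIC->BindAction(TriggerAction, ETriggerEvent::Triggered, this, &AMyPawn::OnTrigger);
    }
}
```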

Haptics

Meta Quest Touch controllers support a range of haptic effects including curve-based, buffer-based, amplitude envelope, and sound wave haptic playback. Touch Pro controllers additionally support localized haptics targeting the Thumb and Index zones.
For details on configuring haptic feedback, see Haptic Feedback.
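As a minimal sketch, the engine-level haptics calls on APlayerController work with Touch controllers; curve-based playback uses a haptic effect asset, and direct playback drives frequency and amplitude per frame. The buffer-based, sound wave, and localized Thumb/Index variants mentioned above go through the Meta XR plugin's own functions instead, and the `PlayPickupHaptics` helper here is illustrative.

```cpp
// Sketch: two engine-level ways to drive Touch controller haptics.
#include "GameFramework/PlayerController.h"
#include "Haptics/HapticFeedbackEffect_Curve.h"

void PlayPickupHaptics(APlayerController* PC, UHapticFeedbackEffect_Curve* PickupEffect)
{
    // Curve-based playback: the effect asset defines amplitude and
    // frequency over time.
    PC->PlayHapticEffect(PickupEffect, EControllerHand::Right, /*Scale=*/1.0f);

    // Direct playback: a constant vibration until both values are
    // set back to zero (values are normalized 0..1).
    PC->SetHapticsByValue(/*Frequency=*/0.5f, /*Amplitude=*/0.8f, EControllerHand::Left);
}
```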

Learn more

To learn more about using controllers in XR applications in Unreal Engine, see the following guides:

Design guidelines

Design guidelines are Meta’s human interface standards and design frameworks that help you create safe, user-oriented immersive and passthrough experiences that keep users coming back.

Inputs

  • Input modalities: Explore the different input modalities.
  • Head: Design and UX best practices for head input.
  • Hands: Design and UX best practices for using hands.
  • Controllers: Design and UX best practices for using controllers.
  • Voice: Design and UX best practices for using voice.
  • Peripherals: Design and UX best practices for using peripherals.

Core interactions

  • Input mappings: Understand how input mappings bridge modalities and interaction types.
  • Input hierarchy: Understand the input hierarchy.
  • Multimodality: Understand multimodality.
  • Ray casting: Understand indirect interaction through ray casting.
  • Touch: Understand direct interaction through touch.
  • Grab: Understand grab interactions for object manipulation.
  • Microgestures: Understand microgesture interactions.