Oculus Touch Controller

Updated: Mar 2, 2026

Overview

Touch controllers are the standard tracked controllers for Rift headsets.

Key Components

Handling Controller Input

In UE 4.25 and later, you can map inputs in the Unreal Engine project settings. In older versions of UE, use Blueprints to access the input events generated by the Touch controllers. For more information, see Controller Input Mapping.
Haptics for Touch Controllers describes how to use Unreal Blueprints to control haptic effects on Touch controllers.
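As a minimal sketch of what mapping inputs in the project settings looks like on disk: the editor writes action and axis mappings into your project's Config/DefaultInput.ini. The mapping names (GrabRight, TriggerAxisRight) below are hypothetical examples, and the Oculus Touch key identifiers shown are assumptions based on the naming scheme Unreal uses for Touch controller keys; verify the exact key names in your engine version's input settings picker.

```ini
; Config/DefaultInput.ini (written by Project Settings > Input)
[/Script/Engine.InputSettings]
; Example action mapping: fire when the right grip button is pressed.
; "GrabRight" is a hypothetical action name; key name is an assumption.
+ActionMappings=(ActionName="GrabRight",Key=OculusTouch_Right_Grip_Click)
; Example axis mapping: read the right trigger as a 0..1 analog axis.
+AxisMappings=(AxisName="TriggerAxisRight",Key=OculusTouch_Right_Trigger_Axis,Scale=1.0)
```

Once mapped, the actions and axes appear as input event nodes in Blueprints (or can be bound in C++ via the input component), the same way as any other mapped input.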

Learn more

To learn more about using controllers in XR applications in Unreal Engine, see the following guides:

Design guidelines

Design guidelines are Meta’s human interface standards and design frameworks that help you create safe, user-oriented, and engaging immersive and passthrough experiences.

Inputs

  • Input modalities: Explore the different input modalities.
  • Head: Design and UX best practices for head input.
  • Hands: Design and UX best practices for using hands.
  • Controllers: Design and UX best practices for using controllers.
  • Voice: Design and UX best practices for using voice.
  • Peripherals: Design and UX best practices for using peripherals.

Core interactions

  • Input mappings: Understand how input mappings bridge modalities and interaction types.
  • Input hierarchy: Understand the input hierarchy.
  • Multimodality: Understand multimodality.
  • Ray casting: Understand indirect interaction through ray casting.
  • Touch: Understand direct interaction through touch.
  • Grab: Understand grab interactions for object manipulation.
  • Microgestures: Understand microgesture interactions.