Oculus Touch Controller
Updated: Mar 2, 2026
Touch controllers are the standard tracked controllers for Rift headsets.
In UE 4.25 and later, you can map inputs in the Unreal Engine project settings. In older versions of UE, use Blueprints to access input events generated by the Touch controllers. For more information, see Controller Input Mapping.
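As a rough sketch, mappings created in the project settings are stored in your project's Config/DefaultInput.ini. The action and axis names below (GrabLeft, TeleportRight, SqueezeLeft, MoveForward) are hypothetical examples, and the OculusTouch_* key names follow the pattern the engine uses for Touch controller keys; verify the exact key names in the Input section of your project settings for your engine version.

```ini
; Config/DefaultInput.ini -- illustrative fragment with assumed key names
[/Script/Engine.InputSettings]
; Digital (pressed/released) inputs go in ActionMappings
+ActionMappings=(ActionName="GrabLeft",Key=OculusTouch_Left_Grip_Click)
+ActionMappings=(ActionName="TeleportRight",Key=OculusTouch_Right_A_Click)
; Analog inputs (triggers, thumbsticks) go in AxisMappings
+AxisMappings=(AxisName="SqueezeLeft",Key=OculusTouch_Left_Trigger_Axis,Scale=1.0)
+AxisMappings=(AxisName="MoveForward",Key=OculusTouch_Left_Thumbstick_Y,Scale=1.0)
```

Once mapped, these names surface in Blueprints as InputAction and InputAxis event nodes (for example, InputAction GrabLeft), so gameplay logic binds to the named action rather than to a specific controller key.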
To learn more about using controllers in XR applications in Unreal Engine, see the following guides:
Design guidelines are Meta’s human interface standards and design frameworks that help you create immersive and passthrough experiences that are safe, user-oriented, and keep users coming back.
- Input modalities: Explore the different input modalities.
- Head: Design and UX best practices for head input.
- Hands: Design and UX best practices for using hands.
- Controllers: Design and UX best practices for using controllers.
- Voice: Design and UX best practices for using voice.
- Peripherals: Design and UX best practices for using peripherals.
- Input mappings: Understand how input mappings bridge modalities and interaction types.
- Input hierarchy: Understand the input hierarchy.
- Multimodality: Understand multimodality.
- Ray casting: Understand indirect interaction through ray casting.
- Touch: Understand direct interaction through touch.
- Grab: Understand grab interactions for object manipulation.
- Microgestures: Understand microgesture interactions.