Develop

Controller Input and Tracking

Updated: Mar 2, 2026

What is Controller Input and Tracking?

In XR applications in Unreal Engine, controllers can be used to accept user input, such as button presses or joystick movement, and track the movement of a user’s hands in space to perform interactions or gestures. Controllers provide a powerful yet familiar interface for users to interact inside of XR applications.
To streamline the process of adding controller-based interactions, such as grabbing objects, moving around the scene, or setting up user interfaces, see the Interaction SDK.

How does Controller Input and Tracking work?

The Meta XR plugin for Unreal Engine exposes a unified input API for multiple controller types. It is used to query virtual or raw controller state, such as buttons, thumbsticks, triggers, and capacitive touch data.
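As a rough sketch of what querying raw controller state can look like, the snippet below polls trigger and button values from a Pawn's Tick using Unreal's standard `APlayerController` input-state API. The `OculusTouch_*` key names are assumptions based on the keys the Meta XR plugin typically registers; verify them against the key list in your plugin version.

```cpp
// Sketch: polling raw controller state each frame from a Pawn.
// The OculusTouch key names are assumed; check your plugin's key list.
#include "GameFramework/Pawn.h"
#include "GameFramework/PlayerController.h"

void AMyXRPawn::Tick(float DeltaSeconds)
{
    Super::Tick(DeltaSeconds);

    if (APlayerController* PC = Cast<APlayerController>(GetController()))
    {
        // Analog trigger value, typically in the range [0, 1].
        const float RightTrigger =
            PC->GetInputAnalogKeyState(FKey("OculusTouch_Right_Trigger_Axis"));

        // Digital button state (pressed or not).
        const bool bAPressed =
            PC->IsInputKeyDown(FKey("OculusTouch_Right_A_Click"));

        if (bAPressed && RightTrigger > 0.5f)
        {
            // React to the combined input here.
        }
    }
}
```

Polling like this is useful for quick tests; for production input handling, the event-driven Enhanced Input system described below is the recommended path.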

How do I set up Controller Input and Tracking?

You must have the Meta XR plugin for Unreal Engine installed and enabled in your project to use controller input and tracking. The plugin is available on the Unreal Engine Marketplace.
Controller tracking uses specialized components attached to a Pawn Actor. These components provide an API to access tracking data, including the position and rotation of each controller. You can access position and rotation for the grip, palm, and aim poses of the controllers.
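A minimal sketch of this setup is shown below, assuming the standard `UMotionControllerComponent` and motion-source names such as "LeftGrip" and "LeftAim" (the exact names exposed for grip, palm, and aim may vary by engine and plugin version). `AMyXRPawn`, `LeftGrip`, and `LeftAim` are placeholder names.

```cpp
// Sketch: attaching motion controller components to a Pawn so their
// transforms track the physical controllers. Motion-source names are
// assumed; confirm them in your engine/plugin version.
#include "GameFramework/Pawn.h"
#include "MotionControllerComponent.h"

AMyXRPawn::AMyXRPawn()
{
    RootComponent = CreateDefaultSubobject<USceneComponent>(TEXT("VROrigin"));

    // One component per tracked pose you want to follow.
    LeftGrip = CreateDefaultSubobject<UMotionControllerComponent>(TEXT("LeftGrip"));
    LeftGrip->SetupAttachment(RootComponent);
    LeftGrip->MotionSource = FName(TEXT("LeftGrip"));

    LeftAim = CreateDefaultSubobject<UMotionControllerComponent>(TEXT("LeftAim"));
    LeftAim->SetupAttachment(RootComponent);
    LeftAim->MotionSource = FName(TEXT("LeftAim"));
}
```

Once attached, each component's world transform follows the corresponding pose, so `LeftGrip->GetComponentLocation()` and `LeftGrip->GetComponentRotation()` return the tracked position and rotation each frame.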
Mappings of hardware inputs to events you can respond to in Blueprints or C++ are created using the Enhanced Input system in Unreal Engine. All available inputs for Meta Quest controllers are made available to this system by the Meta XR plugin. For more information on setting up input mapping in Unreal Engine, see the Enhanced Input documentation.
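The C++ side of such a mapping might look like the sketch below: an Input Mapping Context is registered with the Enhanced Input local player subsystem, and an Input Action is bound to a handler. `GrabMappingContext`, `GrabAction`, and `OnGrab` are placeholder names for assets and a handler you would create yourself; the mapping from a controller button to the action is configured in the editor.

```cpp
// Sketch: wiring a controller input through Enhanced Input.
// GrabMappingContext / GrabAction are placeholder UPROPERTY assets
// (UInputMappingContext* / UInputAction*) created in the editor.
#include "EnhancedInputComponent.h"
#include "EnhancedInputSubsystems.h"

void AMyXRPawn::SetupPlayerInputComponent(UInputComponent* PlayerInputComponent)
{
    Super::SetupPlayerInputComponent(PlayerInputComponent);

    // Register the mapping context so its key-to-action mappings are active.
    if (APlayerController* PC = Cast<APlayerController>(GetController()))
    {
        if (auto* Subsystem = ULocalPlayer::GetSubsystem<UEnhancedInputLocalPlayerSubsystem>(
                PC->GetLocalPlayer()))
        {
            Subsystem->AddMappingContext(GrabMappingContext, /*Priority=*/0);
        }
    }

    // Bind the action to a handler; fires when the mapped button is pressed.
    if (auto* EIC = Cast<UEnhancedInputComponent>(PlayerInputComponent))
    {
        EIC->BindAction(GrabAction, ETriggerEvent::Started, this, &AMyXRPawn::OnGrab);
    }
}
```

Keeping the hardware-to-action mapping in editor assets means you can remap controller buttons, or support other input devices, without touching this code.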

Learn more

To learn more about using controllers in XR applications in Unreal Engine, see the following guides:

Design guidelines

Design guidelines are Meta’s human interface standards and design frameworks that help you create safe, user-oriented, and retainable immersive and passthrough user experiences.

Inputs

  • Input modalities: Explore the different input modalities.
  • Head: Design and UX best practices for head input.
  • Hands: Design and UX best practices for using hands.
  • Controllers: Design and UX best practices for using controllers.
  • Voice: Design and UX best practices for using voice.
  • Peripherals: Design and UX best practices for using peripherals.

Core interactions

  • Input mappings: Understand how input mappings bridge modalities and interaction types.
  • Input hierarchy: Understand the input hierarchy.
  • Multimodality: Understand multimodality.
  • Ray casting: Understand indirect interaction through ray casting.
  • Touch: Understand direct interaction through touch.
  • Grab: Understand grab interactions for object manipulation.
  • Microgestures: Understand microgesture interactions.