
Controller Input and Tracking Overview

Updated: Nov 10, 2025

What is Controller Input and Tracking?

In XR applications in Unity, controllers can be used to accept user input, such as button presses or joystick movement, and track the movement of a user’s hands in space to perform interactions or gestures. Controllers provide a powerful yet familiar interface for users to interact in XR applications.
Note
To streamline the process of adding controller-based interactions, such as grabbing objects, moving around the scene, or setting up user interfaces, see the Interaction SDK Overview.

How does Controller Input and Tracking work?

OVRInput exposes a unified input API for multiple controller types. It is used to query virtual or raw controller state, such as buttons, thumbsticks, triggers, and capacitive touch data. It supports the Meta Quest Touch controllers.
For keyboard and mouse control, we recommend using the UnityEngine.Input scripting API (see Unity’s Input scripting reference for more information).
Mobile input bindings are automatically added to InputManager.asset if they do not already exist.
For more information, go to the OVRInput class reference. In addition, you can learn more about Unity’s Input System and Input Manager.

Touch Tracking

OVRInput provides Touch controller position and orientation data through GetLocalControllerPosition() and GetLocalControllerRotation(), which return a Vector3 and Quaternion, respectively.
Controller poses are returned by the tracking system and are predicted simultaneously with the headset. These poses are reported in the same coordinate frame as the headset, relative to the initial center eye pose, and can be used for rendering hands or objects in the 3D world. They are also reset by OVRManager.display.RecenterPose(), similar to the head and eye poses.
Note: Meta Quest Touch controllers are differentiated with Primary and Secondary in OVRInput: Primary always refers to the left controller and Secondary always refers to the right controller.
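As a sketch, the pose data described above can drive a scene object each frame. The component and field names here are illustrative, not part of the SDK; it assumes an OVRManager instance exists in the scene so tracking data is updated.

```csharp
using UnityEngine;

// Sketch: mirror a Touch controller's tracked pose onto a scene object.
// Assumes an OVRManager instance is present in the scene.
public class ControllerPoseFollower : MonoBehaviour
{
    // Set to LTouch or RTouch in the Inspector.
    public OVRInput.Controller controller = OVRInput.Controller.LTouch;

    void Update()
    {
        // Poses are reported relative to the initial center eye pose, so parent
        // this object under the tracking space (e.g., the OVRCameraRig) so the
        // local pose lands in the correct place in the world.
        transform.localPosition = OVRInput.GetLocalControllerPosition(controller);
        transform.localRotation = OVRInput.GetLocalControllerRotation(controller);
    }
}
```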

How do I set up Controller Input and Tracking?

To access the controller input and tracking functionality, include the following in your Unity project:
  • An instance of OVRManager, placed anywhere in the scene.
  • A call to OVRInput.Update() and OVRInput.FixedUpdate() once per frame, at the beginning of any component’s Update and FixedUpdate methods, respectively.
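The per-frame calls can be forwarded by a minimal component like the sketch below. Note that if an OVRManager instance is already driving these updates, this component is redundant; the class name here is hypothetical.

```csharp
using UnityEngine;

// Minimal sketch of the per-frame calls OVRInput requires.
public class OVRInputUpdater : MonoBehaviour
{
    void Update()
    {
        OVRInput.Update(); // refresh button/axis state at the start of the frame
    }

    void FixedUpdate()
    {
        OVRInput.FixedUpdate(); // keep input state coherent for physics-rate code
    }
}
```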

OVRInput Usage

The primary usage of OVRInput is to access controller input state through Get(), GetDown(), and GetUp().
  • Get() queries the current state of a control.
  • GetDown() queries whether a control was pressed this frame.
  • GetUp() queries whether a control was released this frame.

Control Input Enumerations

There are multiple variations of Get() that provide access to different sets of controls. These sets of controls are exposed through enumerations defined by OVRInput as follows:
  • OVRInput.Button: Traditional buttons found on gamepads, controllers, and the back button.
  • OVRInput.Touch: Capacitive-sensitive control surfaces found on the controller.
  • OVRInput.NearTouch: Proximity-sensitive control surfaces found on the controller.
  • OVRInput.Axis1D: One-dimensional controls, such as triggers, that report a floating point state.
  • OVRInput.Axis2D: Two-dimensional controls, such as thumbsticks, that report a Vector2 state.
A secondary set of enumerations mirrors the first, defined as follows:
  • OVRInput.RawButton
  • OVRInput.RawTouch
  • OVRInput.RawNearTouch
  • OVRInput.RawAxis1D
  • OVRInput.RawAxis2D
The first set of enumerations provides a virtualized input mapping that is intended to assist developers with creating control schemes that work across different types of controllers. The second set of enumerations provides raw unmodified access to the underlying state of the controllers. We recommend using the first set of enumerations because the virtual mapping provides useful functionality, as demonstrated below.

Button, Touch, and NearTouch

In addition to traditional gamepad buttons, the controllers feature capacitive-sensitive control surfaces which detect when the user’s fingers or thumbs make physical contact (Touch), as well as when they are in close proximity (NearTouch). This allows for detecting several distinct states of a user’s interaction with a specific control surface. For example, if a user’s index finger is fully removed from a control surface, the NearTouch for that control will report false. As the user’s finger approaches the control and gets within close proximity to it, the NearTouch will report true prior to the user making physical contact. When the user makes physical contact, the Touch for that control will report true. When the user pushes the index trigger down, the Button for that control will report true. These distinct states can be used to accurately detect the user’s interaction with the controller and enable a variety of control schemes.
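The progression described above can be sketched as a small state check. This helper is illustrative only; the enumeration members used are the virtual-mapping names for the right index trigger.

```csharp
// Sketch: classify the user's engagement with the right index trigger.
// NearTouch -> Touch -> Button report progressively closer interaction.
string DescribeIndexFinger()
{
    if (OVRInput.Get(OVRInput.Button.SecondaryIndexTrigger))
        return "pressed";   // trigger pulled past the button threshold
    if (OVRInput.Get(OVRInput.Touch.SecondaryIndexTrigger))
        return "touching";  // finger resting on the trigger
    if (OVRInput.Get(OVRInput.NearTouch.SecondaryIndexTrigger))
        return "hovering";  // finger near the trigger, but not touching it
    return "away";          // finger fully removed from the control surface
}
```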

Example Usage

// returns true if the primary button (typically “A”) is currently pressed.
OVRInput.Get(OVRInput.Button.One);

// returns true if the primary button (typically “A”) was pressed this frame.
OVRInput.GetDown(OVRInput.Button.One);

// returns true if the “X” button was released this frame.
OVRInput.GetUp(OVRInput.RawButton.X);

// returns a Vector2 of the primary (typically the Left) thumbstick’s current state.
// (X/Y range of -1.0f to 1.0f)
OVRInput.Get(OVRInput.Axis2D.PrimaryThumbstick);

// returns true if the primary thumbstick is currently pressed (clicked as a button)
OVRInput.Get(OVRInput.Button.PrimaryThumbstick);

// returns true if the primary thumbstick has been moved upwards more than halfway.
// (Up/Down/Left/Right - Interpret the thumbstick as a D-pad).
OVRInput.Get(OVRInput.Button.PrimaryThumbstickUp);

// returns a float of the secondary (typically the Right) index finger trigger’s current state.
// (range of 0.0f to 1.0f)
OVRInput.Get(OVRInput.Axis1D.SecondaryIndexTrigger);

// returns a float of the left index finger trigger’s current state.
// (range of 0.0f to 1.0f)
OVRInput.Get(OVRInput.RawAxis1D.LIndexTrigger);

// returns true if the left index finger trigger has been pressed more than halfway.
// (Interpret the trigger as a button).
OVRInput.Get(OVRInput.RawButton.LIndexTrigger);

// returns true if the secondary gamepad button, typically “B”, is currently touched by the user.
OVRInput.Get(OVRInput.Touch.Two);
In addition to specifying a control, Get() also takes an optional controller parameter. The list of supported controllers is defined in the OVRInput.Controller enumeration.
Specifying a controller can be used if a particular control scheme is intended only for a certain controller type. If no controller parameter is provided to Get(), the default is to use the Active controller, which corresponds to the controller that most recently reported user input. For example, a user may use a pair of controllers, set them down, and pick up an Xbox controller, in which case the Active controller will switch to the Xbox controller once the user provides input with it. The current Active controller can be queried with OVRInput.GetActiveController() and a bitmask of all the connected Controllers can be queried with OVRInput.GetConnectedControllers().
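As a sketch, the active and connected controllers can be inspected before choosing a control scheme. GetConnectedControllers() returns a bitmask, so individual controller types are tested with a bitwise AND:

```csharp
// Sketch: inspect which controllers are active and connected.
OVRInput.Controller active = OVRInput.GetActiveController();
OVRInput.Controller connected = OVRInput.GetConnectedControllers();

// Test bitmask membership with bitwise AND.
bool touchConnected =
    (connected & OVRInput.Controller.Touch) == OVRInput.Controller.Touch;

if (active == OVRInput.Controller.Gamepad)
{
    // Fall back to a gamepad-oriented control scheme.
}
```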
Example Usage:
// returns a float of the hand trigger’s current state on the left controller.
OVRInput.Get(OVRInput.Axis1D.PrimaryHandTrigger, OVRInput.Controller.Touch);

// returns a float of the hand trigger’s current state on the right controller.
OVRInput.Get(OVRInput.Axis1D.SecondaryHandTrigger, OVRInput.Controller.Touch);
Note: Meta Quest Touch controllers can be specified either as the combined pair (with OVRInput.Controller.Touch), or individually (with OVRInput.Controller.LTouch and RTouch). This is significant because specifying LTouch or RTouch uses a different set of virtual input mappings that allow more convenient development of hand-agnostic input code.
Example Usage:
// returns a float of the hand trigger’s current state on the left controller.
OVRInput.Get(OVRInput.Axis1D.PrimaryHandTrigger, OVRInput.Controller.LTouch);

// returns a float of the hand trigger’s current state on the right controller.
OVRInput.Get(OVRInput.Axis1D.PrimaryHandTrigger, OVRInput.Controller.RTouch);
This can be taken a step further to allow the same code to be used for either hand by specifying the controller in a variable that is set externally, such as on a public variable in Unity Editor.
Example Usage:
// public variable that can be set to LTouch or RTouch in the Unity Inspector
public OVRInput.Controller controller;

// returns a float of the hand trigger’s current state on the controller
// specified by the controller variable.
OVRInput.Get(OVRInput.Axis1D.PrimaryHandTrigger, controller);

// returns true if the primary button (“A” or “X”) is pressed on the controller
// specified by the controller variable.
OVRInput.Get(OVRInput.Button.One, controller);
This is convenient because it avoids the common pattern of if/else checks for left- and right-hand input mappings.

Touch Input Mapping

The following diagrams illustrate common input mappings for the controllers. For more information on additional mappings that are available, refer to the OVRInput class reference.

Virtual Mapping (Accessed as a Combined Controller)

When accessing controllers as a combined pair with OVRInput.Controller.Touch, the virtual mapping closely matches the layout of a typical gamepad split across the left and right hands.
Combined controller mapping diagram

Virtual Mapping (Accessed as Individual Controllers)

When accessing the left or right controller individually with OVRInput.Controller.LTouch or OVRInput.Controller.RTouch, the virtual mapping changes to allow for hand-agnostic input bindings. For example, the same script can dynamically query the left or right controller depending on which hand it is attached to, and Button.One is mapped appropriately to either the A or X button.
Individual controller mapping diagram

Raw Mapping

The raw mapping directly exposes the unmodified state of the controllers. The layout closely matches that of a typical gamepad split across the left and right hands.
Raw controller mapping diagram

Learn more

To learn more about using controllers in XR applications in Unity, see the following guides:

Design guidelines

  • Input modalities: Learn about the different input modalities.
  • Head: Learn about design and UX best practices for using a head input.
  • Hands: Learn about design and UX best practices for using hands.
  • Controllers: Learn about design and UX best practices for using controllers.
  • Voice: Learn about design and UX best practices for using voice.
  • Peripherals: Learn about design and UX best practices for using peripherals.