In XR applications in Unreal Engine, controllers accept user input, such as button presses or thumbstick movement, and track the movement of a user's hands in space to perform interactions or gestures. Controllers provide a powerful yet familiar interface for users to interact inside XR applications.
Note
To streamline the process of adding controller-based interactions, such as grabbing objects, moving around the scene, or setting up user interfaces, see the Interaction SDK.
How Do Controller Input and Tracking Work?
The Meta XR plugin for Unreal Engine exposes a unified input API for multiple controller types. It is used to query virtual or raw controller state, such as buttons, thumbsticks, triggers, and capacitive touch data.
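As a minimal sketch of polling raw controller state from C++, the generic input-key API on the player controller can be used. The `OculusTouch_*` key names below are assumptions based on the keys the Meta XR plugin registers; verify them against the key list shown in your project's input settings before relying on them.

```cpp
// Sketch: polling raw controller state. The OculusTouch_* key names are
// assumptions -- check the key picker in Project Settings > Input for the
// exact names registered by your version of the Meta XR plugin.
#include "GameFramework/PlayerController.h"
#include "InputCoreTypes.h"

void PollControllerState(APlayerController* PC)
{
    if (!PC)
    {
        return;
    }

    // Digital button: is the left-hand X button currently pressed?
    const bool bXPressed =
        PC->IsInputKeyDown(FKey(TEXT("OculusTouch_Left_X_Click")));

    // Analog axis: how far is the left trigger pulled (0.0 to 1.0)?
    const float TriggerAxis =
        PC->GetInputAnalogKeyState(FKey(TEXT("OculusTouch_Left_Trigger_Axis")));

    // Capacitive touch: is the user's thumb resting on the thumbstick?
    const bool bThumbTouched =
        PC->IsInputKeyDown(FKey(TEXT("OculusTouch_Left_Thumbstick_Touch")));

    UE_LOG(LogTemp, Verbose, TEXT("X=%d Trigger=%.2f Touch=%d"),
        bXPressed, TriggerAxis, bThumbTouched);
}
```

For event-driven input rather than per-frame polling, prefer the Enhanced Input system described below.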
How Do I Set Up Controller Input and Tracking?
You must have the Meta XR plugin for Unreal Engine installed and enabled in your project to use controller input and tracking. The plugin is available on the Unreal Engine Marketplace.
Controller tracking uses specialized components attached to a Pawn Actor. These components provide an API for accessing the position and rotation of each controller's grip, palm, and aim poses.
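The component setup above can be sketched in a Pawn constructor using `UMotionControllerComponent`. The motion source names (`LeftGrip`, `LeftAim`, and so on) follow the OpenXR-style pose naming and may differ between plugin versions, so treat them as assumptions and check the `Motion Source` dropdown on the component in the editor.

```cpp
// Sketch of a VR Pawn that creates motion controller components for the
// grip and aim poses of the left hand. Class and member names are
// illustrative; the motion source names are assumed OpenXR pose names.
#include "GameFramework/Pawn.h"
#include "MotionControllerComponent.h"

AMyVRPawn::AMyVRPawn()
{
    // Grip pose: roughly where the controller sits in the user's fist.
    LeftGrip = CreateDefaultSubobject<UMotionControllerComponent>(TEXT("LeftGrip"));
    LeftGrip->SetupAttachment(RootComponent);
    LeftGrip->MotionSource = FName(TEXT("LeftGrip")); // assumed source name

    // Aim pose: where the controller points, useful for ray-based interaction.
    LeftAim = CreateDefaultSubobject<UMotionControllerComponent>(TEXT("LeftAim"));
    LeftAim->SetupAttachment(RootComponent);
    LeftAim->MotionSource = FName(TEXT("LeftAim")); // assumed source name

    // Repeat with "RightGrip" / "RightAim" for the right hand.
}
```

Once tracking is active, each component's world transform follows the controller, so `LeftAim->GetComponentLocation()` and `LeftAim->GetComponentRotation()` give the tracked position and rotation each frame.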
Mappings from hardware input to events that you can respond to in Blueprints or C++ are created using Unreal Engine's Enhanced Input system. The Meta XR plugin exposes all available Meta Quest controller inputs to this system. For more information on setting up input mappings in Unreal Engine, see the Enhanced Input documentation.
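As a minimal sketch of that flow in C++: an Input Mapping Context is registered with the local player's Enhanced Input subsystem, and an Input Action is bound to a handler. The asset names here (`ControllersMappingContext`, `GrabAction`, `OnGrab`) are hypothetical placeholders for a `UInputMappingContext` and `UInputAction` you create in the editor and assign to `UPROPERTY` members on the Pawn.

```cpp
// Sketch: binding an Enhanced Input action on a Pawn. Assumes UPROPERTY
// members ControllersMappingContext (UInputMappingContext*) and GrabAction
// (UInputAction*) have been assigned in the editor.
#include "EnhancedInputComponent.h"
#include "EnhancedInputSubsystems.h"

void AMyVRPawn::SetupPlayerInputComponent(UInputComponent* PlayerInputComponent)
{
    Super::SetupPlayerInputComponent(PlayerInputComponent);

    // Register the mapping context, which maps hardware inputs
    // (e.g. the right trigger) to abstract actions like GrabAction.
    if (APlayerController* PC = Cast<APlayerController>(GetController()))
    {
        if (UEnhancedInputLocalPlayerSubsystem* Subsystem =
                ULocalPlayer::GetSubsystem<UEnhancedInputLocalPlayerSubsystem>(
                    PC->GetLocalPlayer()))
        {
            Subsystem->AddMappingContext(ControllersMappingContext, /*Priority=*/0);
        }
    }

    // Bind the action to a handler; Triggered fires while the input is active.
    if (UEnhancedInputComponent* Input =
            Cast<UEnhancedInputComponent>(PlayerInputComponent))
    {
        Input->BindAction(GrabAction, ETriggerEvent::Triggered,
            this, &AMyVRPawn::OnGrab);
    }
}

void AMyVRPawn::OnGrab(const FInputActionValue& Value)
{
    // Respond to the grab input here.
}
```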
Learn more
To learn more about using controllers in XR applications in Unreal Engine, see the following guides: