Use Simultaneous Hands and Controllers (Multimodal)

Updated: Mar 13, 2026
Multimodal input provides simultaneous tracking of both hands and controllers. It also indicates whether the controller(s) are in hand or not. Multimodal input allows users to combine hands for immersion with controllers for accuracy and haptics. When enabled, multimodal input overrides other transition methods, including auto transitions and double tap.

Benefits of multimodal input

  • Hand and controller gameplay: Use a controller in one hand as a tool or weapon (high accuracy, haptics, wide FoV), while keeping your other hand free to interact with objects or cast spells (high immersion).
  • Single controller gameplay: If your experience requires only one controller for most interactions, let users pick up a single controller and use their free hand for immersion and casual interactions when needed (for example, use a controller as a racket while the free hand picks up the ball and interacts with menus).
  • Use hands and controllers interchangeably for comfort: Use one hand for direct interactions with controls while the other hand holds a controller as a “remote control” for low-effort indirect interactions.
  • Instant transition between hands and controllers: With hand tracking active, we can instantly identify when the controllers are no longer in the hand. This minimizes the need for manual transitions and solves the problems of slow or failed auto transitions.
  • Find my controllers: With controllers tracked even while using hands, the app can show the user where the controllers are when they want to pick them up. This enables a smoother transition back to controllers without breaking immersion by turning on passthrough or taking off the headset.

Known limitations

  • The ‘in hand’ signal is derived from several imperfect inputs, including hand pose, controller pose, and controller signals. As a result, the system may report that a controller is not in hand when tracking is lost or inaccurate, or when controllers are held still for some time. Design with this limitation in mind (for example, avoid dropping a held object because of a brief false transition from controllers to hands).
  • When the pause function is called, the application will switch back into the “non-simultaneous” mode that traditional hands+controllers apps run in, where the user can use either hands or controllers at a given time. The tracking system may take a few seconds to recognize and decide on the correct input to enable, depending on whether the user is holding controllers when this happens.
  • When using Link on PC, pose data for controllers is unavailable when you’re not actively using them (such as when they’re lying on a table).
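The first limitation above can be mitigated in app code by debouncing the raw per-hand input-source reading before reacting to it: only commit a switch after the new reading has been stable for a short hold time. A minimal sketch in plain C++ (FInputSourceDebouncer is an illustrative helper, not part of the OculusXR plugin; the hold time is a tunable assumption):

```cpp
// Debounce raw "hand vs. controller" readings: only commit a switch after
// the new reading has been stable for HoldSeconds, filtering out brief
// false transitions (e.g. a still controller briefly misread as not in hand).
class FInputSourceDebouncer
{
public:
    explicit FInputSourceDebouncer(double InHoldSeconds, bool bInitialIsHand = false)
        : HoldSeconds(InHoldSeconds), bStableIsHand(bInitialIsHand) {}

    // Feed the raw reading each frame; returns the debounced value.
    bool Update(bool bRawIsHand, double NowSeconds)
    {
        if (bRawIsHand == bStableIsHand)
        {
            bHasCandidate = false;            // reading agrees; drop any pending switch
        }
        else if (!bHasCandidate || bRawIsHand != bCandidateIsHand)
        {
            bHasCandidate = true;             // start timing a possible switch
            bCandidateIsHand = bRawIsHand;
            CandidateSince = NowSeconds;
        }
        else if (NowSeconds - CandidateSince >= HoldSeconds)
        {
            bStableIsHand = bCandidateIsHand; // held long enough; commit the switch
            bHasCandidate = false;
        }
        return bStableIsHand;
    }

private:
    double HoldSeconds;
    bool   bStableIsHand;
    bool   bCandidateIsHand = false;
    bool   bHasCandidate = false;
    double CandidateSince = 0.0;
};
```

Gameplay code (such as grab/release logic) then reads the debounced value instead of the raw signal, so a sub-second false transition never drops a held object.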

Compatibility

Hardware compatibility

  • Quest 2
  • Quest Pro
  • Quest 3
  • Quest 3S

Software compatibility

  • Meta XR plugin v74 and above with OpenXR backend
  • Unreal Engine 5.5 and above

Feature compatibility

  • Multimodal input is incompatible with Inside-Out Body Tracking (IOBT) and Full Body Synthesis (FBS). You shouldn’t enable them together.
  • Multimodal input cannot be enabled together with Fast Motion Mode (FMM). If both are enabled together, Multimodal will take precedence. As FMM is defined in the manifest, you may enable FMM at the app level, and then turn on multimodal only in specific experiences where FMM is less important.
  • On Quest 2, Multimodal cannot be enabled together with LipSync.
  • Passthrough, Multimodal input, and Wide Motion Mode (WMM) cannot be enabled together. If all three are enabled, the system disables WMM.
  • Full compatibility with capsense hands.
  • Full compatibility with haptics.

Prerequisites

Requirements:
  • Meta XR SDK: OculusXR plugin with OpenXR backend
  • Hand tracking: Hand Tracking Support set to Controllers And Hands in Project Settings
  • Target device: Meta Quest 2, Quest Pro, Quest 3, or Quest 3S
To configure hand tracking support:
  1. In Unreal Editor, go to Project Settings > OculusXR > Mobile.
  2. Set Hand Tracking Support to Controllers And Hands.
This sets the following in DefaultEngine.ini:
[/Script/OculusXRHMD.OculusXRHMDRuntimeSettings]
XrApi=NativeOpenXR
HandTrackingSupport=ControllersAndHands
Note: There is no Project Settings checkbox to enable multimodal directly. You enable multimodal entirely at runtime through C++ or Blueprints.

Setup

Enable multimodal at runtime

Call SetSimultaneousHandsAndControllersEnabled(true) during BeginPlay or equivalent initialization. The underlying OpenXR extension (XR_META_simultaneous_hands_and_controllers) resets to disabled on each session creation, so you must re-enable it every time.
C++:
#include "OculusXRInputFunctionLibrary.h"

// Enable multimodal
bool bSuccess = UOculusXRInputFunctionLibrary::SetSimultaneousHandsAndControllersEnabled(true);

// Query whether multimodal is currently enabled
bool bEnabled = UOculusXRInputFunctionLibrary::IsSimultaneousHandsAndControllersEnabled();
Blueprints:
Both functions are available in the OculusLibrary > HandTracking category:
  • Set Simultaneous Hands And Controllers Enabled: Enables or disables multimodal. Returns true on success.
  • Is Simultaneous Hands And Controllers Enabled: Returns the current multimodal state.
Important: Multimodal resets to disabled on every new session. Place the enable call in your BeginPlay event to ensure it activates each time the app starts.
Expected result: After calling SetSimultaneousHandsAndControllersEnabled(true), the headset tracks both hands and controllers simultaneously. Setting the user’s controllers down does not disable hand tracking.

Configure controller-driven hand poses

When multimodal is enabled, you can control how hand meshes appear while the user holds a controller. Use SetControllerDrivenHandPoses to choose a pose style.
C++:
#include "OculusXRInputFunctionLibrary.h"

// Set hand pose style when holding a controller
UOculusXRInputFunctionLibrary::SetControllerDrivenHandPoses(
    EOculusXRControllerDrivenHandPoseTypes::Natural);

// Query the current pose type
EOculusXRControllerDrivenHandPoseTypes CurrentType =
    UOculusXRInputFunctionLibrary::GetControllerDrivenHandPoses();
Blueprints:
Both functions are available in the OculusLibrary > Controller category:
  • Set Controller Driven Hand Poses: Sets the pose type.
  • Get Controller Driven Hand Poses: Returns the current pose type.
Pose types:
  • None: Controllers do not generate hand poses.
  • Natural: Controller button inputs generate a natural hand pose.
  • Controller: Controller button inputs generate a hand pose that appears to hold a controller.
Expected result: With Natural, the hand mesh reflects natural finger positions driven by controller button state. With Controller, the hand mesh shows fingers wrapped around a virtual controller shape.

Detect input state

To determine whether a given hand is currently using hand tracking or controller input, call IsHandInteractionProfile:
#include "OculusXRInputFunctionLibrary.h"

bool bLeftHandTracking = UOculusXRInputFunctionLibrary::IsHandInteractionProfile(
    EOculusXRHandType::HandLeft);
This function returns true if the current OpenXR interaction profile for the specified hand is a hand interaction profile, meaning hand tracking is the active input source for that hand.
In Blueprints, use Is Hand Interaction Profile from the OculusLibrary > HandTracking category.
Note: The OculusXR plugin does not expose a direct “is controller in hand” query. IsHandInteractionProfile indicates the active input source for a hand, not specifically whether the physical controller is in-hand versus detached.
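Since there is no direct in-hand query, one practical pattern is to combine the two per-hand profile readings into a single app-level input mode and switch gameplay behavior on that. A minimal sketch in plain C++ (EAppInputMode and ClassifyInputMode are illustrative names, not SDK types; the two booleans would come from IsHandInteractionProfile for each hand):

```cpp
// Collapse the two per-hand readings into one app-level input mode,
// e.g. to drive the "single controller gameplay" scenario described above.
enum class EAppInputMode
{
    BothHands,           // both hands tracked, no controllers in use
    BothControllers,     // both controllers in use
    LeftControllerOnly,  // controller in left hand, right hand free
    RightControllerOnly  // controller in right hand, left hand free
};

EAppInputMode ClassifyInputMode(bool bLeftIsHand, bool bRightIsHand)
{
    if (bLeftIsHand && bRightIsHand)   return EAppInputMode::BothHands;
    if (!bLeftIsHand && !bRightIsHand) return EAppInputMode::BothControllers;
    return bLeftIsHand ? EAppInputMode::RightControllerOnly
                       : EAppInputMode::LeftControllerOnly;
}
```

An app would typically evaluate this once per frame (feeding in the current IsHandInteractionProfile results) and react only when the mode changes.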

Detached controller behavior

When multimodal is enabled, the OculusXR plugin handles detached controllers internally. If the system detects that a controller is not in the user’s hand, the plugin falls back from the controller tracking node to the hand tracking node for position and orientation data.
There are no separate motion source nodes for detached controllers in Unreal. The plugin falls back to hand tracking data internally.

OculusInputTest sample

The OculusInputTest sample demonstrates input handling with hand tracking and controller support. Its DefaultEngine.ini is configured with HandTrackingSupport=ControllersAndHands, and its Blueprint assets include interaction profile widgets that show multimodal input detection in practice.

Troubleshooting

Can I evaluate the feature on my headset without changing my code?

No. There is no system-level toggle for multimodal; the application must enable it at runtime through C++ or Blueprints (see Setup).

Switching between controllers and hands doesn’t work in the sample.

Ensure that you’re running on Quest 2 (with Meta Quest Pro controllers paired to your headset), Quest 3, Quest 3S, or Quest Pro.