Comprehensive Interaction Rig
Updated: Nov 3, 2025
What is the Comprehensive Interaction Rig
In Interaction SDK, the rig is a predefined collection of GameObjects that enable you to see your virtual environment and initiate actions, like a grab, a teleport, or a poke. The rig is contained in a prefab called OVRInteractionComprehensive. It requires a working camera rig and adds support for hands, controllers, and controller-driven hands to your scene.
To enhance the user experience of your app and broaden its reach, check out the Design guidelines section.
The comprehensive interaction rig is available in two versions: OVR and UnityXR. The OVR interaction rig is built on the Meta XR Core SDK, which gives you access to the latest Quest features. The UnityXR interaction rig is built on Unity XR, which gives you Unity XR’s benefits, such as portability. The table below provides a more detailed look at the differences between the two:
| | UnityXR Rig | OVR Rig |
|---|---|---|
| Controller Tracking | | |
| Hand Tracking | | |
| Controller-Driven Hands | Custom Hand Animated Poses | System Poses - “Capsense” |
| Raycast | | |
| Poke | | |
| Pinch Grab | | |
| Palm Grab | | |
| Ray Grab | | |
| Grab Use | | |
| Touch Hand Grab | | |
| Object Transformation | | |
| Controller Teleport Locomotion | | |
| Hand Teleport Locomotion | L-Gesture | L-Gesture, Microgesture |
| Controller Sliding Locomotion | | |
| Microgesture Teleport Locomotion | | |
| Distance Grab | | |
| Throw | | |
How do I add the Comprehensive Interaction Rig to my scene?

Add OVR Rig
Add the OVR Interaction Rig to your scene 
Add UnityXR Rig
Add the UnityXR Rig to your scene 
Comprehensive Rig Example
Scene demonstrating the comprehensive interaction rig and basic interactions 
Concurrent Hands and Controllers
Scene demonstrating using hands and controllers simultaneously
How does the Comprehensive Interaction Rig work?
The Comprehensive Interaction Rig is a significant improvement over the previous rig, offering several key differences:
Modularity: The rig is divided into three major sections: Data Sources -> Interactions -> Visuals. Each module can be swapped without affecting the rest. Because the Interactions are not tied to the Data Sources or Visuals, they can be reused on different platforms.
Smaller Hierarchy: The rig reduces the number of GameObjects needed by reusing the Inputs and Visuals. For example, there is no need to have two left-hand visuals for different inputs, nor to duplicate the entire Interactors hierarchy to support Hands, Controllers, and Controller-Driven Hands.
Delete what you don’t need: Thanks to the less coupled hierarchy and Unity 2022+ prefab features, you can delete any unneeded Interactors even though the rig is presented in full form, and later recover them via prefab overrides.
Interactions Mirroring: The Comprehensive Interaction Rig simplifies the development process by providing a common, handedness-agnostic prefab for the Left and Right Interactions. That way, changes can be copied instantly to either the Left or the Right Interactors.
Full Locomotion: It supports Teleport and Smooth Locomotion by default, as well as comfort and Physics options.
If you have downloaded the Interaction SDK Samples, you can test the Comprehensive Interaction Rig in the ComprehensiveRigExamples Unity scene.
Data sources are the input of the interaction rig. They provide input data from various devices, such as the HMD, controllers, and hands. This data is then transformed into a standardized format that can be used by the rest of the interaction rig. For example, if you want to switch from using an OVR camera rig to a UnityXR one, you can simply replace the data sources and reference them from the Interactions entry point.
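The idea of normalizing device data into a shared format can be sketched in a language-agnostic way. The Python below is purely illustrative and not SDK code; the class and field names (`Pose`, `DataSource`, `get_pose`) are invented for the example:

```python
from dataclasses import dataclass

# A standardized pose format that the rest of the rig consumes,
# regardless of which device produced it. (Illustrative names only.)
@dataclass
class Pose:
    position: tuple  # (x, y, z)
    rotation: tuple  # quaternion (x, y, z, w)

class DataSource:
    """Base interface: every device exposes the same standardized output."""
    def get_pose(self) -> Pose:
        raise NotImplementedError

class OVRControllerSource(DataSource):
    def __init__(self, raw):
        self.raw = raw  # raw, platform-specific tracking data
    def get_pose(self) -> Pose:
        # Translate platform-specific fields into the shared format.
        return Pose(self.raw["pos"], self.raw["rot"])

class UnityXRControllerSource(DataSource):
    def __init__(self, device_state):
        self.device_state = device_state
    def get_pose(self) -> Pose:
        return Pose(self.device_state["position"], self.device_state["rotation"])

# Because both sources emit the same Pose, the consumer can use either
# without modification -- swapping them only changes the wiring.
ovr = OVRControllerSource({"pos": (0.0, 1.2, 0.3), "rot": (0.0, 0.0, 0.0, 1.0)})
xr = UnityXRControllerSource({"position": (0.0, 1.2, 0.3),
                              "rotation": (0.0, 0.0, 0.0, 1.0)})
```

This is the same swap described above: replacing the OVR camera rig's sources with UnityXR ones only changes which adapter produces the standardized data.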
At the root of the Interactions GameObject are the input sources available from the data sources: the HMD, the left hand, and the left controller for the Left Interactions; and the HMD, the right hand, and the right controller for the Right Interactions. If you replace the Data Sources, ensure they are re-wired correctly to the LeftInteractions and RightInteractions GameObjects.
Interactions are the core of the interaction rig. They define how the user can interact with the virtual environment. Most interactions are platform-independent, meaning they can be used with any type of device or platform.
In the context of the interaction rig, interactors handle user input from the Data Sources, perform actions in the environment, and then also modify the visuals of the devices.
You can add interactors to the Interactions / Interactors GameObject. Don’t forget to also reference them in the Interactor Group under Interactions / Interactors if you need to establish any priority between them.
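The priority idea behind an interactor group can be sketched as follows. This is a minimal, hypothetical model (a "first hovering interactor in priority order wins" rule), not the SDK's actual InteractorGroup implementation, which may differ in detail:

```python
# Minimal sketch of priority-based interactor arbitration (illustrative only).
class Interactor:
    def __init__(self, name, hovering=False):
        self.name = name
        self.hovering = hovering  # whether this interactor has a candidate target

class InteractorGroup:
    def __init__(self, interactors):
        # Order in this list encodes priority: earlier entries win.
        self.interactors = interactors

    def select_active(self):
        for interactor in self.interactors:
            if interactor.hovering:
                # A higher-priority interactor suppresses the rest.
                return interactor
        return None

group = InteractorGroup([
    Interactor("Poke"),                # highest priority, but not hovering
    Interactor("Grab", hovering=True),
    Interactor("Ray", hovering=True),  # lowest priority
])
active = group.select_active()  # "Grab" outranks "Ray"; "Poke" isn't hovering
```

The ordering in the group is what resolves conflicts when, say, a poke and a ray could both act on the same target.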
Depending on whether the Interactors are using Hands, Controllers, or Controller-Driven Hands, they might reference just the Hand, just the Controller, or both inputs. Since the rig can support all input modalities by default, it offers a set of optional GameObjects that are enabled and disabled automatically depending on which inputs are available. Simply place your Interactor in the relevant Interactions / Interactors subfolder, or place it directly under Interactors if you don’t need this feature.
- Hand: there is a Hand available; whether a Controller is present doesn’t matter. This one is particularly useful for Poke, as it usually prefers to use the index finger if one is available.
- Controller: there is a Controller available; whether a Hand is present doesn’t matter. This one is particularly useful for Ray and Locomotion, as they usually prefer to use the controller if one is available.
- Hand and no Controller: for pure hand-tracking interactions, such as Hand Teleport or TouchGrab.
- Controller and no Hand: for pure controller interactions, such as Grab (without a Hand visual).
- Controller and Hand: for hybrid interactions such as Controller+Hand Grab, which uses the triggers to drive the selection but cares about the visual shape of the hand.
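The enable/disable logic for these optional containers can be sketched as a simple rule table: each container declares which inputs it requires and which it excludes, and is active only when the currently available inputs satisfy that rule. The names and data shape below are illustrative, not SDK API:

```python
# Sketch of the "optional containers" idea (hypothetical names, not SDK types).
RULES = {
    "Hand":              {"requires": {"hand"},       "excludes": set()},
    "Controller":        {"requires": {"controller"}, "excludes": set()},
    "HandNoController":  {"requires": {"hand"},       "excludes": {"controller"}},
    "ControllerNoHand":  {"requires": {"controller"}, "excludes": {"hand"}},
    "ControllerAndHand": {"requires": {"controller", "hand"}, "excludes": set()},
}

def active_containers(available_inputs):
    """Return the containers enabled for the given set of available inputs."""
    available = set(available_inputs)
    return sorted(
        name for name, rule in RULES.items()
        if rule["requires"] <= available          # all required inputs present
        and not (rule["excludes"] & available)    # no excluded input present
    )

# Pure hand tracking: only the hand-oriented containers are enabled.
hand_only = active_containers({"hand"})
# Controller-driven hands: controller, hand, and hybrid containers are enabled.
both = active_containers({"hand", "controller"})
```

Placing an Interactor in a given subfolder is, in effect, attaching it to one of these rules.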
Each Interactions structure is handedness-agnostic, and its handedness is provided by the input. This means that it is possible to have two identical structures for the Left and Right hands if desired, and to use the Unity Prefab override tools to either copy changes from one hand to the other or highlight the differences.
If you created your rig via Quick Actions and selected Generate as Editable Copy, there should be a prefab variant present at the Prefab path. You can use this prefab variant to:
Apply changes from one hand to the opposite: Simply make the change in either hand’s Interactions and Apply as Override in Prefab “ComprehensiveInteractions”.

Highlight differences from the hands to the common rig: Select the Interactions GameObject and click Overrides to highlight the differences between this Interactions hierarchy and the opposite one.

Revert the changes to the original rig: If you make a mistake, and even if you have applied it to your Prefab Variant copy, you can always select the Overrides of your prefab variant and revert them to restore the original values.
Synthetic Hands (and Controllers) are steps in the Data Stack that have some of their information overridden by the Interactors. For example, HandGrab might modify the fingers so they wrap around a virtual object, and then a Poke Interactor might override the input HandData so the finger does not move past the button. It is important to note that sometimes the order matters: the Poke might want to wait for the HandGrab pass so it knows the virtual position of the wrapping index finger before poking.
Each Interactions module provides three chained Synthetic Hands, so you can wire your interactors to any desired level. The Visual references the final step of that chain.
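The chained-override behavior described above can be sketched as a simple data pipeline: each step receives the hand data produced by the previous step and may override part of it, and the Visual renders only the final output. The step names and data shape are illustrative stand-ins, not SDK types:

```python
# Sketch of chained synthetic-hand steps (illustrative, not SDK code).
def grab_wrap(hand):
    # Pretend a grab curls the index finger around a held object.
    hand = dict(hand)
    hand["index_curl"] = max(hand["index_curl"], 0.8)
    return hand

def poke_limit(hand):
    # Pretend a poke stops the fingertip at a button surface (z = 0).
    hand = dict(hand)
    hand["index_tip_z"] = min(hand["index_tip_z"], 0.0)
    return hand

def run_chain(hand, steps):
    # Each step sees the output of the previous one, so order matters:
    # poke_limit operates on the already-wrapped finger from grab_wrap.
    for step in steps:
        hand = step(hand)
    return hand  # the Visual renders only this final output

raw = {"index_curl": 0.1, "index_tip_z": 0.05}
final = run_chain(raw, [grab_wrap, poke_limit])
```

Wiring an interactor to an earlier or later synthetic hand in the chain is equivalent to choosing where its `step` runs in this pipeline.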
Visuals are an output of the interaction rig. They are responsible for rendering the input data from devices in a way that is meaningful to the user. This can include things like hand meshes or controller meshes. Visuals, especially controllers, can be platform-specific.
In the Interaction Rig, the Visuals reference only the output Synthetic Hand (or Controller) of the Interactions section, meaning each Visual adopts the form required by the Interactors after they have run. Since all Interactions overwrite the same Data Stack, you will typically only need one Visual per input device.
The Locomotion Module is at the same level as the DataSource, Visuals and Interactions.
It uses several interactors and locomotion broadcasters from the Interactions layer that are then redirected using LocomotionEventsConnection components. It then proceeds to move the Character and the Camera Rig based on the received inputs.
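The broadcaster-to-mover routing can be sketched as a small event relay. The class and event names below are hypothetical stand-ins for the pattern the LocomotionEventsConnection components implement, not the SDK's actual API:

```python
# Sketch of locomotion event routing (illustrative names, not SDK code).
class LocomotionEventsConnection:
    """Relays locomotion events from broadcasters to subscribed handlers."""
    def __init__(self):
        self.handlers = []

    def subscribe(self, handler):
        self.handlers.append(handler)

    def broadcast(self, event):
        for handler in self.handlers:
            handler(event)

class PlayerMover:
    """Moves the character (and, with it, the camera rig) on teleport events."""
    def __init__(self):
        self.position = (0.0, 0.0, 0.0)

    def handle(self, event):
        if event["type"] == "teleport":
            self.position = event["target"]

connection = LocomotionEventsConnection()
mover = PlayerMover()
connection.subscribe(mover.handle)

# A teleport interactor fires an event; the connection relays it to the mover.
connection.broadcast({"type": "teleport", "target": (1.0, 0.0, 2.0)})
```

Keeping the interactors decoupled from the mover this way is what lets the Locomotion module consume events from either hand's interactors without knowing which one fired them.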
Design guidelines are Meta’s human interface standards that assist developers in creating user experiences. Refer to the following resources to begin, and explore additional design guidelines in subsequent Unity documents.
- Input modalities: Learn about the different input modalities.
- Head: Learn about design and UX best practices for using a head input.
- Hands: Learn about design and UX best practices for using hands.
- Controllers: Learn about design and UX best practices for using controllers.
- Voice: Learn about design and UX best practices for using voice.
- Peripherals: Learn about design and UX best practices for using peripherals.