
Use Interaction SDK with Unity XR

Updated: Aug 7, 2025
This tutorial explains how to install and set up Meta XR Interactions in your Unity project using Unity OpenXR, Unity XR Hands, and Meta XR Interaction SDK Essentials.
Meta XR Interaction SDK Essentials provides the core implementations of all the Meta XR interaction models, along with necessary shaders, materials, and prefabs.
Note: Some features are only supported for Meta XR Interaction SDK with Meta XR Core SDK. Meta XR Interaction SDK Essentials with Unity XR does not support the full set of Interaction features, but it does offer cross-platform support. To learn how to get started with Interaction SDK with Meta XR Core SDK, check out Getting Started with Interaction SDK.

Project setup

With Interaction SDK installed, and OpenXR pulled in through the XR Hands dependencies, you must enable OpenXR.
  1. Navigate to Edit > Project Settings and select XR Plug-in Management.
  2. Navigate to XR Plug-in Management > OpenXR.
    • Under Interaction Profiles, enable your intended profiles, such as Oculus Touch Controller Profile and Meta Quest Touch Pro Controller Profile.
    • Under OpenXR Feature Groups enable:
      • Hand Tracking Subsystem
      • Meta Hand Tracking Aim
      • Meta Quest Support (Android only)
  3. Navigate to XR Plug-in Management > Project Validation.
    The Project Validation tool checks your project settings and applies the settings required by the configured dependencies. You can also confirm the hand tracking setup at runtime, as shown in the sketch after this list.
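If you want to confirm the setup at runtime, the following minimal C# sketch uses the Unity XR Hands package API to check whether an XRHandSubsystem was created, which indicates the Hand Tracking Subsystem feature is active. The class name HandTrackingCheck is hypothetical and not part of Interaction SDK.

using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.Hands;

// Hypothetical helper: attach to any GameObject in the scene to verify
// that OpenXR created a hand-tracking subsystem.
public class HandTrackingCheck : MonoBehaviour
{
    void Start()
    {
        var handSubsystems = new List<XRHandSubsystem>();
        SubsystemManager.GetSubsystems(handSubsystems);

        if (handSubsystems.Count == 0)
        {
            Debug.LogWarning("No XRHandSubsystem found. Confirm that Hand Tracking Subsystem is enabled under OpenXR Feature Groups.");
            return;
        }

        // running is true once the subsystem has been started by the XR loader.
        Debug.Log($"XRHandSubsystem present, running: {handSubsystems[0].running}");
    }
}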

Add the rig

In Interaction SDK, the rig is a predefined collection of GameObjects that enable you to see your virtual environment and initiate actions, like a grab, a teleport, or a poke. The rig is contained in a prefab called UnityXRInteractionComprehensive. It requires a working camera rig and adds support for hands, controllers, and controller-driven hands to your scene.
Instead of adding these prefabs to the scene manually, use the Interaction SDK Quick Actions.
  1. Delete the default Main Camera if it exists, since Interaction SDK uses its own camera rig.
  2. Right-click in the Hierarchy and select the Interaction SDK > Add UnityXR Interaction Rig Quick Action.
  3. If you have a UnityXRCameraRig in the scene, it appears referenced in the wizard. If there is no camera rig, click Fix so the wizard creates one.
  4. Click Create to add the UnityXR Interaction Rig to the scene.
    The components of the UnityXR Interaction Rig are added to the scene.
Use Link to test your project.
  1. Open the Link desktop application on your computer.
  2. Put on your headset, and, when prompted, enable Link.
  3. On your development machine, in Unity Editor, select the Play button.
  4. In your headset, you should see your hands in the app. If nothing renders, see the sketch after this list for a quick runtime check.
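If the scene does not appear in the headset, one way to narrow the problem down is to check whether an XR display subsystem is running when you enter Play mode. The following is a minimal sketch using Unity's subsystem API; the class name XrDisplayCheck is hypothetical and not part of Interaction SDK.

using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

// Hypothetical helper: logs whether an XR display subsystem is running,
// which indicates the headset is rendering the app (for example, over Link).
public class XrDisplayCheck : MonoBehaviour
{
    void Start()
    {
        var displays = new List<XRDisplaySubsystem>();
        SubsystemManager.GetSubsystems(displays);

        if (displays.Count == 0 || !displays[0].running)
        {
            Debug.LogWarning("No running XRDisplaySubsystem. Check that Link is active and OpenXR is the enabled plug-in provider.");
            return;
        }

        Debug.Log("XR display subsystem is running; the headset should be rendering the scene.");
    }
}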

Test your Interaction with an APK

Build your project into an .apk to test it on your headset. You can do this from the Build Settings window, or script it as sketched after the steps below.
  1. Make sure your headset is connected to your development machine.
  2. In Unity Editor, select File > Build Settings. Add your scene to the Scenes In Build list by dragging it from the Project panel or clicking Add Open Scenes.
  3. Click Build and Run to generate an .apk and run it on your headset. In the File Explorer that opens, select a location to save the .apk to and give it a name. The build process may take a few minutes.
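For repeated test builds, you can trigger the same Build and Run flow from an editor script. The sketch below uses Unity's standard BuildPipeline API and must live in an Editor folder; it is an assumption on my part rather than an Interaction SDK utility, and the menu name and output path are hypothetical.

using UnityEditor;
using UnityEngine;

// Hypothetical editor script: builds the open scene into an .apk and deploys it
// to the connected headset. Assumes the Android module is installed and the
// active scene has been saved.
public static class AndroidBuild
{
    [MenuItem("Build/Build and Run Android APK")]
    public static void BuildApk()
    {
        var options = new BuildPlayerOptions
        {
            scenes = new[] { UnityEngine.SceneManagement.SceneManager.GetActiveScene().path },
            locationPathName = "Builds/InteractionSample.apk", // hypothetical output path
            target = BuildTarget.Android,
            options = BuildOptions.AutoRunPlayer // install and launch on the connected device
        };

        var report = BuildPipeline.BuildPlayer(options);
        Debug.Log($"Build finished: {report.summary.result}, size: {report.summary.totalSize} bytes");
    }
}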

Key differences between ISDK with Meta XR Core SDK and with Unity XR

Interaction SDK was built to run on the Meta XR Core SDK but now also supports Unity XR. However, Unity XR cannot access all Meta device features; to take full advantage of them, the Core SDK is necessary. Some Interaction SDK editor tooling has Core SDK dependencies and is not available if you are only using Unity XR.

Data Sources

There is a single comprehensive sample scene available in the Unity XR package samples, but this does not represent a limitation on how Unity XR can be integrated. Many of the Interaction SDK Core SDK sample scenes would work just the same if their hand and HMD data sources were swapped to a Unity XR source.
FromUnityXRHandDataSource and FromUnityXRHmdDataSource are MonoBehaviours that take the OpenXR data provided through XR Hands or the XROrigin and translate it into the Core SDK data format that Interaction SDK expects.
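To see the kind of upstream data those components consume, the sketch below reads a joint pose directly from the XR Hands subsystem. It only illustrates the source data, not how FromUnityXRHandDataSource is implemented; the class name PalmPoseLogger is hypothetical.

using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.Hands;

// Hypothetical example: logs the left palm pose while the hand is tracked.
public class PalmPoseLogger : MonoBehaviour
{
    XRHandSubsystem m_Subsystem;

    void Update()
    {
        if (m_Subsystem == null || !m_Subsystem.running)
        {
            var subsystems = new List<XRHandSubsystem>();
            SubsystemManager.GetSubsystems(subsystems);
            if (subsystems.Count > 0)
                m_Subsystem = subsystems[0];
            return;
        }

        var leftHand = m_Subsystem.leftHand;
        if (!leftHand.isTracked)
            return;

        // Joint poses are reported in XR origin space; a data source component
        // is responsible for mapping them into the format Interaction SDK expects.
        if (leftHand.GetJoint(XRHandJointID.Palm).TryGetPose(out Pose palmPose))
            Debug.Log($"Left palm position: {palmPose.position}");
    }
}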

Learn more

To learn about the key concepts in Interaction SDK, see the Architecture Overview.

Next steps

Add some GameObjects and make them interactable with Quick Actions.