Getting Started with Meta XR Interactions and Unity XR
Updated: Feb 10, 2025
Meta XR Interaction SDK Essentials provides the core implementations of all the Meta XR interaction models, along with necessary shaders, materials, and prefabs.
Import and install packages
On the package's Unity Asset Store page, select Add to My Assets to add the package to your Unity account's assets.
Open the Unity project where you want to use Interaction SDK.
Select Window > Package Manager to view your available packages.
The Package Manager window appears.
At the top of the Package Manager window, open the Packages: drop-down menu, and select My Assets.
Select the Meta XR Interaction SDK Essentials package, and then select Install at the top right of the window to add the package to your project.
On the left of the Package Manager window, select Unity Registry.
Search for “XR Hands”, select the XR Hands package, and then select Install to add the Unity XR Hands package to your project.
If you are prompted to restart the Unity Editor, select Yes.
At the top of the Package Manager window, open the Packages: drop-down menu, and select In Project. You should see Meta XR Interaction SDK Essentials and XR Hands both listed.
From the list of packages in your project, select XR Plugin Management and, under Version History, make sure that your project is using at least version 4.5.0. If it is not, select Update next to the latest 4.5.x version to update the XR Plugin Management package.
Because OpenXR is installed as a dependency of XR Hands alongside Interaction SDK, you must enable the OpenXR plug-in.
Navigate to Edit > Project Settings and select XR Plug-in Management.
- Navigate to XR Plug-in Management > OpenXR.
- Under Interaction Profiles, enable your intended profiles, such as Oculus Touch Controller Profile and Meta Quest Touch Pro Controller Profile.
- Under OpenXR Feature Groups, enable:
- Hand Tracking Subsystem
- Meta Hand Tracking Aim
- Meta Quest Support (Android only)
Navigate to XR Plug-in Management > Project Validation.
The Project Validation tool checks your project settings and applies the settings required by the configured dependencies. Fix any issues it reports before continuing.
In Interaction SDK, the rig is a predefined collection of GameObjects that enable you to see your virtual environment and initiate actions, like a grab, a teleport, or a poke. For Unity XR, the rig is contained in a prefab called UnityXRInteractionComprehensive, which automatically adds a ready-to-use camera, hands, and controllers to your scene.
Instead of manually adding these prefabs to the scene, using Interaction SDK “Quick Actions” is recommended.
In your Unity scene, delete the default Main Camera and Global Volume GameObjects from the Hierarchy, since the SDK uses its own camera rig.
The Hierarchy should now be empty except for the Directional Light GameObject.
Right-click in the Hierarchy and select Interaction SDK > Add UnityXR Interaction Rig Quick Action. This adds the rig to your scene.
In the Hierarchy, add a cube GameObject by right-clicking and selecting 3D Object > Cube.
In the Hierarchy, select the Cube, and in the Inspector, under Transform, change the Position to [0, 1, 0.5] and the Scale to [0.25, 0.25, 0.25].
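The cube from the two steps above can also be created from script using standard Unity APIs; a minimal sketch:

```csharp
using UnityEngine;

public static class GrabCubeSetup
{
    // Creates a cube with the same Transform values as the
    // manual steps above: in front of the rig, at hand height.
    public static GameObject CreateGrabCube()
    {
        GameObject cube = GameObject.CreatePrimitive(PrimitiveType.Cube);
        cube.name = "Cube";
        cube.transform.position = new Vector3(0f, 1f, 0.5f);
        cube.transform.localScale = new Vector3(0.25f, 0.25f, 0.25f);
        return cube;
    }
}
```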
Right-click the Cube, and select Interaction SDK > Add Grab Interaction.
The Grab Wizard window appears.
In the Grab Wizard window, under Required Components, select Fix All to add and configure the components the Cube needs for interaction.
Select Create to add the Grab Interaction to the Cube.
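Once the Quick Action has configured the Cube, you can react to grabs from script. A minimal sketch, assuming the Quick Action added a Grabbable component (from the Oculus.Interaction namespace), which raises pointer events when it is selected and released:

```csharp
using UnityEngine;
using Oculus.Interaction;

// Attach to the Cube to log when it is grabbed and released.
// Assumes the Grab Interaction Quick Action added a Grabbable component.
public class GrabLogger : MonoBehaviour
{
    private Grabbable _grabbable;

    private void OnEnable()
    {
        _grabbable = GetComponent<Grabbable>();
        _grabbable.WhenPointerEventRaised += OnPointerEvent;
    }

    private void OnDisable()
    {
        _grabbable.WhenPointerEventRaised -= OnPointerEvent;
    }

    private void OnPointerEvent(PointerEvent evt)
    {
        if (evt.Type == PointerEventType.Select)
        {
            Debug.Log("Cube grabbed");
        }
        else if (evt.Type == PointerEventType.Unselect)
        {
            Debug.Log("Cube released");
        }
    }
}
```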
Use Link to test your project.
- Open the Link desktop application on your computer.
- Put on your headset, and, when prompted, enable Link.
- On your development machine, in Unity Editor, select the Play button.
- In your headset, you can interact with the 3D object in your app.
Interaction SDK was built to run on the Meta Core SDK but now also supports Unity XR. Unity XR cannot access all Meta device features, however; to take full advantage of them, the Core SDK is necessary. Some Interaction SDK editor tooling has Core SDK dependencies and is not available if you are only using Unity XR.
There is a single comprehensive sample scene in the Unity XR package samples, but this does not represent a limitation on how Unity XR can be integrated. Many of the Core SDK sample scenes would work just the same if their hand and HMD data sources were swapped to Unity XR sources.
FromUnityXRHandDataSource and FromUnityXRHmdDataSource are MonoBehaviours that take the OpenXR data provided through XR Hands or the XROrigin and translate it into the Core SDK data format that the ISDK expects.
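The OpenXR side of that translation is the joint data exposed by the XR Hands subsystem. The sketch below is not the adapter's implementation, only an illustration of the data it consumes, using the standard XRHandSubsystem API:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.Hands;

// Reads a joint pose from the XR Hands subsystem -- the same OpenXR
// data that FromUnityXRHandDataSource translates for the ISDK.
public class IndexTipReader : MonoBehaviour
{
    private XRHandSubsystem _subsystem;

    private void Update()
    {
        if (_subsystem == null)
        {
            var subsystems = new List<XRHandSubsystem>();
            SubsystemManager.GetSubsystems(subsystems);
            if (subsystems.Count == 0) return;
            _subsystem = subsystems[0];
        }

        XRHandJoint joint =
            _subsystem.rightHand.GetJoint(XRHandJointID.IndexTip);
        if (joint.TryGetPose(out Pose pose))
        {
            Debug.Log($"Right index tip at {pose.position}");
        }
    }
}
```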
OpenXR vs Core SDK Hand Skeleton
For those accustomed to dealing with the OpenXR skeleton format, it may be important to be aware of the differences between these hand skeletons.
- The Interaction SDK Hand skeleton is missing the following joints as compared to OpenXR:
- Index Metacarpal
- Middle Metacarpal
- Ring Metacarpal
- Palm
- The Interaction SDK Hand skeleton has the following extra joints as compared to OpenXR:
- Thumb Trapezium
- Forearm Stub
- The orientations of the hand joints differ between the two hand skeletons.
- In Interaction SDK, the left hand joint orientations are mirrored from the right hand, whereas the OpenXR hand has matching orientations between left and right hands.
- The joint orientations in OpenXR have the Z axis parallel to the finger bones, while the ISDK hand skeleton has the X axis parallel to the finger bones, pointing forward on the right hand and backwards on the left.
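The structural differences listed above can be summarized in code. A sketch using the Unity XR Hands joint enum; the ISDK joint names here are illustrative, not the SDK's actual identifiers:

```csharp
using UnityEngine.XR.Hands;

public static class HandSkeletonDifferences
{
    // OpenXR (XR Hands) joints with no direct counterpart in the
    // ISDK hand skeleton, per the list above.
    public static readonly XRHandJointID[] MissingFromIsdk =
    {
        XRHandJointID.Palm,
        XRHandJointID.IndexMetacarpal,
        XRHandJointID.MiddleMetacarpal,
        XRHandJointID.RingMetacarpal,
    };

    // Extra ISDK joints with no OpenXR counterpart
    // (names illustrative).
    public static readonly string[] ExtraInIsdk =
    {
        "ThumbTrapezium",
        "ForearmStub",
    };
}
```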
Add some GameObjects and make them interactable with Quick Actions.