Interaction SDK for Unreal Engine
Updated: Apr 15, 2026
The Meta XR Interaction SDK for Unreal Engine makes it easy for VR users to immersively interact with their virtual environment. With Interaction SDK, you can grab and scale objects, push buttons, navigate user interfaces, and more while using controllers or just your physical hands.
How does Interaction SDK work?
Interaction SDK offers many features to create an immersive XR experience, allowing users to:
- Grab and manipulate objects, including scaling and moving objects freely or along fixed axes
- Throw a held object
- Press buttons or interact with surfaces using rays
- Press buttons and scroll user interfaces using poking
Interaction SDK Plugin for Unreal Engine consists of:
- A closed source DLL (InteractionSDK Native) that powers ray, poke, grab, and pose-detection interactions
- Unreal ActorComponents and SceneComponents that wrap the native DLL
- Prebuilts that collect components into ready-to-use interactions
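To illustrate this layering, here is a minimal, self-contained C++ sketch of the wrapper pattern: engine-facing components forwarding per-frame state to native interaction logic. `NativePokeInteractor` and `PokeInteractorComponent` are illustrative names for this sketch, not actual SDK classes.

```cpp
#include <cassert>

// Stands in for the closed-source native DLL's interaction logic.
class NativePokeInteractor {
public:
    // Returns true when the fingertip has pushed past the press depth.
    bool Update(float FingertipDepthCm, float PressDepthCm) {
        bPressed = FingertipDepthCm >= PressDepthCm;
        return bPressed;
    }
private:
    bool bPressed = false;
};

// Stands in for an Unreal component that wraps the native object and
// exposes engine-friendly state (e.g. to Blueprints or other components).
class PokeInteractorComponent {
public:
    // Called once per frame with the tracked fingertip depth.
    void TickComponent(float FingertipDepthCm) {
        const bool bNowPressed = Native.Update(FingertipDepthCm, PressDepthCm);
        if (bNowPressed && !bWasPressed) { ++PressCount; }  // rising edge
        bWasPressed = bNowPressed;
    }
    int GetPressCount() const { return PressCount; }
private:
    NativePokeInteractor Native;
    float PressDepthCm = 1.0f;
    bool bWasPressed = false;
    int PressCount = 0;
};
```

The prebuilts mentioned above then bundle components like this with meshes and collision so an interaction works out of the box.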
How do I set up Interaction SDK?
To install the downloaded plugin:
- Take note of the directory that you have installed Unreal Engine to, e.g. C:\Epic Games\UE_5.4. This will be referred to as [UnrealDir] below.
- In the [UnrealDir]\Engine\Plugins directory, create a new directory called Marketplace.
- Open the zip file you downloaded and copy the MetaXRInteraction folder into the newly created Marketplace directory ([UnrealDir]\Engine\Plugins\Marketplace).
- Run Unreal Editor.
- In the Level Editor toolbar, click Edit > Plugins. Search for Meta (searching for ‘Meta XR’ may not show all results).
- Enable both the Meta XR plugin (if present) and the Meta XR Interaction SDK plugin.
- Restart Unreal Editor for the changes to take effect.
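The copy steps above can be sketched as a shell snippet. The paths are illustrative: on Windows, [UnrealDir] is e.g. C:\Epic Games\UE_5.4; adjust UNREAL_DIR to your actual install, and run the plugin-enable steps in the editor afterwards as described.

```shell
# Example Unreal install location; override UNREAL_DIR to match yours.
UNREAL_DIR="${UNREAL_DIR:-$HOME/Epic Games/UE_5.4}"
MARKETPLACE_DIR="$UNREAL_DIR/Engine/Plugins/Marketplace"

# Create the Marketplace directory if it does not exist yet.
mkdir -p "$MARKETPLACE_DIR"

# Copy the MetaXRInteraction folder out of the downloaded archive
# (assumed here to be in the current directory).
if [ -f MetaXRInteraction.zip ]; then
  unzip -o MetaXRInteraction.zip "MetaXRInteraction/*" -d "$MARKETPLACE_DIR"
fi
```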
Supported devices
Interaction SDK supports the following devices:
- Meta Quest 2
- Meta Quest 3
- Meta Quest 3S
- Meta Quest Pro
Supported Unreal Engine versions
- Unreal Engine 5.4 and later
Interaction SDK can integrate with the Meta XR plugin or, in a restricted mode, with Unreal Engine's built-in OpenXR HandTracking plugin. It reads hand and controller data from whichever of the two plugins is present in the project.
When deploying to non-Meta devices, Interaction SDK can read hand-tracking data from Unreal’s OpenXR HandTracking plugin as a fallback.
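A minimal sketch of this selection logic, assuming the two plugin names above; the enum and function here are illustrative, not actual SDK symbols:

```cpp
#include <cassert>
#include <optional>
#include <string>
#include <vector>

enum class HandDataSource { MetaXR, OpenXRHandTracking };

// Given the set of enabled plugins, pick the source hand/controller
// data would be read from: MetaXR when available, otherwise the
// OpenXR HandTracking fallback, otherwise none.
std::optional<HandDataSource> SelectHandDataSource(
    const std::vector<std::string>& EnabledPlugins) {
    auto Has = [&](const std::string& Name) {
        for (const auto& Plugin : EnabledPlugins) {
            if (Plugin == Name) { return true; }
        }
        return false;
    };
    if (Has("MetaXR")) { return HandDataSource::MetaXR; }
    if (Has("OpenXRHandTracking")) { return HandDataSource::OpenXRHandTracking; }
    return std::nullopt;  // no hand-tracking source available
}
```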
Beyond the MetaXR feature comparison, the OpenXR path has some additional Interaction SDK functionality gaps, highlighted in the following table:
| Feature | MetaXR | OpenXR |
|---|---|---|
| Procedural Controller Hand Animation | ✓ | - |
| Hand Tracking Confidence | ✓ | - |
| Input mapping for index pinch | ✓ | - |
Note:
- Hand Tracking Confidence - The MetaXR hand data source provides granular tracking confidence levels; the OpenXR path reports only binary hand validity.
- Input Mapping for Index Pinch - OpenXR does not currently support a direct input binding for the index pinch gesture. As a workaround, developers can use the PinchGrabStarted and PinchGrabFinished event delegates on the IsdkHandFingerPinchGrabRecognizer to trigger custom actions from the index pinch gesture.
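The workaround above can be modeled with a small standalone sketch. The delegate names mirror the PinchGrabStarted/PinchGrabFinished delegates described in the note, but this class is a conceptual stand-in for the IsdkHandFingerPinchGrabRecognizer, not the SDK type.

```cpp
#include <cassert>
#include <functional>

class PinchRecognizer {
public:
    // Callbacks a game can bind custom actions to, in place of a
    // direct "index pinch" input mapping.
    std::function<void()> PinchGrabStarted;
    std::function<void()> PinchGrabFinished;

    // Feed normalized pinch strength (0 = open, 1 = fully pinched) each
    // frame; fires the delegates on threshold crossings, with hysteresis
    // so a pinch held near the threshold does not flicker.
    void Update(float PinchStrength) {
        const float StartThreshold = 0.9f;
        const float FinishThreshold = 0.6f;
        if (!bPinching && PinchStrength >= StartThreshold) {
            bPinching = true;
            if (PinchGrabStarted) { PinchGrabStarted(); }
        } else if (bPinching && PinchStrength <= FinishThreshold) {
            bPinching = false;
            if (PinchGrabFinished) { PinchGrabFinished(); }
        }
    }
private:
    bool bPinching = false;
};
```

In a project, the bound callbacks would do whatever an "index pinch pressed/released" input action would have done.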
The Interaction SDK plugin has an optional dependency on the following plugins:
- Meta XR
- OpenXR HandTracking
Design guidelines are Meta's human interface standards and design frameworks that help you create safe, user-oriented, and engaging immersive and passthrough user experiences.
- Input mappings: Understand how input mappings bridge modalities and interaction types.
- Input hierarchy: Understand the input hierarchy.
- Multimodality: Understand multimodality.
- Ray casting: Understand indirect interaction through ray casting.
- Touch: Understand direct interaction through touch.
- Grab: Understand grab interactions for object manipulation.
- Microgestures: Understand microgesture interactions.
- Input modalities: Explore the different input modalities.
- Head: Design and UX best practices for head input.
- Hands: Design and UX best practices for using hands.
- Controllers: Design and UX best practices for using controllers.
- Voice: Design and UX best practices for using voice.
- Peripherals: Design and UX best practices for using peripherals.