Develop

InteractionSDK sample overview

Updated: May 11, 2026

Overview

This sample demonstrates the five core interaction modalities of the Meta XR Interaction SDK for Unreal Engine: poke, ray, hand grab, distance grab, and transformer (two-handed). Built as a UE 5.6 C++ project, it uses a level streaming architecture with a persistent start level, four demo levels, and two standalone experiences. The project shows how to configure interaction components, switch between controller and hand input at runtime, and build complex grab behaviors.

What you will learn

  • Configure poke interactions using UIsdkInteractableWidgetComponent
  • Set up ray-based interactions with hover audio feedback
  • Implement hand grab with pre-authored grab pose data assets
  • Build distance grab mechanics with multiple grab strategies
  • Create two-handed transformer interactions for object manipulation
  • Switch between controller and hand input at runtime using ISDK_SetControllerHandBehavior

Requirements

  • Meta Quest 2, Quest 3, or Quest 3S
  • Unreal Engine 5.6 or later
  • Meta XR Interaction SDK plugin installed
For setup instructions, see the Meta Quest Developer Hub documentation.

Get started

Clone the repository from GitHub, open the project in Unreal Engine 5.6, and build for Android. The project uses level streaming: StartLevel is the persistent level that loads demo levels on demand. For detailed build and configuration steps, see the project README.

Explore the sample

| Level | What it demonstrates | Key concepts |
| --- | --- | --- |
| StartLevel (persistent) | Level streaming hub, runtime input switching | Level streaming, ISDK_SetControllerHandBehavior |
| PokeExamples | Poke interactions with UI widgets | UIsdkInteractableWidgetComponent, bCreatePokeInteractable=true |
| RayExamples | Ray-based interactions with hover feedback | bCreateRayInteractable=true, hover audio |
| TransformerExamples | Two-handed object manipulation | 18 grab pose data assets for Mug object |
| DistanceGrabExamples | Remote object grabbing with multiple strategies | PullToHand, RelativeToPointer, ManipulateInPlace |
| ThrowExamples (standalone) | Physics-based throwing mechanics | Velocity estimation, release detection |
| HandPoseStudio (standalone) | Hand pose authoring tool | Pose recording, data asset creation |

Runtime behavior

When running on a Meta Quest device, the StartLevel persistent map presents a menu for loading demo levels via level streaming. Each demo level showcases a specific interaction modality. PokeExamples demonstrates finger-based UI interaction where UIsdkInteractableWidgetComponent with bCreatePokeInteractable=true enables poke detection on UMG widgets. RayExamples adds hover audio feedback when the ray intersects interactable surfaces. TransformerExamples uses 18 grab pose data assets for a Mug object to demonstrate two-handed manipulation. DistanceGrabExamples offers three strategies: PullToHand brings the object to your hand, RelativeToPointer maintains offset from the pointer, and ManipulateInPlace moves the object without pulling. Runtime switching between controllers and hands uses ISDK_SetControllerHandBehavior.
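The streaming-hub pattern can be sketched independently of Unreal's level-streaming API. The class and method names below are hypothetical illustrations, not SDK types; in the actual project, loading and unloading go through the engine's level streaming system.

```cpp
#include <cassert>
#include <optional>
#include <string>
#include <utility>
#include <vector>

// Hypothetical sketch of the StartLevel hub: the persistent level stays
// resident while at most one demo level is streamed in at a time.
class StreamingHub {
public:
    explicit StreamingHub(std::vector<std::string> DemoLevels)
        : DemoLevels_(std::move(DemoLevels)) {}

    // Streams in the requested demo level, replacing any previously loaded
    // demo. Returns false if the level is not registered with the hub.
    bool LoadDemo(const std::string& Name) {
        for (const auto& Level : DemoLevels_) {
            if (Level == Name) {
                Current_ = Name;  // the previous demo is unloaded implicitly
                return true;
            }
        }
        return false;
    }

    std::optional<std::string> CurrentDemo() const { return Current_; }

private:
    std::vector<std::string> DemoLevels_;
    std::optional<std::string> Current_;  // at most one demo active
};
```

The key property the sketch captures is that the hub, like the persistent StartLevel, never unloads itself; only the streamed demo level changes.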

Key concepts

Poke interactions

UIsdkInteractableWidgetComponent enables poke-based interaction on UMG widgets:
// Add UIsdkInteractableWidgetComponent to any widget actor
// Set bCreatePokeInteractable = true to enable poke detection
// The component automatically creates poke interactable geometry
// matching the widget bounds
UPROPERTY(EditAnywhere)
bool bCreatePokeInteractable = true;
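Conceptually, the generated poke geometry amounts to a press test against the widget's bounds. The following standalone sketch illustrates that idea only; it is not the SDK's implementation, and all names in it are hypothetical.

```cpp
#include <cassert>

// Widget extents in local space (hypothetical illustration).
struct WidgetBounds { float Width; float Height; };

// A poke registers when the fingertip lies inside the widget rectangle and
// has pressed past the surface plane by at least PressThreshold.
bool IsPoked(const WidgetBounds& Bounds, float X, float Y, float Depth,
             float PressThreshold = 0.5f) {
    const bool bInsideRect =
        X >= 0.f && X <= Bounds.Width && Y >= 0.f && Y <= Bounds.Height;
    // Depth is the fingertip's penetration past the widget surface.
    return bInsideRect && Depth >= PressThreshold;
}
```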

Ray interactions

Ray interactables provide far-field selection with optional audio feedback:
// Enable ray interaction on a component
// bCreateRayInteractable = true generates ray hit surfaces
// Hover audio plays when ray intersects the interactable
UPROPERTY(EditAnywhere)
bool bCreateRayInteractable = true;
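The hover check behind the audio feedback reduces to a ray-surface intersection test. The sketch below shows the geometry in a simplified form (a rectangle on the Z = 0 plane); the function and types are illustrative, not SDK API.

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { float X; float Y; float Z; };

// Intersects a ray with the plane Z = 0 and reports whether the hit point
// lands inside a [0, Width] x [0, Height] surface. A true result is the
// condition under which hover feedback (e.g. audio) would fire.
bool RayHoversSurface(Vec3 Origin, Vec3 Dir, float Width, float Height) {
    if (std::fabs(Dir.Z) < 1e-6f) return false;  // ray parallel to the plane
    const float T = -Origin.Z / Dir.Z;
    if (T < 0.f) return false;                   // surface is behind the ray
    const float HitX = Origin.X + T * Dir.X;
    const float HitY = Origin.Y + T * Dir.Y;
    return HitX >= 0.f && HitX <= Width && HitY >= 0.f && HitY <= Height;
}
```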

Grab pose data assets

TransformerExamples uses pre-authored grab pose data assets to define how hands conform to objects:
// 18 grab pose data assets define hand positions for the Mug object
// Each asset stores bone transforms for a specific grab approach angle
// The system selects the closest matching pose at grab time
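Closest-pose selection can be sketched as a nearest-neighbor search over approach angles. This is a simplified 1D model of the idea described above, with hypothetical names; the SDK's actual matching operates on full bone transforms.

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// Each authored pose records the approach angle (degrees) it was captured at.
struct GrabPose { float ApproachAngleDeg; int PoseId; };

// Angular distance on a circle, so 350 degrees and 10 degrees are 20 apart.
float AngularDistance(float A, float B) {
    return std::fabs(std::fmod(A - B + 540.f, 360.f) - 180.f);
}

// Returns the PoseId whose recorded approach angle is closest to the hand's
// current approach angle at grab time.
int SelectClosestPose(const std::vector<GrabPose>& Poses, float HandAngleDeg) {
    int Best = Poses.front().PoseId;
    float BestDist = AngularDistance(Poses.front().ApproachAngleDeg, HandAngleDeg);
    for (const auto& Pose : Poses) {
        const float Dist = AngularDistance(Pose.ApproachAngleDeg, HandAngleDeg);
        if (Dist < BestDist) {
            BestDist = Dist;
            Best = Pose.PoseId;
        }
    }
    return Best;
}
```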

Distance grab strategies

Three distance grab strategies provide different remote manipulation behaviors:
// PullToHand: object flies to the grabbing hand
// RelativeToPointer: object follows pointer offset, maintaining relative position
// ManipulateInPlace: object transforms without translating toward hand
enum class EDistanceGrabStrategy { PullToHand, RelativeToPointer, ManipulateInPlace };
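The behavioral difference between the three strategies can be shown as a target-position computation. The 1D sketch below (positions along the pointer axis) is an illustration under that simplification, not the SDK's code.

```cpp
#include <cassert>

enum class EDistanceGrabStrategy { PullToHand, RelativeToPointer, ManipulateInPlace };

// Computes where the grabbed object should move, given the hand position,
// the pointer position, the object's offset from the pointer at grab time,
// and the object's current position (all 1D for brevity).
float TargetPosition(EDistanceGrabStrategy Strategy, float HandPos,
                     float PointerPos, float GrabOffset, float ObjectPos) {
    switch (Strategy) {
        case EDistanceGrabStrategy::PullToHand:
            return HandPos;                  // object flies to the grabbing hand
        case EDistanceGrabStrategy::RelativeToPointer:
            return PointerPos + GrabOffset;  // keeps its offset from the pointer
        case EDistanceGrabStrategy::ManipulateInPlace:
            return ObjectPos;                // no translation toward the hand
    }
    return ObjectPos;
}
```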

Runtime input switching

ISDK_SetControllerHandBehavior enables seamless switching between controllers and hands:
// Call ISDK_SetControllerHandBehavior to switch input mode at runtime
// The system updates all active interactors to use the new input source
// No level reload required; switching is immediate
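The "update all active interactors" behavior is essentially a broadcast to registered listeners. The standalone sketch below models that pattern; InputModeManager and its methods are hypothetical stand-ins, analogous in spirit to ISDK_SetControllerHandBehavior but not the SDK's API.

```cpp
#include <cassert>
#include <functional>
#include <utility>
#include <vector>

enum class EInputMode { Controllers, Hands };

// Hypothetical manager: switching the input mode immediately notifies every
// registered interactor, with no level reload.
class InputModeManager {
public:
    void RegisterInteractor(std::function<void(EInputMode)> OnModeChanged) {
        Interactors.push_back(std::move(OnModeChanged));
    }

    void SetInputMode(EInputMode Mode) {
        Current = Mode;
        for (auto& Notify : Interactors) Notify(Mode);  // broadcast at once
    }

    EInputMode GetInputMode() const { return Current; }

private:
    std::vector<std::function<void(EInputMode)>> Interactors;
    EInputMode Current = EInputMode::Controllers;
};
```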

Extend the sample

  • Author new grab pose data assets using the HandPoseStudio standalone level.
  • Add custom distance grab strategies by extending the grab strategy interface.
  • Combine poke and ray interactions on the same widget for multi-modal input.
  • Create new demo levels and register them with the StartLevel streaming system.