Oculus Input Test sample overview
Updated: May 11, 2026
This sample provides 17 diagnostic UI panels and 67 Enhanced Input Actions that demonstrate the full range of Meta Quest input capabilities. It covers controller inputs, hand tracking, capacitive touch, microgestures, force-sensing triggers, and system-level interaction profiles. The 3D Widget Panel architecture makes it easy to explore each input category interactively.
What you'll learn
- Configure 67 Enhanced Input Actions for controller and hand tracking inputs
- Use ControllersAndHands mode for simultaneous controller and hand input
- Build 3D Widget Panel UI architecture with paired Actor BP and Widget BP components
- Detect controller types using OpenXR interaction profiles
- Access advanced inputs including capacitive touch, microgestures, and force-sensing triggers
- Visualize input state across 17 diagnostic panels
Requirements
- Meta Quest 2, Quest 3, or Quest 3S
- Unreal Engine 5.5 with the MetaXR plugin
Clone or download the sample from the GitHub repository. Open the project in Unreal Engine 5.5 with the MetaXR plugin installed, then build and deploy to your Meta Quest device. The sample presents a menu of diagnostic panels organized by input category. Point at a panel and select it to explore each input type. In ControllersAndHands mode, use both controllers and hands simultaneously to test dual-input scenarios.
| File / Scene | What it demonstrates | Key concepts |
|---|---|---|
| Controller: Gamepad panel | Traditional gamepad-style inputs | Button presses, thumbstick axes |
| Controller: MotionController panel | 6DoF controller tracking and buttons | Position, rotation, trigger, grip |
| Hand Tracking: Pinch panel | Pinch detection per finger | Pinch state, pinch strength |
| Hand Tracking: Strength panel | Grip and pinch force values | Analog hand tracking values |
| Advanced: CapTouch panel | Capacitive touch sensing on buttons | Near-touch and touch states |
| Advanced: Microgesture panel | Fine-grained finger gestures | Microgesture detection API |
| Advanced: Touchpad panel | Touchpad input on supported controllers | 2D axis, touch state |
| System: InteractionProfile panel | OpenXR controller type detection | Interaction profile strings |
| System: HMD panel | Headset tracking and mounted state | HMD pose, proximity sensor |
| System: Performance panel | Frame timing and GPU/CPU metrics | Performance monitoring |
| 3D Widget Panel (architecture) | Actor BP + Widget BP paired panels | Reusable 3D UI pattern |
The sample displays a circular arrangement of 17 diagnostic panels in a passthrough environment. Each panel updates in real time as the user interacts with controllers or hands. The ControllersAndHands mode enables simultaneous tracking of both input modalities, showing how controllers and hands can coexist.
Input categories are organized as:
- Controller: Gamepad and MotionController panels showing button, trigger, grip, and thumbstick state
- Hand Tracking: Pinch and Strength panels showing per-finger pinch state and analog force values
- Advanced: CapTouch, Microgesture, and Touchpad panels for specialized input hardware
- System: InteractionProfile, HMD, and Performance panels for device-level information
Each diagnostic panel consists of a paired Actor Blueprint and Widget Blueprint:
```
BP_Panel_[Category] (Actor BP)
├── Contains 3D widget component
├── Handles placement and interaction
└── References WBP_Panel_[Category] (Widget BP)
    ├── Displays real-time input values
    └── Updates per-frame from Enhanced Input Actions
```
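The sample implements this pairing entirely in Blueprints. As a rough C++ sketch of the same pattern, an Actor owns a world-space `UWidgetComponent` and assigns it the matching widget class. The class and property names below are illustrative, not taken from the sample:

```cpp
// Illustrative C++ equivalent of the BP_Panel_[Category] pattern.
// APanelActor and PanelWidgetClass are hypothetical names.
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Components/WidgetComponent.h"
#include "Blueprint/UserWidget.h"
#include "PanelActor.generated.h"

UCLASS()
class APanelActor : public AActor
{
    GENERATED_BODY()

public:
    APanelActor()
    {
        // 3D widget component that renders the diagnostic UI in world space
        PanelWidget = CreateDefaultSubobject<UWidgetComponent>(TEXT("PanelWidget"));
        SetRootComponent(PanelWidget);
        PanelWidget->SetWidgetSpace(EWidgetSpace::World);
    }

protected:
    virtual void BeginPlay() override
    {
        Super::BeginPlay();
        // Pair the Actor with its Widget BP (e.g. WBP_Panel_Gamepad),
        // assigned in the editor's Details panel
        if (PanelWidgetClass)
        {
            PanelWidget->SetWidgetClass(PanelWidgetClass);
        }
    }

    // The paired Widget Blueprint class
    UPROPERTY(EditAnywhere, Category = "Panel")
    TSubclassOf<UUserWidget> PanelWidgetClass;

    UPROPERTY(VisibleAnywhere, Category = "Panel")
    TObjectPtr<UWidgetComponent> PanelWidget;
};
```

Placement and interaction logic lives on the Actor, while the widget handles the per-frame value display.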
The sample defines 67 Enhanced Input Actions across all input categories. Each action maps to a specific input signal:
```
// Example: controller trigger with analog value
IA_Trigger_Left             // Float axis: 0.0 to 1.0
IA_Trigger_Right            // Float axis: 0.0 to 1.0

// Example: hand tracking pinch
IA_Pinch_Index_Left         // Bool: pinch state
IA_PinchStrength_Index_Left // Float: pinch force
```
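The sample drives these actions from Blueprint event graphs. For reference, a minimal C++ binding through Enhanced Input might look like the sketch below; `AInputTestPawn` and its handler names are hypothetical, and the `IA_*` members are assumed to be `UInputAction*` properties pointing at the sample's action assets:

```cpp
// Hedged sketch: binding two of the sample's actions in C++.
#include "EnhancedInputComponent.h"
#include "InputActionValue.h"

void AInputTestPawn::SetupPlayerInputComponent(UInputComponent* PlayerInputComponent)
{
    Super::SetupPlayerInputComponent(PlayerInputComponent);

    if (UEnhancedInputComponent* Input = Cast<UEnhancedInputComponent>(PlayerInputComponent))
    {
        // Analog trigger: Triggered fires each frame while the axis is non-zero
        Input->BindAction(IA_Trigger_Left, ETriggerEvent::Triggered,
                          this, &AInputTestPawn::OnTriggerLeft);
        // Boolean pinch: Started fires once when the pinch begins
        Input->BindAction(IA_Pinch_Index_Left, ETriggerEvent::Started,
                          this, &AInputTestPawn::OnPinchIndexLeft);
    }
}

void AInputTestPawn::OnTriggerLeft(const FInputActionValue& Value)
{
    const float TriggerAmount = Value.Get<float>(); // 0.0 to 1.0
    // e.g. push the value to the MotionController panel's widget
}

void AInputTestPawn::OnPinchIndexLeft(const FInputActionValue& Value)
{
    // e.g. light up the index-finger row on the Pinch panel
}
```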
OpenXR interaction profiles
The sample uses OpenXR interaction profiles to detect which controller type is connected at runtime, enabling controller-specific UI and input mapping without hardcoding device types.
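At the OpenXR layer, this detection corresponds to querying the interaction profile currently bound to a top-level user path. The raw API sketch below is conceptual; the sample reaches this data through the MetaXR plugin rather than calling OpenXR directly, and `instance` and `session` are assumed to be valid handles:

```cpp
// Conceptual OpenXR query for the active interaction profile.
#include <openxr/openxr.h>
#include <cstdio>

void PrintLeftHandProfile(XrInstance instance, XrSession session)
{
    // Top-level user path for the left hand
    XrPath leftHand = XR_NULL_PATH;
    xrStringToPath(instance, "/user/hand/left", &leftHand);

    // Ask the runtime which profile is currently bound to that path
    XrInteractionProfileState state{XR_TYPE_INTERACTION_PROFILE_STATE};
    if (xrGetCurrentInteractionProfile(session, leftHand, &state) != XR_SUCCESS ||
        state.interactionProfile == XR_NULL_PATH)
    {
        return; // nothing bound yet (e.g. controller asleep or not tracked)
    }

    // Convert the path to a string such as
    // "/interaction_profiles/oculus/touch_controller"
    char buffer[256];
    uint32_t length = 0;
    xrPathToString(instance, state.interactionProfile,
                   sizeof(buffer), &length, buffer);
    std::printf("Left hand profile: %s\n", buffer);
}
```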
- Add new diagnostic panels for custom input actions in your project.
- Use the ControllersAndHands mode pattern for applications that need both input modalities.
- Adapt the 3D Widget Panel architecture for in-game debug menus or developer tools.
- Combine microgesture detection with hand tracking for gesture-based interaction systems.