Use Capsense
Updated: Jul 15, 2024
Capsense generates realistic hand animations from controller input. When a user holds a Meta Quest controller, Capsense reads the controller’s capacitive touch sensors and button states to determine finger positions, then produces matching hand poses. These are the same hand visuals you see in the Horizon Home environment.
Capsense supports two styles of hand rendering:
- Natural hand poses: The hand is rendered without a visible controller, as if the user is interacting bare-handed. Finger positions are inferred from the controller’s touch sensors.
- Controller hand poses: The hand is rendered alongside a visible controller model. Capsense adjusts the hand shape to match each controller type. Supported controllers include Quest 2, Quest 3, Quest Pro, and all future device controllers.
- Benefit from a best-in-class logical hand implementation and future improvements instead of investing in a custom implementation.
- When using Link on PC, controller pose data is unavailable while the controllers are not actively in use (for example, when they are lying on a table).
- Supported devices: Quest 2, Quest Pro, Quest 3 and all future devices.
- Fully compatible with Wide Motion Mode (WMM).
- Capsense hands and body tracking through the Movement SDK (MSDK) can run simultaneously, but each converts controller data to hand poses differently, so joint positions and orientations will differ slightly between the two.
To enable, disable, or control the use of Capsense in Unreal Engine, use the Set Controller Driven Hand Poses Blueprint node.
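If you prefer to drive this from C++ rather than Blueprint, the node corresponds to a Blueprint-callable function in the Meta XR plugin's input module. The sketch below shows one plausible call shape; the header, function library, and enum names are assumptions based on the plugin's naming conventions and may differ between plugin versions, so verify them against the generated Blueprint node in your project.

```cpp
// Hypothetical sketch — names assumed from the Meta XR plugin; not verified.
#include "OculusXRInputFunctionLibrary.h"

void AMyPawn::EnableCapsenseHands()
{
    // Request natural hand poses (hands rendered without a visible controller).
    // Other assumed enum values might select controller hand poses or disable
    // controller-driven poses entirely.
    UOculusXRInputFunctionLibrary::SetControllerDrivenHandPoses(
        EOculusXRControllerDrivenHandPoseTypes::Natural);
}
```

Because the function is Blueprint-callable, calling it from C++ and from the Blueprint node should behave identically; choose whichever fits your project's workflow.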
The Controller Driven Hand Poses Blueprint node.

How can I confirm Capsense is running on my headset?
In your headset, you should see either hands instead of controllers, or hands holding controllers. Hand pose data should also be available while the controllers are in use.
Can I evaluate the feature on my headset without changing my code?
No, using Capsense requires some code changes.