Advanced Samples
Updated: Dec 20, 2024
This section explores advanced topics, such as integrating with other SDKs,
implementing locomotion controllers that blend tracked and controller-driven
movement, driving fitness apps, and interacting with virtual objects.
- ISDK Integration: Demonstrates integrating ISDK (Interaction SDK) with a
retargeted Movement character, enabling it to interact with scene objects
using hand gestures.
- Locomotion: Explains how to implement locomotion for a retargeted Movement
character using controller inputs, translation animations, and snap-turn.
- Body Tracking for Fitness: Shows how to visualize and analyze body poses
for fitness applications, focusing on user body shape.
- Networking Loopback: Demonstrates integrating body tracking into a
networking packet queue.
- High Fidelity with Hip Pinning: Enhances a high-fidelity model by adding
legs and a seated position, detailing interactions with virtual objects and
the floor using a lightweight IK solution.
ISDK Integration
This sample enables a retargeted character to interact with virtual objects,
using the blue robot character from previous scenes. It features two blue
robots: one controlled by the user’s body and the other mirroring the user. The
scene includes a virtual mug that can be picked up, triggering a HandGrab
gesture, as demonstrated in the ISDK Samples scene HandGrabExamples. The scene
also offers toggles for displaying input and output skeletons, ensuring they
match the ISDK’s hand bone positions.
Note: This method requires the tracked and retargeted hand sizes to be
similar to avoid mesh penetration.
- Characters: One blue robot tracks the user’s body, while another
mirrors it.
- Visuals: Toggle options for bone visualizations, showing differences
between input and targeted skeletons.
- Interactions: A floating mug can be grabbed, activating specific
animations and skeleton adjustments.
- Controls: Buttons for toggling body tracking methods and calibrating
height.
Import the samples into your Unity project from
Assets/Samples/Meta Movement/<version>/Advanced Samples/Scenes/MovementISDKIntegration.unity.
Blue Robot Character Details
- Skeleton Processing: Integrates ISDK hand poses into the skeleton,
updating each game frame.
- Hand Synthesis: Applies changes from ISDK to the input skeleton for both
hands.
- Tracking Fidelity: Options to switch between inside-out body tracking
(IOBT) and basic three-point tracking.
- Height Calibration: A button to adjust the tracking height to 1.8 meters.
The Mug MSDK object is modeled after an ISDK sample in the scene
HandGrabExamples.
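To make the per-frame skeleton processing concrete, the following minimal sketch blends ISDK-driven hand bone rotations onto a retargeted skeleton in LateUpdate. The component name, bone arrays, and blend weight are illustrative assumptions, not Movement SDK types.

```csharp
using UnityEngine;

// Hypothetical sketch: blends ISDK-driven hand bone rotations onto a
// retargeted skeleton once per frame. The bone arrays and this component
// are placeholders, not Movement SDK types.
public class HandSynthesisSketch : MonoBehaviour
{
    [SerializeField] private Transform[] targetHandBones; // retargeted character's hand bones
    [SerializeField] private Transform[] isdkHandBones;   // bones posed by the Interaction SDK
    [Range(0f, 1f)] [SerializeField] private float blendWeight = 1f;

    private void LateUpdate()
    {
        // Run after animation so the ISDK pose wins over the tracked pose.
        for (int i = 0; i < targetHandBones.Length; i++)
        {
            targetHandBones[i].rotation = Quaternion.Slerp(
                targetHandBones[i].rotation,
                isdkHandBones[i].rotation,
                blendWeight);
        }
    }
}
```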
Mixing Locomotion and Body Tracking
This scene demonstrates mixing controller-based locomotion with body-tracked
animations, allowing players to move between areas using controllers and engage
in body tracking in specific zones. It features two robots, with options for
toggling the mirrored character and embodiment states.
- Characters: Main and mirrored characters are on the same layer for
consistent lighting.
- Environment: A constrained area with a floor and walls.
- Controls: Locomotion is managed by a PlayerController with various
components for movement and tracking.
Find the scene in
Assets/Samples/Meta Movement/<version>/Advanced Samples/Scenes/MovementLocomotion.unity.
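As a rough illustration of the controller-driven movement and snap-turn described above, the sketch below shows one way such a PlayerController could be structured. The input axis names and tuning values are assumptions for illustration, not the sample’s actual implementation.

```csharp
using UnityEngine;

// Hypothetical sketch of controller-driven locomotion with snap-turn,
// similar in spirit to the scene's PlayerController. Axis names and
// speeds are assumptions, not Movement SDK APIs.
public class SnapTurnLocomotionSketch : MonoBehaviour
{
    [SerializeField] private CharacterController controller;
    [SerializeField] private Transform head;            // HMD-driven camera transform
    [SerializeField] private float moveSpeed = 2f;
    [SerializeField] private float snapAngle = 45f;
    [SerializeField] private float snapThreshold = 0.8f;

    private bool _snapReady = true;

    private void Update()
    {
        // Translate along the head's facing direction from left-stick input.
        Vector2 move = new Vector2(Input.GetAxis("Horizontal"), Input.GetAxis("Vertical"));
        Vector3 forward = Vector3.ProjectOnPlane(head.forward, Vector3.up).normalized;
        Vector3 right = Vector3.Cross(Vector3.up, forward);
        controller.SimpleMove((forward * move.y + right * move.x) * moveSpeed);

        // Snap-turn on a strong right-stick flick; re-arm once the stick recenters.
        float turn = Input.GetAxis("TurnAxis"); // assumed axis name
        if (_snapReady && Mathf.Abs(turn) > snapThreshold)
        {
            transform.Rotate(0f, snapAngle * Mathf.Sign(turn), 0f);
            _snapReady = false;
        }
        else if (Mathf.Abs(turn) < 0.2f)
        {
            _snapReady = true;
        }
    }
}
```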
Body Tracking for Fitness
This sample uses body tracking to monitor a user’s exercise positions and
provide feedback on pose accuracy. It features two skeletons that compare user
alignment to a predefined target pose, with visual feedback and a counter for
successful alignments.
- Tracking: Real-time body pose adjustments based on headset data.
- UI: Elements for recording and comparing body poses.
- Feedback: Visual indicators and counters for pose matching.
The scene is available at
Assets/Samples/Meta Movement/<version>/Advanced Samples/Scenes/MovementBodyTrackingForFitness.unity.
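The pose-matching feedback loop described above can be sketched as follows, assuming a simple angular tolerance per bone; the component and its fields are hypothetical, not the sample’s actual code.

```csharp
using UnityEngine;

// Hypothetical sketch of the pose-matching loop: compares live bone
// rotations against a recorded target pose and counts successful
// alignments. Tolerance and bone sets are assumptions for illustration.
public class PoseMatchCounterSketch : MonoBehaviour
{
    [SerializeField] private Transform[] trackedBones;
    [SerializeField] private float toleranceDegrees = 15f;

    private Quaternion[] _targetPose;
    private bool _wasMatching;
    public int MatchCount { get; private set; }

    // Call from a UI button to capture the current pose as the target.
    public void RecordTargetPose()
    {
        _targetPose = new Quaternion[trackedBones.Length];
        for (int i = 0; i < trackedBones.Length; i++)
            _targetPose[i] = trackedBones[i].localRotation;
    }

    private void Update()
    {
        if (_targetPose == null) return;

        bool matching = true;
        for (int i = 0; i < trackedBones.Length; i++)
        {
            // A bone matches when its rotation is within the angular tolerance.
            if (Quaternion.Angle(trackedBones[i].localRotation, _targetPose[i]) > toleranceDegrees)
            {
                matching = false;
                break;
            }
        }

        // Count each transition into the matching state once.
        if (matching && !_wasMatching) MatchCount++;
        _wasMatching = matching;
    }
}
```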
Networking Loopback
This sample demonstrates how to incorporate body tracking into a networking
packet queue. By utilizing the NetworkPoseRetargeter and implementing the
INetworkPoseRetargeterBehaviour, 1st-person movement can be transmitted as
binary data over the network. In this sample, the data is retrieved from a
local packet queue and applied to a 3rd-person character, enabling seamless
movement replication.
- Characters: A 1st-person character acts as the network host sending data;
the data gets written to a local queue.
- A 3rd-person character is placed facing the 1st-person character and acts
as the network client; it retrieves the data from the local queue.
The scene is available at
Assets/Samples/Meta Movement/<version>/Advanced Samples/Scenes/MovementNetworking.unity.
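The loopback flow can be approximated with the following sketch, where a plain in-memory queue stands in for the network transport. None of these types are the Movement SDK’s NetworkPoseRetargeter classes; this is only an illustration of the serialize, enqueue, dequeue, apply cycle.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Hypothetical loopback sketch: the "host" serializes bone rotations to
// bytes and enqueues them; the "client" dequeues the packets and applies
// them to a second character. Assumes both characters share a bone count.
public class PoseLoopbackSketch : MonoBehaviour
{
    [SerializeField] private Transform[] sourceBones; // 1st-person character
    [SerializeField] private Transform[] targetBones; // 3rd-person character

    private readonly Queue<byte[]> _packetQueue = new Queue<byte[]>();

    private void Update()
    {
        // Host side: pack each bone rotation as four floats (16 bytes).
        byte[] packet = new byte[sourceBones.Length * 16];
        for (int i = 0; i < sourceBones.Length; i++)
        {
            Quaternion q = sourceBones[i].localRotation;
            System.Buffer.BlockCopy(new[] { q.x, q.y, q.z, q.w }, 0, packet, i * 16, 16);
        }
        _packetQueue.Enqueue(packet);

        // Client side: drain the queue and apply the received pose.
        while (_packetQueue.Count > 0)
        {
            byte[] received = _packetQueue.Dequeue();
            for (int i = 0; i < targetBones.Length; i++)
            {
                float x = System.BitConverter.ToSingle(received, i * 16);
                float y = System.BitConverter.ToSingle(received, i * 16 + 4);
                float z = System.BitConverter.ToSingle(received, i * 16 + 8);
                float w = System.BitConverter.ToSingle(received, i * 16 + 12);
                targetBones[i].localRotation = new Quaternion(x, y, z, w);
            }
        }
    }
}
```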
High Fidelity with Hip Pinning
This scene features a high-fidelity character with hip pinning, allowing
realistic seated interactions within a virtual environment. It includes
calibration for chair height and leg grounding.
- Characters: A non-mirrored character tracks the user’s movements,
alongside a mirrored counterpart.
- Lighting: Main and mirrored characters are differently illuminated.
- Interactions: Hip pinning and leg grounding for realistic seated postures.
Download the Meta Movement package from the Oculus Samples GitHub repo and
locate the scene in
Assets/Samples/Meta Movement/<version>/Advanced Samples/Scenes/MovementHipPinning.unity.
Hip Pinning Character Details
- Body Tracking: Essential components for tracking and animation
adjustments.
- Constraints: Details on the hip pinning and leg grounding setups.
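As a rough sketch of the hip pinning and leg grounding constraints described above, the component below pins the hips to a calibrated seat anchor and projects each foot onto the floor with a raycast. A real setup would also solve knee angles; all names here are illustrative assumptions, not SDK types.

```csharp
using UnityEngine;

// Hypothetical sketch of hip pinning with leg grounding: the hips stay
// pinned to a seat anchor while the feet are projected onto the floor,
// approximating the scene's lightweight IK solution.
public class HipPinningSketch : MonoBehaviour
{
    [SerializeField] private Transform hips;
    [SerializeField] private Transform seatAnchor;   // calibrated chair position
    [SerializeField] private Transform[] feet;
    [SerializeField] private LayerMask floorMask;

    private void LateUpdate()
    {
        // Pin the hips to the seat while keeping their tracked rotation.
        hips.position = seatAnchor.position;

        // Ground each foot by projecting it onto the floor below.
        foreach (Transform foot in feet)
        {
            if (Physics.Raycast(foot.position + Vector3.up * 0.5f, Vector3.down,
                    out RaycastHit hit, 1.5f, floorMask))
            {
                foot.position = new Vector3(foot.position.x, hit.point.y, foot.position.z);
            }
        }
    }
}
```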