Unreal Locomotion sample overview

Updated: May 11, 2026

Overview

This sample demonstrates six VR locomotion methods and object interaction mechanics in Unreal Engine for Meta Quest. It includes Point and Teleport (with arc preview and facing direction), avatar-based teleportation, stepped translation and rotation, grab and drag, arm swinging, and dual stick walking. The project also demonstrates polymorphic object interaction through PickupActorInterface with one-handed and two-handed objects.

What you will learn

  • Implement six different VR locomotion methods with runtime switching
  • Build teleportation with arc preview and facing direction indicator
  • Create grab and drag locomotion using hand position deltas
  • Configure arm swinging locomotion from controller velocity
  • Implement polymorphic object interactions with PickupActorInterface
  • Set up 24 Enhanced Input Actions with capacitive touch support

Requirements

  • Meta Quest 2, Quest 3, or Quest 3S
  • Unreal Engine configured for Meta Quest development

For setup instructions, see the Meta Quest Developer Hub documentation.

Get started

Clone the repository from GitHub, open the project in Unreal Engine, and build for Android. The project includes LocomotionSelectionUI for switching between locomotion methods at runtime. For detailed build and configuration steps, see the project README.

Explore the sample

| Locomotion method | What it demonstrates | Key concepts |
| --- | --- | --- |
| Point and Teleport | Arc-based teleportation with facing direction | Arc preview, destination indicator, rotation selection |
| Point and Teleport with avatar | Mannequin walks to destination | Avatar animation, pathfinding to teleport target |
| Stepped Translation/Rotation | Discrete movement increments | Fixed step distances, snap rotation |
| Grab and Drag | Hand-anchored world movement | Position delta calculation, world-space translation |
| Arm Swinging | Controller velocity-based locomotion | Velocity sampling, direction from arm swing |
| Dual Stick Walking | Continuous thumbstick movement | Left stick translate, right stick rotate |
| Interactive object | What it demonstrates | Key concepts |
| --- | --- | --- |
| Blue cubes | One-hand grab and throw | PickupActorInterface, single-hand physics |
| Gun | Two-hand aim | Bimanual aiming, forward vector from two grip points |
| Pole | Two-hand alignment | Object aligns between two grip positions |
| Bow and arrow | Draw-release with progressive haptics | Draw distance maps to haptic intensity, release to fire |

Runtime behavior

When running on a Meta Quest device, LocomotionSelectionUI presents a menu for switching between the six locomotion methods at runtime. Point and Teleport displays an arc preview with a facing direction indicator at the landing point. The avatar variant spawns a mannequin that walks to the destination before transitioning the player. Grab and Drag anchors the world to your hand position and translates based on movement deltas. Arm Swinging samples controller velocity to drive forward movement in the swing direction.

Objects implement PickupActorInterface for polymorphic interaction: blue cubes use a one-hand grab, the gun and pole use two-hand mechanics, and the bow uses a draw-release mechanic with progressive haptic feedback that intensifies as you pull the string further.
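The Grab and Drag delta calculation can be sketched as follows. This is a minimal illustration, not the sample's code: it uses a plain `Vec3` stand-in for Unreal's `FVector`, and the function name `GrabDragDelta` is hypothetical. The idea is that the player translates by the inverse of the hand's motion since the grab started, so the grabbed point in the world stays under the hand.

```cpp
#include <cassert>

// Minimal stand-in for Unreal's FVector; illustration only.
struct Vec3 {
    float X, Y, Z;
};

// Grab and Drag: while the grip is held, the player moves opposite to the
// hand's motion, as if pulling themselves through a fixed world.
// Returns the translation to apply to the player pawn this frame.
Vec3 GrabDragDelta(const Vec3& GrabAnchor, const Vec3& CurrentHandPos)
{
    // The hand moved from GrabAnchor to CurrentHandPos; translate the
    // player by the inverse delta so the grabbed point stays under the hand.
    return Vec3{ GrabAnchor.X - CurrentHandPos.X,
                 GrabAnchor.Y - CurrentHandPos.Y,
                 GrabAnchor.Z - CurrentHandPos.Z };
}
```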
The project uses 24 Enhanced Input Actions with capacitive touch support for nuanced finger detection. Rendering optimizations include instanced stereo, multi-view, dynamic foveated rendering, and disabled post-processing.

Key concepts

PickupActorInterface

PickupActorInterface provides polymorphic object interaction through a shared interface:
// All interactive objects implement PickupActorInterface
// This enables uniform grab/release handling regardless of object type
// One-hand objects (cubes): standard grab and throw
// Two-hand objects (gun, pole): secondary hand modifies behavior
class IPickupActorInterface
{
public:
    virtual ~IPickupActorInterface() = default;
    virtual void OnGrab(UMotionControllerComponent* Hand) = 0;
    virtual void OnRelease(UMotionControllerComponent* Hand) = 0;
};
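The polymorphic dispatch this enables can be sketched in plain C++ (the sample's actual interface is an Unreal UINTERFACE and receives `UMotionControllerComponent*` hands; the `OneHandCube` and `TwoHandGun` classes below are hypothetical stand-ins):

```cpp
#include <cassert>
#include <string>

// Simplified stand-ins; illustration of the dispatch pattern only.
struct MotionController { std::string Name; };

class IPickup {
public:
    virtual ~IPickup() = default;
    virtual std::string OnGrab(const MotionController& Hand) = 0;
};

// Hypothetical one-hand object: any hand grabs it directly.
class OneHandCube : public IPickup {
public:
    std::string OnGrab(const MotionController& Hand) override {
        return "cube grabbed by " + Hand.Name;
    }
};

// Hypothetical two-hand object: the second hand refines aim
// instead of taking ownership.
class TwoHandGun : public IPickup {
    bool bHeld = false;
public:
    std::string OnGrab(const MotionController& Hand) override {
        if (!bHeld) { bHeld = true; return "gun held by " + Hand.Name; }
        return "gun aimed with " + Hand.Name;
    }
};
```

Callers hold `IPickup*` and invoke `OnGrab` uniformly; each object decides what a grab means for it.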

Teleportation with arc preview

Point and Teleport uses a projectile arc to preview the landing location:
// Arc preview traces a parabolic path from the controller
// Landing point shows a facing direction indicator
// Player rotation is set based on thumbstick direction at release
// Valid destinations are filtered by NavMesh or surface type
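The parabolic preview can be sketched by sampling a constant-gravity trajectory. In Unreal this is typically done with `UGameplayStatics::PredictProjectilePath`, which also handles collision; the sketch below (with a plain `Vec3` stand-in for `FVector`, Z up) shows only the kinematics and makes no claim about the sample's tuning values.

```cpp
#include <cassert>
#include <vector>

struct Vec3 { float X, Y, Z; };  // stand-in for FVector (Z up)

// Sample points along a projectile arc launched from the controller.
std::vector<Vec3> SampleArc(Vec3 Start, Vec3 LaunchVelocity,
                            float Gravity, float StepTime, int Steps)
{
    std::vector<Vec3> Points;
    for (int i = 0; i <= Steps; ++i) {
        float t = StepTime * i;
        Points.push_back({
            Start.X + LaunchVelocity.X * t,
            Start.Y + LaunchVelocity.Y * t,
            // Constant-gravity parabola: z = z0 + vz*t - g*t^2/2
            Start.Z + LaunchVelocity.Z * t - 0.5f * Gravity * t * t });
    }
    return Points;
}
```

The last sampled point (or the first point that hits a valid surface) becomes the teleport destination where the facing indicator is drawn.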

Arm swinging locomotion

Arm Swinging samples controller velocity over multiple frames:
// Controller velocity is sampled each frame
// Movement direction is derived from the swing vector
// Speed scales with swing intensity
// Both controllers contribute to the final movement vector
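A minimal sketch of the velocity-sampling step, assuming a simple moving-average window (the window size is an illustrative value, not the sample's tuning, and `ArmSwingSpeed` is a hypothetical name):

```cpp
#include <cassert>
#include <cmath>
#include <deque>

// Average controller speed over a small window of frames; the smoothed
// value scales forward movement so single jerky frames don't spike speed.
class ArmSwingSpeed {
    std::deque<float> Samples;
    static const size_t Window = 8;  // illustrative window size
public:
    // Feed the magnitude of one controller-velocity sample per frame;
    // returns the smoothed swing intensity.
    float AddSample(float Speed) {
        Samples.push_back(Speed);
        if (Samples.size() > Window) Samples.pop_front();
        float Sum = 0.0f;
        for (float s : Samples) Sum += s;
        return Sum / static_cast<float>(Samples.size());
    }
};
```

In the full method, both controllers feed samples and the movement direction comes from the swing vector rather than the headset's facing.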

Progressive haptics for bow interaction

The bow uses draw distance to drive haptic intensity:
// Draw distance maps linearly to haptic amplitude
// Minimum draw: no haptics
// Maximum draw: full haptic intensity
// Release triggers arrow launch with velocity proportional to draw
float HapticIntensity = FMath::GetMappedRangeValueClamped(
    FVector2D(MinDraw, MaxDraw), FVector2D(0.0f, 1.0f), CurrentDraw);
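The mapping itself is a clamped linear remap. A plain C++ equivalent of `FMath::GetMappedRangeValueClamped` (so it can be read without Unreal headers) looks like this:

```cpp
#include <algorithm>
#include <cassert>

// Remap Value from [InMin, InMax] to [OutMin, OutMax], clamping at the
// ends — the same behavior as FMath::GetMappedRangeValueClamped.
float MappedRangeValueClamped(float InMin, float InMax,
                              float OutMin, float OutMax, float Value)
{
    float Alpha = (Value - InMin) / (InMax - InMin);
    Alpha = std::clamp(Alpha, 0.0f, 1.0f);
    return OutMin + Alpha * (OutMax - OutMin);
}
```

With `MinDraw` and `MaxDraw` as the input range, a draw below the minimum yields zero haptic amplitude and a full draw yields maximum amplitude.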

Runtime locomotion switching

LocomotionSelectionUI enables runtime switching between all six methods:
// LocomotionSelectionUI presents an in-VR menu
// Selecting a method activates its component and deactivates others
// No level reload required—switching is immediate
// 24 Enhanced Input Actions with cap touch handle all input modes
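The activate-one, deactivate-the-rest behavior can be sketched as follows. This is a hypothetical plain-C++ model (in the project, each method is an actor component that gets activated or deactivated); the `LocomotionSwitcher` class and mode names are illustrative:

```cpp
#include <cassert>
#include <map>
#include <string>

// Exactly one locomotion mode is active at a time; selecting a new mode
// deactivates the previous one, with no level reload.
class LocomotionSwitcher {
    std::map<std::string, bool> Active;
public:
    void RegisterMode(const std::string& Name) { Active[Name] = false; }
    void Select(const std::string& Name) {
        for (auto& Entry : Active)
            Entry.second = (Entry.first == Name);
    }
    bool IsActive(const std::string& Name) const { return Active.at(Name); }
};
```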

Extend the sample

  • Add new locomotion methods by implementing the locomotion component interface.
  • Create additional interactive objects by implementing PickupActorInterface.
  • Adjust arm swinging sensitivity and dead zones for different gameplay styles.
  • Combine locomotion methods (for example, teleport with continuous rotation).