Design

Ray casting

Updated: Mar 13, 2026
Explore how to interact with interactables that are out of reach. These guidelines can help ensure an empowering experience for your users.

Ray interaction using a virtual ray extending from the user's hand to interact with distant objects.

Usage

Direct interaction (touch, grab) suits close-range tasks because it feels intuitive. Indirect interactions use a raycast: a line projected from the input source that detects distant objects and provides a hover effect. They let users select, manipulate, or trigger virtual content positioned outside their immediate physical reach.
For expansive environments, indirect interactions work better. They also help maintain ergonomic comfort by reducing the need to reach toward or reposition around distant content.
Illustration of hand usage with ray casting.

Use indirect interactions to engage with content beyond arm's reach.

Anatomy

These are the different parts, characteristics, and frequently used terminology that you should be familiar with:
Image of raycasting anatomy and parts.
1. Targeting
For targeting distant interactables, the following input modalities can be used: Gaze, Hands, Controllers, and Head.
2. Selection
For selecting distant interactables, the following input modalities can be used: Hands, Controllers, Voice, and HMD buttons.
3. Ray or raycast
A line projected from the input source toward the target. Depending on the input method, the ray may be rendered as a visible line extending from the hand or controller toward the target.
4. Cursor
A visual indicator at the ray's intersection with an object, showing the exact hit point. The cursor can appear as a discrete shape (such as a dot or ring) or as a contextual hover effect on the target itself.
5. Indirect interaction
An interaction pattern where the user engages with an object from a distance rather than through direct contact.
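To make the ray and cursor concrete, here is a minimal sketch (purely illustrative; no particular SDK is assumed, and the function name and bounding-sphere approximation are hypothetical) that intersects a targeting ray with an object's bounding sphere. The returned hit point is where the cursor would render; `None` means the ray misses and no cursor or hover effect is shown.

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Nearest hit point of a ray and a bounding sphere, or None on a miss.

    origin/direction: ray start and normalized direction, as (x, y, z).
    center/radius: the target interactable's bounding sphere.
    """
    # Offset of the ray origin from the sphere center.
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    b = 2 * (direction[0] * ox + direction[1] * oy + direction[2] * oz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4 * c
    if disc < 0:
        return None          # ray misses: no cursor shown
    t = (-b - math.sqrt(disc)) / 2
    if t < 0:
        return None          # target is behind the input source
    return tuple(origin[i] + t * direction[i] for i in range(3))

# Ray from the hand straight ahead toward a sphere 5 m away:
hit = ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0)
# The cursor would be drawn at the hit point, (0, 0, 4).
```

In a real application the same hit test runs every frame per interactable, and the nearest hit across all objects decides where the cursor (dot, ring, or hover effect) appears.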

Variants

There are different ways to combine targeting with selection methods. The available modes depend on device capabilities and context.

Ray casting

Illustration of ray casting with hand and controller.
The primary indirect interaction method. A virtual ray extends from the user's hand or controller to interact with distant objects.

Targeting: Hands, Controllers
Selection: Hands, Controllers

Gaze targeting

Illustration of gaze targeting with eye tracking.
On devices with eye tracking, gaze can be used for targeting, combined with a hand pinch gesture or a voice command for selection. When a user picks up controllers, the system automatically switches to standard ray casting.

Targeting: Gaze
Selection: Hands, Voice

Head ray (fallback)

Illustration of head ray fallback targeting.
If the headset is unable to receive any external input, the system falls back to using the Head ray (centered in the user's field of view) for aiming and the device's volume/HMD buttons for selection.

Targeting: Head
Selection: HMD buttons
Note: Beyond its role as a fallback, the Head ray is a vital tool for development and debugging. It can be used for input suppression and input focus, and it allows developers to test gaze-based interactions on devices that do not have eye-tracking hardware.
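The fallback behavior across the variants above amounts to a priority order: controllers override gaze (as noted under Gaze targeting), hand rays are used when neither is available, and the head ray is the last resort. The sketch below is a hypothetical policy, not an actual system API:

```python
def choose_targeting(controllers_active, eye_tracking, hands_tracked):
    """Pick a targeting modality following the fallback order described above.

    Illustrative priority only; real systems expose their own input
    hierarchy and switching rules.
    """
    if controllers_active:
        return "controller ray"   # picking up controllers overrides gaze
    if eye_tracking:
        return "gaze"             # select via pinch or voice
    if hands_tracked:
        return "hand ray"
    return "head ray"             # fallback: aim with head, HMD buttons select
```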

Types

Indirect interactions are categorized by how a user triggers a response or manipulates a distant object. These mechanisms define the behavior of the virtual content once it is engaged.

Select

The most fundamental form of indirect interaction. Selection allows users to activate buttons, toggle switches, or engage with objects without physical contact.
Input and action:
  • Hand: Index-thumb pinch
  • Controllers: Trigger button
  • Voice: “Select” (voice command)

Transform

Users can manipulate an object’s properties by interacting with its bounding box handles. This allows for precise adjustments to the object’s position in space, rotation, or scale.
Input and action:
  • Hand: Index-thumb pinch-and-hold
  • Controllers: Trigger button (hold)
  • Voice: “Select” (voice command)
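As one example of handle-based manipulation, uniform scaling can be derived from how far the user drags a bounding-box corner handle relative to the object's center. This is illustrative math only, with a hypothetical function name; real SDKs ship their own transform gizmos.

```python
def scale_from_handle(center, handle_start, handle_now, start_scale):
    """Uniform scale factor from dragging a bounding-box corner handle.

    The new scale is the ratio of the handle's current distance from the
    object's center to its distance when the pinch-and-hold began.
    Positions are (x, y, z) tuples.
    """
    def dist(a, b):
        return sum((a[i] - b[i]) ** 2 for i in range(3)) ** 0.5
    return start_scale * dist(center, handle_now) / dist(center, handle_start)

# Dragging the handle from 1 m to 1.5 m from the center scales 1.0 to 1.5.
```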

Distance grab: Interactable to hand

Upon selection, the item moves rapidly toward the user’s hand and attaches to it. This is ideal for tools or items the user needs to inspect closely.

Distance grab interaction showing three methods: interactable to hand, anchor at hand, and hand to interactable.

Distance grab: Anchor at hand

This allows the user to move and “steer” an item from a distance. The object maintains its original distance from the user but follows the movement and rotation of the user’s hand.

Distance grab: Hand to interactable

Instead of moving the object, a virtual representation of the hand (a “ghost hand”) appears at the object’s distant position. This allows for precise, 1:1 manipulation of an object exactly where it sits in the environment.
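The three distance grab behaviors differ only in where the grabbed object, or the ghost hand, ends up each frame. A minimal sketch, with hypothetical mode names and (x, y, z) tuples:

```python
def grab_target_position(mode, hand_pos, hand_delta, obj_pos):
    """Where the grabbed object (or ghost hand) should be for each mode.

    hand_delta is how far the hand has moved since the grab began.
    Illustrative only; not drawn from any specific SDK.
    """
    if mode == "to_hand":
        # Interactable to hand: the object flies to and attaches at the hand.
        return hand_pos
    if mode == "anchor_at_hand":
        # Anchor at hand: the object keeps its distance but is "steered",
        # following the hand's movement.
        return tuple(obj_pos[i] + hand_delta[i] for i in range(3))
    if mode == "hand_to_interactable":
        # Hand to interactable: the object stays put; a ghost hand appears
        # at its position for 1:1 manipulation in place.
        return obj_pos
    raise ValueError(f"unknown grab mode: {mode}")
```

Rotation steering in the anchor-at-hand mode would apply the hand's rotation delta the same way; it is omitted here to keep the position logic readable.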

Next steps

Designing experiences

  • Input mappings: Understand how targeting and selection work across input methods.
  • Input hierarchy: Learn how the system prioritizes input modalities and fallback logic.
  • Touch: Explore direct interaction through touch.
  • Grab: Explore object manipulation through grab.
  • Hands: Learn about hands as an input modality for ray casting.
  • Controllers: Learn about controllers as an input modality for ray casting.

Developing experiences

  • Meta Spatial SDK
  • Unity
  • Unreal
