Design

Controllers

Updated: Mar 13, 2026
Explore how controllers can improve input in immersive experiences. This page covers controllers as an input method, outlining their benefits, challenges, capabilities, best practices, operational mechanics, and design principles.

Usage

Controllers are handheld input devices that extend the input capabilities of hands.
Equipped with buttons, joysticks, and various other controls, these devices facilitate a wider range of interactions and allow for greater precision.

Terminology

Here are the various components, key characteristics, and commonly used terms you should know:
Controller terminology
1. Trigger buttons
  • Located on the front of the controller.
  • Primarily used for selection and triggering actions.
  • They are pressure-sensitive, ensuring a responsive experience.
  • Quest Pro Controllers feature additional touch sensitivity, offering touch motion interaction options.
2. Y, X, A and B buttons
  • These are the face buttons on the controller.
  • These buttons can be used for triggering different application/context actions, depending on the design.
3. Thumbstick
  • Each controller features a thumbstick.
  • It enables smooth manipulation, such as scrolling in a UI or navigating within a virtual environment.
  • The thumbstick can also be pressed down to act as an additional button.
4. Grip buttons
  • Located on the sides of each controller.
  • Simulate gripping actions, such as holding onto interactables. Some apps use them for different purposes, like triggering an action.
5. Menu button
  • Located on the left controller.
  • This button is designed to open and close the menu of the active application, allowing access to settings, options, and other features specific to that app.
6. Meta button
  • Located on the right controller.
  • This button is used to open and close the Quest universal menu, allowing users to adjust settings, view notifications, and change or close apps.
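Thumbstick input like the smooth scrolling and navigation described above usually starts with a radial deadzone, so small drift near the stick's center is ignored. A minimal sketch, assuming a 0.15 threshold (an illustrative value, not a platform requirement):

```python
import math

def apply_radial_deadzone(x: float, y: float, deadzone: float = 0.15) -> tuple[float, float]:
    """Ignore stick input inside the deadzone, then rescale the remaining
    range so output still spans 0..1 smoothly (no jump at the edge)."""
    magnitude = math.hypot(x, y)
    if magnitude < deadzone:
        return (0.0, 0.0)
    # Rescale magnitude from [deadzone, 1] to [0, 1].
    scaled = min((magnitude - deadzone) / (1.0 - deadzone), 1.0)
    return (x / magnitude * scaled, y / magnitude * scaled)
```

Rescaling (rather than just zeroing small values) avoids a sudden jump in output when the stick crosses the deadzone boundary.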

Design

Learn about input mappings, grasp the design principles, take into account comfort factors, and explore essential dos and don’ts.

Interactions

Discover various input capabilities and interaction methods using controllers as the input modality:
Targeting
Target objects using controllers in two ways: directly by touching (colliding with) an interactable or indirectly using a ray cast.
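The indirect (ray cast) path can be sketched as a nearest-hit query against the scene's interactables. This is an illustrative sketch, not an SDK API; spheres stand in for real collision shapes:

```python
import math

def raycast_nearest(origin, direction, interactables):
    """Return the index of the nearest interactable hit by the ray, or None.
    Each interactable is (center, radius); `direction` must be normalized."""
    best_index, best_t = None, math.inf
    ox, oy, oz = origin
    dx, dy, dz = direction
    for i, ((cx, cy, cz), radius) in enumerate(interactables):
        # Vector from ray origin to sphere center.
        lx, ly, lz = cx - ox, cy - oy, cz - oz
        t = lx * dx + ly * dy + lz * dz            # projection onto the ray
        if t < 0:
            continue                               # sphere is behind the ray
        d2 = (lx * lx + ly * ly + lz * lz) - t * t  # squared perpendicular distance
        if d2 <= radius * radius and t < best_t:
            best_index, best_t = i, t
    return best_index
```

Returning only the nearest hit matches user expectations: the ray should target the first object it visually touches, not anything behind it.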
Selection
The user chooses or activates a targeted interactable with the controller, for example by pressing the trigger button.
  • Trigger buttons: Select, accept
  • Grip buttons: Grab
  • Thumbstick: Menu navigation, locomotion
  • X/A: Select, accept
  • Y/B: Back out, cancel
  • Meta button: Opens or closes the Quest universal menu. This is an OS-level feature and is not customizable.
  • Menu button: Bring up or dismiss a pause or in-app menu
These guidelines describe the recommended default behavior for most applications and situations. If there is a situation which doesn’t work well with these conventions, the application should choose whatever mapping makes the most sense.
For a comprehensive overview of all input modalities and their corresponding input mappings, please visit the following page: Input mappings
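The default conventions above can be captured as a simple lookup table in application code. A hypothetical sketch, where the control and action names are illustrative rather than SDK identifiers:

```python
# Hypothetical default action map reflecting the conventions above.
DEFAULT_MAPPING = {
    "trigger": "select",
    "grip": "grab",
    "thumbstick": "navigate",
    "a": "select",            # right controller face button
    "x": "select",            # left controller face button
    "b": "cancel",
    "y": "cancel",
    "menu": "toggle_app_menu",
    # The Meta button is OS-level and intentionally absent:
    # applications cannot remap it.
}

def action_for(control: str) -> str:
    """Resolve a physical control to its default in-app action."""
    return DEFAULT_MAPPING.get(control, "unmapped")
```

Keeping the mapping in one data structure makes it easy to deviate deliberately (per the guideline above) while leaving the defaults intact everywhere else.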

Design principles

Explore the fundamental concepts that shape intuitive and user-friendly interactions for controller input modalities.

6DOF controller best practices

The recommendations and best practices in this section are specific to 6DOF controllers, such as Meta Quest and Meta Touch controllers.
Input mapping
Input mapping involves assigning specific functions to different controls on a controller. To avoid forcing users to relearn controls or repeatedly look at their controllers, keep functions aligned with common conventions so the controllers feel intuitive. For guidelines on mapping input functions to maintain consistency across a wide array of Meta Quest titles, refer to the input mappings page.
Multimodal
Multimodal interaction means offering multiple input modalities, such as controllers, hands, and voice, for the user to interact with. By incorporating multiple modalities into an application, users can naturally choose the interaction method that serves them best in the given moment. This creates a more seamless and enjoyable experience, as well as increased accessibility.
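One way a multimodal app decides which modality is active is a simple preference order over whatever the runtime reports as available. The order below is an illustrative design choice, not a platform rule:

```python
def choose_modality(available: set[str]) -> str:
    """Pick the active input modality from what the runtime reports.
    Illustrative preference: controllers when held, otherwise hands,
    then voice."""
    for preferred in ("controllers", "hands", "voice"):
        if preferred in available:
            return preferred
    return "none"
```

Re-evaluating this each frame (or on device events) lets users put controllers down and seamlessly continue with hand tracking.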
Accessibility
Users may be left-handed, right-handed, or ambidextrous. To accommodate all users, ensure that interactions can be performed with either hand when two controllers are available. Ideally, plan to support multimodal interactions, allowing for single-controller usage. Exercise caution with two-hand interactions. For further guidance, refer to the accessibility page.
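Supporting either hand often comes down to mirroring the control bindings. A minimal sketch, assuming hypothetical `left_`/`right_` control names rather than SDK identifiers:

```python
def mirror_mapping(mapping: dict[str, str]) -> dict[str, str]:
    """Swap left/right control bindings so an app designed for one
    handedness works identically with hands swapped. Controls without
    a side prefix (e.g. 'menu') are left untouched."""
    def flip(name: str) -> str:
        if name.startswith("left_"):
            return "right_" + name[len("left_"):]
        if name.startswith("right_"):
            return "left_" + name[len("right_"):]
        return name
    return {flip(control): action for control, action in mapping.items()}
```

Exposing this as a single settings toggle is usually enough to cover left-handed users without maintaining two separate binding sets.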
Controller representation
  • Choose hand models that users can customize to increase comfort. Avoid overly realistic models that don't match the user's physical hands. Semi-transparent or robotic hands are generally more acceptable as they cater to a diverse range of users. See the hand representation page for more guidance.
  • Avoid animations that move the hands without user input, except for minimal, expected animations like gun recoil.
  • Implement object grabbing by snapping the object into the correct alignment when gripped.
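Snapping a grabbed object into alignment means placing it at a predefined grip pose relative to the hand, rather than freezing it wherever the collision happened. A simplified 2D-yaw sketch under assumed names (real engines would use full quaternion transforms):

```python
import math

def snap_to_grip(hand_position, hand_rotation, grip_offset):
    """On grab, place the object at a predefined grip pose relative to
    the hand. Positions are (x, y, z); rotation is simplified to a yaw
    angle in radians. All names here are illustrative."""
    ox, oy, oz = grip_offset
    cos_r, sin_r = math.cos(hand_rotation), math.sin(hand_rotation)
    # Rotate the grip offset by the hand's yaw, then translate.
    wx = hand_position[0] + ox * cos_r - oz * sin_r
    wz = hand_position[2] + ox * sin_r + oz * cos_r
    return (wx, hand_position[1] + oy, wz), hand_rotation
```

While the grab is held, the same pose is recomputed every frame so the object tracks the hand exactly.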
Movement representation
Maintain a 1:1 ratio between the movement of the user’s controller in the physical world and the movement of its virtual representation. This applies to both rotational and translational movement in space. If you exaggerate the user’s movement in fully immersive experiences, make it so exaggerated (for example, 4x) that it is readily obvious it is not a natural sensory experience.
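The 1:1-or-obviously-exaggerated rule can be made explicit wherever motion gain is applied; a sketch, with the guard threshold as an assumption rather than a documented limit:

```python
def virtual_motion(physical_delta: float, gain: float = 1.0) -> float:
    """Map physical controller movement to virtual movement.
    gain=1.0 keeps the recommended 1:1 ratio; any exaggeration should
    be obviously non-natural, never a subtle near-1 value that users
    might misread as their real motion."""
    assert gain == 1.0 or gain >= 2.0, "avoid near-natural gains"
    return physical_delta * gain
```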
Haptic feedback
Incorporate haptic feedback to enhance the realism of interactions and confirm user actions. For more detailed guidance, refer to the haptics page.
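When sending pulses to the hardware, it is common to clamp requested values into a comfortable range first. A sketch where the 0..1 intensity range mirrors common haptics APIs and the duration window is an illustrative choice:

```python
def clamp_pulse(intensity: float, duration_ms: float) -> tuple[float, float]:
    """Clamp a requested haptic pulse to safe, comfortable bounds
    before handing it to the hardware layer."""
    intensity = min(max(intensity, 0.0), 1.0)      # normalized amplitude
    duration_ms = min(max(duration_ms, 5.0), 2000.0)  # assumed comfort window
    return (intensity, duration_ms)
```

Short, low-intensity pulses for confirmations and longer ones for collisions keeps feedback informative without becoming fatiguing.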

Comfort

Correct hand

The hardware is ergonomically designed to ensure maximum comfort for the hands. To enhance user experience, visually indicate which controller is intended for the left or right hand. This is particularly beneficial in scenarios where controllers are set down and picked up frequently, ensuring users consistently select the correct controller for the appropriate hand.

Next steps

More design resources on controllers

Designing experiences

  • Input Modalities: Discover all the various input modalities.
  • Hands: Examine hands-based input methods.
  • Head: Examine head-based input methods.
  • Voice: Learn how to design voice-enabled experiences.
  • Peripherals: Learn how to design experiences that leverage peripherals.

Core interaction types

  • Ray casting: Interact with objects at a distance.
  • Touch: Interact with objects directly through touch.
  • Grab: Hold, move, and manipulate virtual objects.