Controllers
Explore the use of controllers to improve user experiences in immersive applications. This page covers controllers as an input method, outlining their benefits, challenges, capabilities, best practices, operational mechanics, and design principles.
Controllers are handheld input devices that can be used to extend the input capabilities of hands.
Equipped with buttons, joysticks, and various other controls, these devices facilitate a wider range of interactions and allow for greater precision.
Here are the various components, key characteristics, and commonly used terms you should know:
1. Trigger buttons
   - Located on the front of the controller.
   - Primarily used for selection and triggering actions.
   - They are pressure-sensitive, ensuring a responsive experience.
   - Quest Pro controllers feature additional touch sensitivity, offering touch-motion interaction options.
2. Y, X, A, and B buttons
   - These are the face buttons on the controller.
   - They can be used to trigger different application or context actions, depending on the design.
3. Thumbstick
   - Each controller features a thumbstick.
   - It enables smooth manipulation, such as scrolling in a UI or navigating within a virtual environment.
   - The thumbstick can also be pressed down to act as an additional button.
4. Grip buttons
   - Located on the sides of each controller.
   - Simulate gripping actions, such as holding onto interactables. Some apps use them for other purposes, like triggering an action.
5. Menu button
   - Located on the left controller.
   - Opens and closes the menu of the active application, allowing access to settings, options, and other features specific to that app.
6. Meta button
   - Located on the right controller.
   - Opens and closes the Quest universal menu, allowing users to adjust settings, view notifications, and change or close apps.
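Physical thumbsticks rarely rest at exactly (0, 0), so raw axis values are usually passed through a deadzone before driving scrolling or locomotion. Below is a minimal radial-deadzone sketch in TypeScript; the 0.15 threshold and the function name are illustrative assumptions, not Meta Quest API values:

```typescript
// Radial deadzone for thumbstick input. Magnitudes below the inner threshold are
// ignored; the remaining range is rescaled so output still reaches 1.0 at the rim.
// DEADZONE is an illustrative tuning value, not an official platform constant.
const DEADZONE = 0.15;

function applyRadialDeadzone(x: number, y: number): { x: number; y: number } {
  const magnitude = Math.hypot(x, y);
  if (magnitude < DEADZONE) {
    return { x: 0, y: 0 }; // inside the deadzone: treat the stick as centered
  }
  // Rescale so input just outside the deadzone maps to ~0, full deflection to 1.
  const scaled = Math.min((magnitude - DEADZONE) / (1 - DEADZONE), 1);
  return { x: (x / magnitude) * scaled, y: (y / magnitude) * scaled };
}
```

A radial (rather than per-axis) deadzone preserves the stick's direction, which matters for smooth locomotion.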
This section offers insights into the technology behind controllers, focusing on their functionality, accuracy, calibration, and inherent technological limitations. We also discuss strategies to mitigate these limitations, enhancing the design of effective interactions.
Meta Quest controllers are equipped with a variety of sensors, including an accelerometer and gyroscope, which continuously monitor their movement, orientation, and position. Additional sensors such as cameras and an array of infrared LEDs enhance the precision of tracking in 3D space. The Meta Quest headset's (HMD) own sensors also play a crucial role in accurately determining the controllers' location and movements. In addition to spatial tracking, the controllers feature buttons, triggers, and thumbsticks that enable users to interact with virtual content in diverse ways. These inputs are transmitted wirelessly to the Meta Quest headset and processed to produce corresponding actions.
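The accelerometer/gyroscope fusion described above can be illustrated, in greatly simplified form, with a one-axis complementary filter: the gyro rate is integrated for responsiveness, while the accelerometer's gravity-based estimate slowly corrects accumulated drift. Real tracking fuses full 6DOF state with camera data; the function names and the ALPHA constant here are illustrative assumptions:

```typescript
// Minimal one-axis complementary filter. The gyroscope is accurate over short
// intervals but drifts; the accelerometer is noisy but drift-free. Blending the
// two yields a stable pitch estimate. ALPHA is an illustrative tuning value.
const ALPHA = 0.98;

function updatePitch(
  prevPitchRad: number,
  gyroRateRadPerSec: number, // angular velocity around the pitch axis
  accelPitchRad: number,     // pitch inferred from the gravity vector
  dtSec: number
): number {
  const gyroEstimate = prevPitchRad + gyroRateRadPerSec * dtSec; // integrate rate
  return ALPHA * gyroEstimate + (1 - ALPHA) * accelPitchRad;     // drift correction
}
```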
Various factors can influence the accuracy of the controllers, including environmental variability. For further details on these challenges and the strategies used to mitigate them, refer to the Limitations and mitigations section below. Understanding these factors is crucial for designing effective and immersive user experiences.
The technology behind the controllers has evolved over time, resulting in different capabilities and best practices. To date, our controllers fall into three main groups:
3DOF controllers
These controllers have 3 degrees of freedom (3DOF), which allows for orientation tracking (pitch, yaw, and roll) but not positional tracking. This lets users interact with the virtual environment through pointing, selecting, and manipulating objects in a limited manner, suitable for simpler or seated fully immersive experiences where spatial interactions are not required.
These controllers shipped with mobile HMDs, such as Gear VR and Go. They support only one controller at a time because the HMD cannot differentiate between multiple controllers.
6DOF controllers
These controllers have 6 degrees of freedom (6DOF), enabling both orientation and positional tracking (pitch, yaw, and roll, plus movement along the x, y, and z axes). This allows controllers to function as virtual hands, interacting spatially with the virtual world.
These controllers shipped with PC-based HMDs, such as the Rift.
Self-tracked controllers
Self-tracked controllers are a type of 6DOF controller that uses onboard sensors to track their position and orientation in 3D space. They do not require external sensors or cameras to operate, making them more convenient and flexible than other types of controllers.
These controllers shipped with standalone HMDs, such as the Oculus Quest, and offer a more immersive and interactive experience for users.
Controller tracking can deliver sub-millimeter precision through the use of both classical computer vision and machine learning models. Machine perception cameras identify and triangulate the infrared LEDs on the controller, providing sub-pixel accuracy, while machine learning models are trained to estimate the pose of the controller when LEDs are occluded and in difficult lighting conditions. These systems leverage each other to maximize controller performance.
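At its core, triangulating an LED seen by two cameras means finding the 3D point nearest both viewing rays. Below is a minimal sketch, assuming known camera ray origins and unit directions; real systems solve for many LEDs jointly with sub-pixel refinement, and all names here are illustrative:

```typescript
type Vec3 = [number, number, number];

const dot = (u: Vec3, v: Vec3) => u[0] * v[0] + u[1] * v[1] + u[2] * v[2];
const add = (u: Vec3, v: Vec3): Vec3 => [u[0] + v[0], u[1] + v[1], u[2] + v[2]];
const sub = (u: Vec3, v: Vec3): Vec3 => [u[0] - v[0], u[1] - v[1], u[2] - v[2]];
const scale = (u: Vec3, s: number): Vec3 => [u[0] * s, u[1] * s, u[2] * s];

// Estimate an LED's 3D position as the midpoint of closest approach between two
// camera viewing rays (origin o, unit direction d). With noisy pixel detections
// the rays rarely intersect exactly, so the midpoint is a reasonable estimate.
function triangulate(o1: Vec3, d1: Vec3, o2: Vec3, d2: Vec3): Vec3 {
  const w = sub(o1, o2);
  const a = dot(d1, d1), b = dot(d1, d2), c = dot(d2, d2);
  const d = dot(d1, w), e = dot(d2, w);
  const denom = a * c - b * b; // approaches 0 when the rays are near-parallel
  const t1 = (b * e - c * d) / denom;
  const t2 = (a * e - b * d) / denom;
  const p1 = add(o1, scale(d1, t1)); // closest point on ray 1
  const p2 = add(o2, scale(d2, t2)); // closest point on ray 2
  return scale(add(p1, p2), 0.5);
}
```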
Limitations and mitigations
Every technology comes with its own set of limitations and challenges that must be addressed in order to ensure optimal performance and usability. In this section, we’ll delve into these aspects, discussing mitigation strategies through design or code. Let’s get started:
For Meta Quest 1/2/3 controllers, there is a fundamental limitation based on the tracking volume defined by the headset cameras. Primarily, the headset needs to see some portion of the controllers in order to estimate their pose, especially when starting tracking.
Mitigation - Design: Controllers include an inertial measurement unit (IMU), which can be used to estimate position and orientation for a short period when the controller is outside the FOV of the tracking cameras; however, this estimate drifts, suffering a rapid degradation in accuracy. Self-tracked controllers do not suffer from this limitation.
Mitigation - Code: For use cases known to stress the position or orientation of the controller relative to the tracking volume, several physics-based constraint systems assist with controller tracking.
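The IMU dead-reckoning mitigation above can be sketched as simple integration plus a decaying confidence value, so an application can react gracefully (for example, by relaxing interactions) as tracking quality degrades. This is a one-dimensional illustration; the names and constants are assumptions, not platform APIs:

```typescript
// When the controller leaves the cameras' view, the IMU can dead-reckon its pose
// for a short time, but integration error grows quickly. This sketch integrates
// acceleration and velocity while decaying a confidence score toward zero.
interface TrackedState {
  position: number;   // 1D position for brevity; real tracking is a full 3D pose
  velocity: number;
  confidence: number; // 1 = camera-confirmed, decays toward 0 while occluded
}

const CONFIDENCE_DECAY_PER_SEC = 0.5; // illustrative: trust halves each second

function deadReckon(s: TrackedState, accel: number, dtSec: number): TrackedState {
  const velocity = s.velocity + accel * dtSec;    // integrate acceleration
  const position = s.position + velocity * dtSec; // integrate velocity
  const confidence = Math.max(0, s.confidence - CONFIDENCE_DECAY_PER_SEC * dtSec);
  return { position, velocity, confidence };
}
```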
While controllers are fairly robust to a wide variety of lighting conditions, there are scenarios where specific conditions will degrade tracking quality. For Quest 1/2/3 controllers, the most impactful conditions are high-intensity lighting or direct sunlight. This is primarily due to wash-out of the infrared LEDs by the intense light emitted by the sun; however, particularly bright indoor lighting can also wash out the LEDs.
Mitigation - Design: Our devices adapt to varying light conditions, with continuous improvements through updates and new models. To ensure optimal functionality, we recommend advising users to be mindful of their environment’s lighting, for example through a note at application launch.
In this section, we provide guidance on using controllers for interaction. Learn about input primitives, grasp the design principles, take into account comfort factors, and explore essential dos and don’ts.
Discover various input capabilities and interaction methods utilizing controllers as an input modality:
| Input | Usage |
| --- | --- |
| Targeting | Target objects using controllers in two ways: directly, by touching (colliding with) an interactable, or indirectly, using a ray cast. |
| Selection | Lets the user choose or activate an interactable with the controller, for example by using the trigger button. |
| Trigger buttons | Select, accept, etc. |
| Grip buttons | Grab, etc. |
| Thumbstick | Menu navigation, locomotion, etc. |
| X/A | Select, accept, etc. |
| Y/B | Back out, cancel, etc. |
| Meta button | Opens or closes the Quest universal menu. This is an OS-level feature and is not customizable. |
| Menu button | Brings up or dismisses a pause or in-app menu. |
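Indirect targeting via ray cast often reduces to intersecting the controller's ray with proxy volumes around interactables. Below is a minimal ray-versus-sphere test with illustrative names; engines and the Meta XR SDK provide their own ray-cast utilities, so this is only a sketch of the underlying geometry:

```typescript
// Test whether a controller ray (origin + t * dir, dir unit-length) hits a
// spherical proxy around an interactable: project the center onto the ray and
// compare the closest-approach distance with the sphere radius.
function rayHitsSphere(
  origin: [number, number, number],
  dir: [number, number, number],
  center: [number, number, number],
  radius: number
): boolean {
  // Vector from the ray origin to the sphere center
  const oc = [center[0] - origin[0], center[1] - origin[1], center[2] - origin[2]];
  const tClosest = oc[0] * dir[0] + oc[1] * dir[1] + oc[2] * dir[2];
  if (tClosest < 0) return false; // sphere is behind the controller
  const closest = [
    origin[0] + dir[0] * tClosest,
    origin[1] + dir[1] * tClosest,
    origin[2] + dir[2] * tClosest,
  ];
  const dx = center[0] - closest[0];
  const dy = center[1] - closest[1];
  const dz = center[2] - closest[2];
  return dx * dx + dy * dy + dz * dz <= radius * radius;
}
```

In practice the sphere radius is often padded beyond the visual bounds of the target to make distant selection more forgiving.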
These guidelines describe the recommended default behavior for most applications and situations. If there is a situation which doesn’t work well with these conventions, the application should choose whatever mapping makes the most sense.
For a comprehensive overview of all input modalities and their corresponding Input Primitives, please visit the following page:
Input primitives

In this section, we explore the fundamental concepts that shape intuitive and user-friendly interactions for controller input modalities.
3DOF and 6DOF controller comparison

The main design limitation of 3DOF controllers compared to 6DOF controllers is that 3DOF controllers only track orientation (pitch, yaw, and roll) without positional tracking, making them suitable for simpler or seated experiences. 6DOF controllers track both orientation and position (including movement along the x, y, and z axes), allowing for more immersive and interactive spatial interactions.
6DOF controller best practices

The recommendations and best practices in this section are specific to 6DOF controllers, such as the Meta Quest and Meta Touch controllers.
Input mapping

Input mapping involves assigning specific functions to different controls on a controller. To avoid forcing users to relearn controls or repeatedly look at their controllers, it's crucial to keep functions aligned with common practices to make the c
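One way to keep bindings aligned with these conventions is a thin mapping layer that separates physical buttons from application actions, so a binding can be inspected or changed in one place. The button and action names below are hypothetical, not an official API:

```typescript
// A thin input-mapping layer: application code asks for actions, not buttons,
// so platform conventions (or user remapping) live in a single table.
type Button = "trigger" | "grip" | "a" | "b" | "thumbstickPress";
type Action = "select" | "grab" | "cancel" | "openMenu";

const defaultBindings: Record<Button, Action | null> = {
  trigger: "select",     // convention: trigger selects/accepts
  grip: "grab",          // convention: grip grabs interactables
  a: "select",
  b: "cancel",           // convention: B/Y backs out or cancels
  thumbstickPress: null, // left unbound by default
};

function actionFor(button: Button, bindings = defaultBindings): Action | null {
  return bindings[button];
}
```

Keeping the table as data (rather than scattering button checks through the app) also makes it easy to offer accessibility remapping later.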