Virtual Keyboard Overview
Virtual Keyboard enables VR developers to easily integrate a best-in-class keyboard into their applications and gives users a consistent typing experience across Meta Quest VR applications. The API is designed so that features and improvements can be delivered to the keyboard through OS updates, letting applications pick up the latest functionality without any developer modifications.
The keyboard supports multiple input modes, including far raycast-based input, direct touch input, and swipe typing for both modalities.
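To make the input model concrete, here is a minimal sketch of how an application might forward an interactor pose to the keyboard each frame. The helper name and its parameters are hypothetical, and the `XR_META_virtual_keyboard` function pointer is assumed to have been loaded via `xrGetInstanceProcAddr`; the runtime itself performs hit testing, key presses, and swipe recognition.

```cpp
#include <openxr/openxr.h>

// Hypothetical per-frame helper: forwards a far-ray interactor to the keyboard.
// xrSendVirtualKeyboardInputMETA is assumed loaded via xrGetInstanceProcAddr.
void SendRayInputToKeyboard(XrVirtualKeyboardMETA keyboard,
                            XrSpace localSpace,
                            const XrPosef& aimPose,  // e.g., hand/controller aim pose
                            bool pinchOrTriggerPressed) {
    XrVirtualKeyboardInputInfoMETA info{XR_TYPE_VIRTUAL_KEYBOARD_INPUT_INFO_META};
    // Far raycast from the right hand; direct touch would instead use
    // XR_VIRTUAL_KEYBOARD_INPUT_SOURCE_HAND_DIRECT_INDEX_TIP_RIGHT_META.
    info.inputSource = XR_VIRTUAL_KEYBOARD_INPUT_SOURCE_HAND_RAY_RIGHT_META;
    info.inputSpace = localSpace;
    info.inputPoseInSpace = aimPose;
    info.inputState = pinchOrTriggerPressed
                          ? XR_VIRTUAL_KEYBOARD_INPUT_STATE_PRESSED_BIT_META
                          : 0;

    // In/out parameter: the runtime may adjust the interactor root pose
    // (e.g., to keep a touching hand from sinking through the keys),
    // so the application should render using the returned value.
    XrPosef interactorRootPose = aimPose;
    xrSendVirtualKeyboardInputMETA(keyboard, &info, &interactorRootPose);
}
```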
While developers have full control over the position and scale of the keyboard, the extension provides default sizes and positions that have been fine-tuned for usability based on user research.
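For example, an application can request one of those tuned defaults through `xrSuggestVirtualKeyboardLocationMETA`. A sketch, again assuming the extension function has been loaded via `xrGetInstanceProcAddr`:

```cpp
#include <openxr/openxr.h>

// Sketch: place the keyboard at the runtime's research-tuned default for
// far-ray input. XR_VIRTUAL_KEYBOARD_LOCATION_TYPE_DIRECT_META requests the
// near-field default, and ..._CUSTOM_META lets the app control pose and scale.
void PlaceKeyboardAtDefault(XrVirtualKeyboardMETA keyboard, XrSpace localSpace) {
    XrVirtualKeyboardLocationInfoMETA locationInfo{XR_TYPE_VIRTUAL_KEYBOARD_LOCATION_INFO_META};
    locationInfo.locationType = XR_VIRTUAL_KEYBOARD_LOCATION_TYPE_FAR_META;
    locationInfo.space = localSpace;
    // poseInSpace and scale are only read for CUSTOM placement; for FAR and
    // DIRECT the runtime supplies its own defaults.
    xrSuggestVirtualKeyboardLocationMETA(keyboard, &locationInfo);
}
```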
There are several components to set up to provide users with a rich text input experience (the sketch after the list shows the OpenXR extensions behind each one). These components are:
- Virtual Keyboard
- Render Model
- Hands (optional)
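Each component is backed by an OpenXR extension that must be requested when the `XrInstance` is created. A minimal sketch using the standard extension name macros from `openxr.h`:

```cpp
#include <openxr/openxr.h>

// Sketch: enable the extensions backing each component at instance creation.
// XR_EXT_HAND_TRACKING_EXTENSION_NAME is only needed if the optional Hands
// component is used.
const char* extensions[] = {
    XR_META_VIRTUAL_KEYBOARD_EXTENSION_NAME,  // Virtual Keyboard
    XR_FB_RENDER_MODEL_EXTENSION_NAME,        // Render Model
    XR_EXT_HAND_TRACKING_EXTENSION_NAME,      // Hands (optional)
};

XrInstanceCreateInfo createInfo{XR_TYPE_INSTANCE_CREATE_INFO};
createInfo.enabledExtensionCount = sizeof(extensions) / sizeof(extensions[0]);
createInfo.enabledExtensionNames = extensions;
// ... fill in createInfo.applicationInfo, then:
XrInstance instance = XR_NULL_HANDLE;
xrCreateInstance(&createInfo, &instance);
```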
The Virtual Keyboard is the main component that manages a keyboard in virtual space. It lets the application position the keyboard and track its location, processes user interaction input, and delivers visual and state updates back to the application.
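A sketch of that lifecycle follows, assuming a `session`, `instance`, and `localSpace` the application already created, with the extension functions loaded via `xrGetInstanceProcAddr`; the `OnTextCommitted` and `OnBackspace` callbacks are hypothetical placeholders for the app's text handling.

```cpp
// Create the keyboard and a space that tracks it (error handling omitted).
XrVirtualKeyboardCreateInfoMETA kbCreateInfo{XR_TYPE_VIRTUAL_KEYBOARD_CREATE_INFO_META};
XrVirtualKeyboardMETA keyboard = XR_NULL_HANDLE;
xrCreateVirtualKeyboardMETA(session, &kbCreateInfo, &keyboard);

XrVirtualKeyboardSpaceCreateInfoMETA spaceCreateInfo{XR_TYPE_VIRTUAL_KEYBOARD_SPACE_CREATE_INFO_META};
spaceCreateInfo.locationType = XR_VIRTUAL_KEYBOARD_LOCATION_TYPE_FAR_META;
spaceCreateInfo.space = localSpace;
XrSpace keyboardSpace = XR_NULL_HANDLE;
xrCreateVirtualKeyboardSpaceMETA(session, keyboard, &spaceCreateInfo, &keyboardSpace);

// State updates arrive as events in the app's regular xrPollEvent loop:
XrEventDataBuffer event{XR_TYPE_EVENT_DATA_BUFFER};
while (xrPollEvent(instance, &event) == XR_SUCCESS) {
    switch (event.type) {
        case XR_TYPE_EVENT_DATA_VIRTUAL_KEYBOARD_COMMIT_TEXT_META: {
            // UTF-8 text the user typed; append it to the focused text field.
            auto* commit =
                reinterpret_cast<XrEventDataVirtualKeyboardCommitTextMETA*>(&event);
            OnTextCommitted(commit->text);  // hypothetical app callback
            break;
        }
        case XR_TYPE_EVENT_DATA_VIRTUAL_KEYBOARD_BACKSPACE_META:
            OnBackspace();  // hypothetical app callback
            break;
        default:
            break;
    }
    event = {XR_TYPE_EVENT_DATA_BUFFER};  // reset the buffer for the next poll
}
```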
The Render Model component provides the actual keyboard model to be rendered in virtual space. Note that while the application is running, new animation and texture data may be sent by the Virtual Keyboard runtime to reflect the current state of the keyboard model.
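A sketch of loading the model with `XR_FB_render_model` and polling for those per-frame updates is below. The `/model_meta/keyboard/virtual` path is the one the extension defines for the keyboard model; the second calls of each two-call size query and all error handling are omitted.

```cpp
// Fetch the keyboard's glTF model once via XR_FB_render_model
// (extension functions assumed loaded via xrGetInstanceProcAddr).
XrPath keyboardModelPath;
xrStringToPath(instance, "/model_meta/keyboard/virtual", &keyboardModelPath);

XrRenderModelPropertiesFB properties{XR_TYPE_RENDER_MODEL_PROPERTIES_FB};
xrGetRenderModelPropertiesFB(session, keyboardModelPath, &properties);

XrRenderModelLoadInfoFB loadInfo{XR_TYPE_RENDER_MODEL_LOAD_INFO_FB};
loadInfo.modelKey = properties.modelKey;
XrRenderModelBufferFB modelBuffer{XR_TYPE_RENDER_MODEL_BUFFER_FB};
xrLoadRenderModelFB(session, &loadInfo, &modelBuffer);  // first call: query size
// ... allocate modelBuffer.buffer using bufferCountOutput, set
// bufferCapacityInput, call xrLoadRenderModelFB again, then parse the
// returned glTF binary for rendering.

// Each frame, apply the runtime's state updates to the loaded model:
XrVirtualKeyboardModelAnimationStatesMETA animationStates{
    XR_TYPE_VIRTUAL_KEYBOARD_MODEL_ANIMATION_STATES_META};
xrGetVirtualKeyboardModelAnimationStatesMETA(keyboard, &animationStates);
// ... size animationStates.states from stateCountOutput, query again, and
// drive the corresponding glTF animations.

uint32_t dirtyCount = 0;
xrGetVirtualKeyboardDirtyTexturesMETA(keyboard, 0, &dirtyCount, nullptr);
// ... if dirtyCount > 0, fetch the ids, call xrGetVirtualKeyboardTextureDataMETA
// for each, and re-upload the changed textures to the GPU.
```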
Though optional, we recommend enabling the Hands component: it tracks and displays hand models in virtual space that mirror the user's actual hands, providing a more natural near-field typing experience.
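For that near-field case, here is a sketch of creating a hand tracker with `XR_EXT_hand_tracking` and feeding the index fingertip to the keyboard. `localSpace` and `predictedDisplayTime` are assumed to come from the app's frame loop, and the extension functions are assumed loaded via `xrGetInstanceProcAddr`.

```cpp
// Create a tracker for the right hand once at startup.
XrHandTrackerCreateInfoEXT trackerInfo{XR_TYPE_HAND_TRACKER_CREATE_INFO_EXT};
trackerInfo.hand = XR_HAND_RIGHT_EXT;
trackerInfo.handJointSet = XR_HAND_JOINT_SET_DEFAULT_EXT;
XrHandTrackerEXT handTracker = XR_NULL_HANDLE;
xrCreateHandTrackerEXT(session, &trackerInfo, &handTracker);

// Per frame: locate the joints, then send the index tip to the keyboard.
XrHandJointLocationEXT jointLocations[XR_HAND_JOINT_COUNT_EXT];
XrHandJointLocationsEXT locations{XR_TYPE_HAND_JOINT_LOCATIONS_EXT};
locations.jointCount = XR_HAND_JOINT_COUNT_EXT;
locations.jointLocations = jointLocations;

XrHandJointsLocateInfoEXT locateInfo{XR_TYPE_HAND_JOINTS_LOCATE_INFO_EXT};
locateInfo.baseSpace = localSpace;
locateInfo.time = predictedDisplayTime;  // from xrWaitFrame
xrLocateHandJointsEXT(handTracker, &locateInfo, &locations);

if (locations.isActive) {
    XrVirtualKeyboardInputInfoMETA info{XR_TYPE_VIRTUAL_KEYBOARD_INPUT_INFO_META};
    info.inputSource = XR_VIRTUAL_KEYBOARD_INPUT_SOURCE_HAND_DIRECT_INDEX_TIP_RIGHT_META;
    info.inputSpace = localSpace;
    info.inputPoseInSpace = jointLocations[XR_HAND_JOINT_INDEX_TIP_EXT].pose;
    // inputState is left at 0 here; with direct touch, presses are driven by
    // the fingertip pose itself (an assumption of this sketch).

    // The runtime may nudge this root pose so the rendered hand does not
    // sink through the keys during direct touch.
    XrPosef handRootPose = jointLocations[XR_HAND_JOINT_PALM_EXT].pose;
    xrSendVirtualKeyboardInputMETA(keyboard, &info, &handRootPose);
}
```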
For a complete example of how you can integrate Virtual Keyboard in your application, see the sample project under \XrSamples\XrVirtualKeyboard\ in the Mobile OpenXR SDK.