Virtual Keyboard Native Integration
Before you begin working with Virtual Keyboard, you need the following:
- A Meta Quest 2 or Meta Quest Pro headset running the latest version of the Meta Horizon OS.
- The latest version of the Oculus OpenXR Mobile SDK.
To check your version of the Meta Horizon OS:
- In the headset, go to Settings > System > Software Update.
- Check the version.
- If the version is not v54 or higher, update the software to the latest available version.
Follow the instructions here to set up your project for OpenXR.
In the Android manifest, add the following features and permissions to unlock the essential functionality for using Virtual Keyboard.
<!-- Tell the system this app uses the virtual keyboard extensions -->
<uses-feature android:name="com.oculus.feature.VIRTUAL_KEYBOARD" android:required="false" />
<!-- Tell the system this app can handle tracked remotes and hands -->
<uses-feature android:name="oculus.software.handtracking" android:required="false" />
<uses-permission android:name="com.oculus.permission.HAND_TRACKING" />
<!-- Tell the system this app uses render model extensions -->
<uses-feature android:name="com.oculus.feature.RENDER_MODEL" android:required="true" />
<uses-permission android:name="com.oculus.permission.RENDER_MODEL" />
Then, if you are using OpenXR 1.0.27 or earlier, you’ll need to include the extension header for Virtual Keyboard in your source code:
#include <openxr/meta_virtual_keyboard.h>
Creating an Instance and Session
The following extension names are needed to use Virtual Keyboard; a sketch of enabling them at instance creation follows this list:
- Virtual Keyboard:
XR_META_VIRTUAL_KEYBOARD_EXTENSION_NAME
- Render Model:
XR_FB_RENDER_MODEL_EXTENSION_NAME
- Hands (optional):
XR_EXT_HAND_TRACKING_EXTENSION_NAME
XR_FB_HAND_TRACKING_MESH_EXTENSION_NAME
XR_FB_HAND_TRACKING_AIM_EXTENSION_NAME
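These names must be passed in the enabled-extension list when you create the instance. Below is a minimal sketch; appInfo is assumed to be your filled-in XrApplicationInfo, and in practice you would first verify availability via xrEnumerateInstanceExtensionProperties.
const char* extensions[] = {
    XR_META_VIRTUAL_KEYBOARD_EXTENSION_NAME,
    XR_FB_RENDER_MODEL_EXTENSION_NAME,
    // Optional, only if using hand input:
    XR_EXT_HAND_TRACKING_EXTENSION_NAME,
    XR_FB_HAND_TRACKING_MESH_EXTENSION_NAME,
    XR_FB_HAND_TRACKING_AIM_EXTENSION_NAME,
};
XrInstanceCreateInfo instanceCreateInfo{XR_TYPE_INSTANCE_CREATE_INFO};
instanceCreateInfo.applicationInfo = appInfo; // assumed filled in elsewhere
instanceCreateInfo.enabledExtensionCount = sizeof(extensions) / sizeof(extensions[0]);
instanceCreateInfo.enabledExtensionNames = extensions;
OXR(xrCreateInstance(&instanceCreateInfo, &instance_));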
To access the OpenXR extension API, you must call xrGetInstanceProcAddr to obtain pointers to the exported functions. Call the following to obtain pointers to the Virtual Keyboard extension functions:
OXR(xrGetInstanceProcAddr(
    instance_,
    "xrCreateVirtualKeyboardMETA",
    (PFN_xrVoidFunction*)(&xrCreateVirtualKeyboardMETA_)));
OXR(xrGetInstanceProcAddr(
    instance_,
    "xrDestroyVirtualKeyboardMETA",
    (PFN_xrVoidFunction*)(&xrDestroyVirtualKeyboardMETA_)));
OXR(xrGetInstanceProcAddr(
    instance_,
    "xrCreateVirtualKeyboardSpaceMETA",
    (PFN_xrVoidFunction*)(&xrCreateVirtualKeyboardSpaceMETA_)));
OXR(xrGetInstanceProcAddr(
    instance_,
    "xrSuggestVirtualKeyboardLocationMETA",
    (PFN_xrVoidFunction*)(&xrSuggestVirtualKeyboardLocationMETA_)));
OXR(xrGetInstanceProcAddr(
    instance_,
    "xrGetVirtualKeyboardScaleMETA",
    (PFN_xrVoidFunction*)(&xrGetVirtualKeyboardScaleMETA_)));
OXR(xrGetInstanceProcAddr(
    instance_,
    "xrSetVirtualKeyboardModelVisibilityMETA",
    (PFN_xrVoidFunction*)(&xrSetVirtualKeyboardModelVisibilityMETA_)));
OXR(xrGetInstanceProcAddr(
    instance_,
    "xrGetVirtualKeyboardModelAnimationStatesMETA",
    (PFN_xrVoidFunction*)(&xrGetVirtualKeyboardModelAnimationStatesMETA_)));
OXR(xrGetInstanceProcAddr(
    instance_,
    "xrGetVirtualKeyboardDirtyTexturesMETA",
    (PFN_xrVoidFunction*)(&xrGetVirtualKeyboardDirtyTexturesMETA_)));
OXR(xrGetInstanceProcAddr(
    instance_,
    "xrGetVirtualKeyboardTextureDataMETA",
    (PFN_xrVoidFunction*)(&xrGetVirtualKeyboardTextureDataMETA_)));
OXR(xrGetInstanceProcAddr(
    instance_,
    "xrSendVirtualKeyboardInputMETA",
    (PFN_xrVoidFunction*)(&xrSendVirtualKeyboardInputMETA_)));
OXR(xrGetInstanceProcAddr(
    instance_,
    "xrChangeVirtualKeyboardTextContextMETA",
    (PFN_xrVoidFunction*)(&xrChangeVirtualKeyboardTextContextMETA_)));
Similarly, for the Render Model extension, you will need:
OXR(xrGetInstanceProcAddr(
    instance_,
    "xrEnumerateRenderModelPathsFB",
    (PFN_xrVoidFunction*)(&xrEnumerateRenderModelPathsFB_)));
OXR(xrGetInstanceProcAddr(
    instance_,
    "xrGetRenderModelPropertiesFB",
    (PFN_xrVoidFunction*)(&xrGetRenderModelPropertiesFB_)));
OXR(xrGetInstanceProcAddr(
    instance_,
    "xrLoadRenderModelFB",
    (PFN_xrVoidFunction*)(&xrLoadRenderModelFB_)));
Finally, for Hands:
OXR(xrGetInstanceProcAddr(
    instance_,
    "xrCreateHandTrackerEXT",
    (PFN_xrVoidFunction*)(&xrCreateHandTrackerEXT_)));
OXR(xrGetInstanceProcAddr(
    instance_,
    "xrDestroyHandTrackerEXT",
    (PFN_xrVoidFunction*)(&xrDestroyHandTrackerEXT_)));
OXR(xrGetInstanceProcAddr(
    instance_,
    "xrLocateHandJointsEXT",
    (PFN_xrVoidFunction*)(&xrLocateHandJointsEXT_)));
OXR(xrGetInstanceProcAddr(
    instance_,
    "xrGetHandMeshFB",
    (PFN_xrVoidFunction*)(&xrGetHandMeshFB_)));
You can initialize and manage these extensions however you wish. We recommend breaking them apart into helper classes that own their individual responsibilities; see XrVirtualKeyboardHelper, XrHandHelper, and XrRenderModelHelper in the sample project for details.
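For reference, such a helper might look like the following sketch. This shows only the pattern, not the actual layout of the sample’s classes.
// A sketch of the helper-class pattern; not the sample's actual class layout.
class XrVirtualKeyboardHelper {
   public:
    bool Init(XrInstance instance) {
        // Resolve and own the extension entry points in one place.
        return XR_SUCCEEDED(xrGetInstanceProcAddr(
            instance,
            "xrCreateVirtualKeyboardMETA",
            (PFN_xrVoidFunction*)(&xrCreateVirtualKeyboardMETA_)));
    }

   private:
    PFN_xrCreateVirtualKeyboardMETA xrCreateVirtualKeyboardMETA_ = nullptr;
    // ... remaining Virtual Keyboard function pointers
};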
Before using Virtual Keyboard, you first need to check whether the feature is supported on your device. You can do this by calling xrGetSystemProperties and passing XrSystemVirtualKeyboardPropertiesMETA via the XrSystemProperties::next pointer. If supportsVirtualKeyboard is XR_TRUE, you are ready to use Virtual Keyboard.
bool IsVirtualKeyboardSupported() {
    XrSystemProperties systemProperties{XR_TYPE_SYSTEM_PROPERTIES};
    XrSystemVirtualKeyboardPropertiesMETA virtualKeyboardProps{
        XR_TYPE_SYSTEM_VIRTUAL_KEYBOARD_PROPERTIES_META};
    systemProperties.next = &virtualKeyboardProps;
    OXR(xrGetSystemProperties(instance_, systemId_, &systemProperties));
    return virtualKeyboardProps.supportsVirtualKeyboard == XR_TRUE;
}
Create Keyboard and Space
Next, you need to create a keyboard instance and obtain an XrVirtualKeyboardMETA handle. This handle allows you to reference the keyboard for subsequent operations such as handling input and getting state updates.
XrVirtualKeyboardCreateInfoMETA createInfo{XR_TYPE_VIRTUAL_KEYBOARD_CREATE_INFO_META};
OXR(xrCreateVirtualKeyboardMETA_(session_, &createInfo, &keyboardHandle_));
You will also need to create an XrSpace so you can locate the keyboard later. Optionally, you can set the starting location where the keyboard should be placed. The locationType parameter controls whether to use a custom location that you specify or the default recommended by the runtime based on the input mode. Refer to the XrVirtualKeyboardLocationTypeMETA enum in the spec for more information.
The following is an example of how to specify a custom location.
XrVirtualKeyboardSpaceCreateInfoMETA spaceCreateInfo{XR_TYPE_VIRTUAL_KEYBOARD_SPACE_CREATE_INFO_META};
spaceCreateInfo.locationType = XR_VIRTUAL_KEYBOARD_LOCATION_TYPE_CUSTOM_META;
spaceCreateInfo.space = GetLocalSpace();
spaceCreateInfo.poseInSpace = ToXrPosef(OVR::Posef::Identity());
OXR(xrCreateVirtualKeyboardSpaceMETA_(session_, keyboardHandle_, &spaceCreateInfo, &keyboardSpace_));
Load Keyboard Render Model
To load the Virtual Keyboard render model, you must first get the corresponding model key by enumerating all available render model paths. This involves making a few calls to retrieve the paths and their properties as shown in the example below. The model key we need is for the path “/model_meta/keyboard/virtual”.
XrRenderModelKeyFB GetVirtualKeyboardModelKey() {
    XrRenderModelKeyFB modelKey = XR_NULL_RENDER_MODEL_KEY_FB;
    // First call gets the path count; second call fills the path infos.
    uint32_t pathCount = 0;
    OXR(xrEnumerateRenderModelPathsFB_(session_, pathCount, &pathCount, nullptr));
    std::vector<XrRenderModelPathInfoFB> pathInfos(
        pathCount, {XR_TYPE_RENDER_MODEL_PATH_INFO_FB});
    OXR(xrEnumerateRenderModelPathsFB_(session_, pathCount, &pathCount, pathInfos.data()));
    for (const auto& info : pathInfos) {
        char pathString[XR_MAX_PATH_LENGTH];
        uint32_t countOutput = 0;
        xrPathToString(instance_, info.path, XR_MAX_PATH_LENGTH, &countOutput, pathString);
        XrRenderModelPropertiesFB prop{XR_TYPE_RENDER_MODEL_PROPERTIES_FB};
        // Request the glTF subset capability the keyboard model requires.
        XrRenderModelCapabilitiesRequestFB capReq{XR_TYPE_RENDER_MODEL_CAPABILITIES_REQUEST_FB};
        capReq.flags = XR_RENDER_MODEL_SUPPORTS_GLTF_2_0_SUBSET_2_BIT_FB;
        prop.next = &capReq;
        OXR(xrGetRenderModelPropertiesFB_(session_, info.path, &prop));
        if (strcmp(pathString, "/model_meta/keyboard/virtual") == 0) {
            modelKey = prop.modelKey;
            break;
        }
    }
    return modelKey;
}
Note that the Virtual Keyboard model requires XR_RENDER_MODEL_SUPPORTS_GLTF_2_0_SUBSET_2_BIT_FB support, as it contains multiple meshes and textures with transparency.
Once the render model key is obtained, you can proceed to load the actual model data. Below is an example of how to do this.
std::vector<uint8_t> LoadVirtualKeyboardModel(XrRenderModelKeyFB modelKey) {
    XrRenderModelLoadInfoFB loadInfo = {XR_TYPE_RENDER_MODEL_LOAD_INFO_FB};
    loadInfo.modelKey = modelKey;
    // First call gets the required buffer size; second call fills the buffer.
    XrRenderModelBufferFB renderModelBuffer{XR_TYPE_RENDER_MODEL_BUFFER_FB};
    OXR(xrLoadRenderModelFB_(session_, &loadInfo, &renderModelBuffer));
    std::vector<uint8_t> buffer(renderModelBuffer.bufferCountOutput);
    renderModelBuffer.buffer = buffer.data();
    renderModelBuffer.bufferCapacityInput = renderModelBuffer.bufferCountOutput;
    OXR(xrLoadRenderModelFB_(session_, &loadInfo, &renderModelBuffer));
    return buffer;
}
If the above code runs successfully, you will have a data buffer containing the raw render model data for Virtual Keyboard. This raw data is in glTF binary format (i.e. a *.glb file), and you must parse it before using it, as shown in the following section.
Render Keyboard Model
As mentioned above, the model data is in *.glb format, so you must parse it before you can render the keyboard model. If you are using the SampleXrFramework provided by the SDK, you can do so using the following function:
keyboardModel = LoadModelFile_glB(
    "keyboard", (const char*)buffer.data(), buffer.size(), programs, materials);
where programs and materials are arguments that allow you to control how the model will be rendered.
Custom Image URI Handling
The glTF render model for Virtual Keyboard uses a custom image URI for textures that the application needs to update dynamically. The runtime may use this to apply changes to character glyphs, provide suggestion words for typeahead and swipe typing, and show visual effects such as the swipe typing trail.
This requires the application to implement a custom URI handler when parsing the glTF model data and to create writable textures that can later be referenced by a texture ID.
The custom image URI will have the following format:
metaVirtualKeyboard://texture/{textureID}?w={width}&h={height}&fmt=RGBA32
Here is an example of how to add a custom handler with the SampleXrFramework provided by the SDK:
OVRFW::MaterialParms materials = {};
materials.ImageUriHandler = [this](OVRFW::ModelFile& modelFile, const std::string& uri) {
    uint64_t textureId;
    uint32_t pixelWidth;
    uint32_t pixelHeight;
    if (!ParseImageUri(uri, textureId, pixelWidth, pixelHeight)) {
        return false;
    }
    // Add texture to our model
    OVRFW::ModelTexture tex;
    tex.name = std::to_string(textureId);
    tex.texid = CreateGlTexture(pixelWidth, pixelHeight);
    modelFile.Textures.push_back(tex);
    // Register texture
    textureIdMap_[textureId] = tex.texid;
    ALOG("Registered texture %d, %ux%u", (int)textureId, pixelWidth, pixelHeight);
    return true;
};
where materials is the argument passed into LoadModelFile_glB. Here is the code that handles the actual URI parsing:
bool ParseImageUri(
    const std::string& uri,
    uint64_t& textureId,
    uint32_t& pixelWidth,
    uint32_t& pixelHeight) {
    // URI format:
    // metaVirtualKeyboard://texture/{textureID}?w={width}&h={height}&fmt=RGBA32
    auto getToken = [&uri](size_t startIdx, char delimiter, std::string& token) {
        const size_t endIdx = uri.find_first_of(delimiter, startIdx);
        if (endIdx == std::string::npos) {
            return false;
        }
        token = uri.substr(startIdx, endIdx - startIdx);
        return true;
    };
    // Validate scheme
    std::string token;
    size_t index = 0;
    if (!getToken(index, ':', token) || token != "metaVirtualKeyboard") {
        return false;
    }
    // Validate resource type
    index += token.size() + 3; // skip "://"
    if (!getToken(index, '/', token) || token != "texture") {
        return false;
    }
    // Get texture id
    index += token.size() + 1; // skip "/"
    if (!getToken(index, '?', token)) {
        return false;
    }
    textureId = std::stoull(token);
    // Get pixel width
    index += token.size() + 3; // skip "?w="
    if (!getToken(index, '&', token)) {
        return false;
    }
    pixelWidth = std::stoul(token);
    // Get pixel height
    index += token.size() + 3; // skip "&h="
    if (!getToken(index, '&', token)) {
        return false;
    }
    pixelHeight = std::stoul(token);
    // Validate format
    index += token.size();
    if (uri.substr(index) != "&fmt=RGBA32") {
        return false;
    }
    return true;
}
Support Additive Morph Target Animations
Furthermore, the runtime utilizes additive morph target animations to control the mesh vertex positions and UVs. This is needed so that dynamically generated textures with suggestion words of arbitrary lengths can be properly mapped onto the keyboard model keys.
For this, the glTF model uses the “extras” property on the animation channel and writes an integer value named additiveWeightIndex. This value indicates which morph target weight the animation should be additively applied to. If the value for additiveWeightIndex is -1, then the animation should instead update all the weights.
For the Virtual Keyboard model, each key has a quad with 4 vertices onto which we map a rectangular region of our texture atlas. For an arbitrary word on the atlas, the XY coordinates of the quad’s vertex positions as well as the UVs need to be adjusted. This requires 16 morph target animations applied additively in order to control each value separately.
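For illustration, applying a sampled channel value additively could look like the following sketch, where AnimationChannel and Node are hypothetical stand-ins for your own model representation:
// Hypothetical types; `additiveWeightIndex` is assumed to have been read from
// the animation channel's glTF "extras" property during parsing.
void ApplyAdditiveMorphWeight(const AnimationChannel& channel, float sampledWeight, Node& node) {
    if (channel.additiveWeightIndex >= 0) {
        // Additively apply to the single morph target weight this channel targets.
        node.morphTargetWeights[channel.additiveWeightIndex] += sampledWeight;
    } else {
        // additiveWeightIndex == -1: the animation updates all the weights instead.
        for (float& weight : node.morphTargetWeights) {
            weight += sampledWeight;
        }
    }
}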
Even though the runtime handles any user interaction with the keyboard based on the input sent by the application, the application is responsible for managing how the keyboard collides with other objects in the scene. To support this, the glTF model contains a node named “collision” that the application can look up. The mesh geometry and bounds for this node can be used to define colliders within the application’s choice of physics system.
See VirtualKeyboardModelRenderer in the sample project for an example of how this is used to handle collision with the UI.
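For instance, the lookup might be sketched as follows, where FindNodeByName, GetLocalBounds, and CreateBoxCollider are hypothetical stand-ins for your model and physics APIs:
// Sketch only; these helpers are hypothetical, not part of the SDK.
const ModelNode* collisionNode = FindNodeByName(keyboardModel, "collision");
if (collisionNode != nullptr) {
    const Bounds3f bounds = collisionNode->GetLocalBounds();
    physicsSystem.CreateBoxCollider(bounds, currentPose_, currentScale_);
}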
When the user wants to use the keyboard, the application can show it by calling xrSetVirtualKeyboardModelVisibilityMETA. This requests that the runtime update the model visibility. However, the application should wait until the XrEventDataVirtualKeyboardShownMETA event is triggered, which indicates that the runtime has acknowledged the request and is ready to handle input.
Here is how to update the keyboard visibility:
XrVirtualKeyboardModelVisibilitySetInfoMETA modelVisibility{XR_TYPE_VIRTUAL_KEYBOARD_MODEL_VISIBILITY_SET_INFO_META};
modelVisibility.visible = XR_TRUE;
OXR(xrSetVirtualKeyboardModelVisibilityMETA_(keyboardHandle_, &modelVisibility));
Set Keyboard Pose and Scale
The application can request that the runtime change the keyboard pose and scale by calling xrSuggestVirtualKeyboardLocationMETA. As with xrCreateVirtualKeyboardSpaceMETA, you can either use custom pose and scale values or use the runtime-recommended defaults by specifying the location type.
To reset the keyboard to the default recommended location for direct typing, you can do:
XrVirtualKeyboardLocationInfoMETA locationInfo{XR_TYPE_VIRTUAL_KEYBOARD_LOCATION_INFO_META};
locationInfo.locationType = XR_VIRTUAL_KEYBOARD_LOCATION_TYPE_DIRECT_META;
locationInfo.space = GetLocalSpace();
OXR(xrSuggestVirtualKeyboardLocationMETA_(keyboardHandle_, &locationInfo));
To move the keyboard to a specific location and scale, you can do:
XrVirtualKeyboardLocationInfoMETA locationInfo{XR_TYPE_VIRTUAL_KEYBOARD_LOCATION_INFO_META};
locationInfo.locationType = XR_VIRTUAL_KEYBOARD_LOCATION_TYPE_CUSTOM_META;
locationInfo.space = GetLocalSpace();
locationInfo.poseInSpace = <newPose>;
locationInfo.scale = <newScale>;
OXR(xrSuggestVirtualKeyboardLocationMETA_(keyboardHandle_, &locationInfo));
Note that after calling xrSuggestVirtualKeyboardLocationMETA, the application should get the actual pose and scale from the runtime, as the runtime has final control over where the keyboard can be.
To get the keyboard pose and scale, you can use xrLocateSpace and xrGetVirtualKeyboardScaleMETA as follows:
XrSpaceLocation location{XR_TYPE_SPACE_LOCATION};
OXR(xrLocateSpace(keyboardSpace_, GetLocalSpace(), ToXrTime(OVRFW::GetTimeInSeconds()), &location));
float scale = 1.0f;
OXR(xrGetVirtualKeyboardScaleMETA_(keyboardHandle_, &scale));
currentPose_ = FromXrPosef(location.pose);
currentScale_ = scale;
To aid features such as typeahead prediction and whole-word deletion, you should inform the keyboard of the current text string being edited by calling xrChangeVirtualKeyboardTextContextMETA. Here is an example of how you can use this function:
XrVirtualKeyboardTextContextChangeInfoMETA changeInfo{XR_TYPE_VIRTUAL_KEYBOARD_TEXT_CONTEXT_CHANGE_INFO_META};
changeInfo.textContext = textString.c_str();
OXR(xrChangeVirtualKeyboardTextContextMETA_(keyboardHandle_, &changeInfo));
We recommend that you call this each time the input focus is changed to a different text input field.
Virtual Keyboard supports both raycast and direct touch input modalities. To feed user interaction data for keyboard input processing, you should call xrSendVirtualKeyboardInputMETA for each active input device (i.e. hand or controller) during the application frame update.
To specify the type of input for each device, use the XrVirtualKeyboardInputSourceMETA enum. For example, to send a right controller raycast with the index trigger pressed, you can do:
XrVirtualKeyboardInputInfoMETA info{XR_TYPE_VIRTUAL_KEYBOARD_INPUT_INFO_META};
info.inputSource = XR_VIRTUAL_KEYBOARD_INPUT_SOURCE_CONTROLLER_RAY_RIGHT_META;
info.inputSpace = GetLocalSpace();
info.inputPoseInSpace = GetRightControllerAimPose();
info.inputState = XR_VIRTUAL_KEYBOARD_INPUT_STATE_PRESSED_BIT_META;
OXR(xrSendVirtualKeyboardInputMETA_(keyboardHandle_, &info, &interactorRootPose));
To send a left hand direct touch with the index finger, you can do:
XrVirtualKeyboardInputInfoMETA info{XR_TYPE_VIRTUAL_KEYBOARD_INPUT_INFO_META};
info.inputSource = XR_VIRTUAL_KEYBOARD_INPUT_SOURCE_HAND_DIRECT_INDEX_TIP_LEFT_META;
info.inputSpace = GetLocalSpace();
info.inputPoseInSpace = GetLeftHandJointPose(XR_HAND_JOINT_INDEX_TIP_EXT);
OXR(xrSendVirtualKeyboardInputMETA_(keyboardHandle_, &info, &interactorRootPose));
The application can decide whether it should send raycast or touch input for the keyboard to process. If both raycast and direct input types are sent, the runtime may decide which one is the most appropriate to use.
Note that in the above examples, the value of each input device’s root pose is sent via interactorRootPose. This is an in-out parameter: the runtime may modify the value for touch limiting, which means the runtime is suggesting where the input device should be repositioned as visual feedback that the touch input has been acknowledged. If touch limiting is desired, you should reposition the input render models with the modified root pose.
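Putting this together, a per-frame controller update might look like the following sketch; GetRightControllerRootPose, IsTriggerPressed, and SetControllerRenderPose are hypothetical helpers:
// Send input and honor any touch-limiting adjustment from the runtime.
XrPosef interactorRootPose = GetRightControllerRootPose(); // hypothetical helper
XrVirtualKeyboardInputInfoMETA info{XR_TYPE_VIRTUAL_KEYBOARD_INPUT_INFO_META};
info.inputSource = XR_VIRTUAL_KEYBOARD_INPUT_SOURCE_CONTROLLER_RAY_RIGHT_META;
info.inputSpace = GetLocalSpace();
info.inputPoseInSpace = GetRightControllerAimPose();
info.inputState = IsTriggerPressed() // hypothetical helper
    ? XR_VIRTUAL_KEYBOARD_INPUT_STATE_PRESSED_BIT_META
    : XrVirtualKeyboardInputStateFlagsMETA(0);
OXR(xrSendVirtualKeyboardInputMETA_(keyboardHandle_, &info, &interactorRootPose));
// If the runtime moved the pose for touch limiting, render the controller there.
SetControllerRenderPose(interactorRootPose); // hypothetical helper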
After the runtime finishes processing the input sent by the application, it may generate new render model animations and texture updates that you should retrieve to reflect the current state of the keyboard.
During each frame, you should call xrGetVirtualKeyboardDirtyTexturesMETA to check for any keyboard render model textures that need to be updated. The runtime may use this to apply changes to character glyphs, provide suggestion words for typeahead and swipe typing, and show visual effects such as the swipe typing trail. The function returns a list of texture IDs, which you should iterate through, calling xrGetVirtualKeyboardTextureDataMETA to get the actual texture pixel data. You should then apply the data to the corresponding textures based on the IDs you cached when parsing the glTF image URIs.
Here is an example of how to get the texture data:
uint32_t textureIDCount = 0;
OXR(xrGetVirtualKeyboardDirtyTexturesMETA_(keyboardHandle_, 0, &textureIDCount, nullptr));
if (textureIDCount > 0) {
    std::vector<uint64_t> textureIDs(textureIDCount);
    OXR(xrGetVirtualKeyboardDirtyTexturesMETA_(
        keyboardHandle_, textureIDCount, &textureIDCount, textureIDs.data()));
    for (const uint64_t textureID : textureIDs) {
        // First call gets the required buffer size; second call fills the buffer.
        XrVirtualKeyboardTextureDataMETA textureData{XR_TYPE_VIRTUAL_KEYBOARD_TEXTURE_DATA_META};
        OXR(xrGetVirtualKeyboardTextureDataMETA_(keyboardHandle_, textureID, &textureData));
        std::vector<uint8_t> textureDataBuffer(textureData.bufferCountOutput);
        textureData.bufferCapacityInput = textureData.bufferCountOutput;
        textureData.buffer = textureDataBuffer.data();
        OXR(xrGetVirtualKeyboardTextureDataMETA_(keyboardHandle_, textureID, &textureData));
        // Apply `textureData` to the texture referenced by `textureID`
    }
}
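For example, if the image URI handler created OpenGL textures as shown earlier, applying an update might look like this sketch (assuming OpenGL ES and the textureIdMap_ populated during glTF parsing):
// Upload the new RGBA pixels into the GL texture registered for this texture ID.
glBindTexture(GL_TEXTURE_2D, textureIdMap_[textureID]);
glTexSubImage2D(
    GL_TEXTURE_2D,
    0, // mip level
    0, // x offset
    0, // y offset
    textureData.textureWidth,
    textureData.textureHeight,
    GL_RGBA,
    GL_UNSIGNED_BYTE,
    textureDataBuffer.data());
glBindTexture(GL_TEXTURE_2D, 0);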
After applying texture updates, you should then get new animation states for the keyboard render model. The runtime uses animations to control different parts of the model, including node visibility and geometry manipulation via additive morph targets.
Since a node’s geometry may be updated by multiple animations (e.g. separate updates to vertex positions and UVs), we recommend marking updated nodes as dirty and rebuilding the surface geometry only after all animation states have been applied.
Here is an example of retrieving animation states for the render model:
XrVirtualKeyboardModelAnimationStatesMETA modelAnimationStates{
    XR_TYPE_VIRTUAL_KEYBOARD_MODEL_ANIMATION_STATES_META};
OXR(xrGetVirtualKeyboardModelAnimationStatesMETA_(keyboardHandle_, &modelAnimationStates));
if (modelAnimationStates.stateCountOutput > 0) {
    std::vector<XrVirtualKeyboardAnimationStateMETA> animationStatesBuffer(
        modelAnimationStates.stateCountOutput, {XR_TYPE_VIRTUAL_KEYBOARD_ANIMATION_STATE_META});
    modelAnimationStates.stateCapacityInput = modelAnimationStates.stateCountOutput;
    modelAnimationStates.states = animationStatesBuffer.data();
    OXR(xrGetVirtualKeyboardModelAnimationStatesMETA_(keyboardHandle_, &modelAnimationStates));
    for (uint32_t i = 0; i < modelAnimationStates.stateCountOutput; ++i) {
        const auto& animationState = modelAnimationStates.states[i];
        // 1) Look up the model animation indexed by `animationState.animationIndex`
        // 2) Set the animation timeline to `animationState.fraction`
        // 3) If the animation is a morph target, mark the node geo dirty
    }
    // Rebuild dirty geo here
}
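The dirty-marking step itself can be as simple as the following sketch, where dirtyNodes_ and RebuildSurfaceGeometry are hypothetical:
// After all animation states are applied, rebuild each dirty node's geometry once.
for (ModelNode* node : dirtyNodes_) {
    node->RebuildSurfaceGeometry(); // re-applies base vertices plus additive morph offsets
}
dirtyNodes_.clear();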
The Virtual Keyboard runtime may trigger events based on the keyboard input and other backend services. You should add handlers to listen for the following events and react accordingly:
void HandleXrEvents() override {
    XrEventDataBuffer eventDataBuffer{};
    // Poll for events
    for (;;) {
        XrEventDataBaseHeader* baseEventHeader = (XrEventDataBaseHeader*)(&eventDataBuffer);
        baseEventHeader->type = XR_TYPE_EVENT_DATA_BUFFER;
        baseEventHeader->next = nullptr;
        XrResult r = xrPollEvent(Instance, &eventDataBuffer);
        if (r != XR_SUCCESS) {
            break;
        }
        switch (baseEventHeader->type) {
            case XR_TYPE_EVENT_DATA_VIRTUAL_KEYBOARD_COMMIT_TEXT_META: {
                auto commitTextEvent =
                    (const XrEventDataVirtualKeyboardCommitTextMETA*)(baseEventHeader);
                OnCommitText(commitTextEvent->text);
            } break;
            case XR_TYPE_EVENT_DATA_VIRTUAL_KEYBOARD_BACKSPACE_META: {
                OnBackspace();
            } break;
            case XR_TYPE_EVENT_DATA_VIRTUAL_KEYBOARD_ENTER_META: {
                OnEnter();
            } break;
            case XR_TYPE_EVENT_DATA_VIRTUAL_KEYBOARD_SHOWN_META: {
                OnKeyboardShown();
            } break;
            case XR_TYPE_EVENT_DATA_VIRTUAL_KEYBOARD_HIDDEN_META: {
                OnKeyboardHidden();
            } break;
            …
        }
    }
}
The XrEventDataVirtualKeyboardCommitTextMETA event is sent when the runtime determines that a character or word has been entered. The application should append the result to the string corresponding to the focused input field.
The XrEventDataVirtualKeyboardBackspaceMETA event is sent when the Backspace key is pressed on the keyboard. The application should remove a single character from the string corresponding to the focused input field if it’s not empty. Note that long-pressing the Backspace key repeats this event.
The XrEventDataVirtualKeyboardEnterMETA event is sent when the Enter key is pressed on the keyboard. The application can pipe this event to whatever component expects to handle it.
The XrEventDataVirtualKeyboardShownMETA and XrEventDataVirtualKeyboardHiddenMETA events signal that the runtime has updated the visibility state of the keyboard render model via the animation system. The application should listen for these events to react to the change in keyboard visibility (e.g. show/hide UI elements).
Finally, when you are done using the keyboard (e.g. at application shutdown), you should clean up by calling xrDestroyVirtualKeyboardMETA to release the resources referenced by the keyboard handle.
OXR(xrDestroyVirtualKeyboardMETA_(keyboardHandle_));
keyboardHandle_ = XR_NULL_HANDLE;