Creating Actions, Action Sets, and Suggested Bindings
This topic discusses details of actions, action sets, and suggested bindings in OpenXR Native development. Familiarize yourself with the concepts outlined in the
Input API topic before proceeding.
In OpenXR, an interaction profile represents a specific piece of hardware, typically a physical input device. Each profile consists of a path that identifies the hardware plus a list of paths for the input components that apps can bind actions against. The OpenXR specification lists these paths for each interaction profile so that apps can set up action bindings.
The app suggests action bindings against one or more interaction profiles. Each listed profile represents a device that the app was built for and tested with. If the app runs on a system whose input device has no suggested bindings, the runtime can run the app in a compatibility mode using any interaction profile that does have suggested bindings.
A special exception applies here: while most interaction profiles refer to concrete hardware, /interaction_profiles/khr/simple_controller does not. Tracked hand controllers widely support it, and it's typically used in simple apps with few input buttons, or as a fallback in complex apps.
For example, the Khronos Group’s
hello_xr sample app:
- Lists that it supports a simple controller based on the core (KHR) interaction profile, for example, input that relates to a simple button click.
- Specifies different input against different controllers, for example, if the user uses a Meta Quest 2 device and squeezes the Grip button (returning a float value, rather than a boolean).
- Continues to suggest bindings against profiles for other devices.
If the runtime uses input values from suggested bindings, it tries to bind these values to the action so that bindings behave the same, or as close as possible, across devices. If this isn't feasible, the input values do not update the action state. If the app suggests bindings multiple times for the same interaction profile, the runtime keeps the last successful suggestion for that profile, as indicated by an XR_SUCCESS
return value.
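An app can check which interaction profile the runtime actually selected by calling xrGetCurrentInteractionProfile after its action sets are attached to the session. A minimal sketch, following the hello_xr conventions (m_instance, m_session, CHECK_XRCMD) used in the code below:

```cpp
XrInteractionProfileState profileState{XR_TYPE_INTERACTION_PROFILE_STATE};
XrPath leftHandPath;
CHECK_XRCMD(xrStringToPath(m_instance, "/user/hand/left", &leftHandPath));
CHECK_XRCMD(xrGetCurrentInteractionProfile(m_session, leftHandPath, &profileState));
// profileState.interactionProfile is XR_NULL_PATH if no profile is active;
// otherwise it can be converted back to a string with xrPathToString.
```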
Details on how this mechanism works in the hello_xr sample app follow.
The string forms of the action paths for the components to bind against must first be converted to XrPath
handles by calling the xrStringToPath
function.
std::array<XrPath, Side::COUNT> selectPath;
std::array<XrPath, Side::COUNT> squeezeValuePath;
...
CHECK_XRCMD(xrStringToPath(m_instance, "/user/hand/left/input/select/click", &selectPath[Side::LEFT]));
CHECK_XRCMD(xrStringToPath(m_instance, "/user/hand/right/input/select/click", &selectPath[Side::RIGHT]));
CHECK_XRCMD(xrStringToPath(m_instance, "/user/hand/left/input/squeeze/value", &squeezeValuePath[Side::LEFT]));
CHECK_XRCMD(xrStringToPath(m_instance, "/user/hand/right/input/squeeze/value", &squeezeValuePath[Side::RIGHT]));
...
For all path definitions that relate to the Meta Quest Touch Controller Profile, see the relevant section below.
The app suggests action bindings against the simple (KHR) controller profile by calling xrSuggestInteractionProfileBindings
. This default controller profile is expected to work on many devices that support OpenXR, yet it is still only a suggestion, or hint, to the runtime. For example, here are the suggested bindings for some actions.
// Suggest bindings for KHR Simple.
{
XrPath khrSimpleInteractionProfilePath;
CHECK_XRCMD(
xrStringToPath(m_instance, "/interaction_profiles/khr/simple_controller", &khrSimpleInteractionProfilePath));
std::vector<XrActionSuggestedBinding> bindings{ {m_input.grabAction, selectPath[Side::LEFT]},
{m_input.grabAction, selectPath[Side::RIGHT]},
{m_input.poseAction, posePath[Side::LEFT]},
{m_input.poseAction, posePath[Side::RIGHT]},
{m_input.quitAction, menuClickPath[Side::LEFT]},
{m_input.quitAction, menuClickPath[Side::RIGHT]},
{m_input.vibrateAction, hapticPath[Side::LEFT]},
{m_input.vibrateAction, hapticPath[Side::RIGHT]} };
XrInteractionProfileSuggestedBinding suggestedBindings{XR_TYPE_INTERACTION_PROFILE_SUGGESTED_BINDING};
suggestedBindings.interactionProfile = khrSimpleInteractionProfilePath;
suggestedBindings.suggestedBindings = bindings.data();
suggestedBindings.countSuggestedBindings = (uint32_t)bindings.size();
CHECK_XRCMD(xrSuggestInteractionProfileBindings(m_instance, &suggestedBindings));
}
The hello_xr app defines the namespace Side
(implementation-specific) as:
namespace Side {
const int LEFT = 0;
const int RIGHT = 1;
const int COUNT = 2;
} // namespace Side
m_input
is a struct of type InputState
that stores all action states (implementation-specific). The app initially defines it as:
struct InputState {
XrActionSet actionSet{XR_NULL_HANDLE};
XrAction grabAction{XR_NULL_HANDLE};
...
};
The app also suggests action bindings against the Meta Quest Touch controllers, which lets the runtime check whether the system can use them.
// Suggest bindings for the Meta Quest Touch.
{
XrPath oculusTouchInteractionProfilePath;
CHECK_XRCMD(
xrStringToPath(m_instance, "/interaction_profiles/oculus/touch_controller", &oculusTouchInteractionProfilePath));
std::vector<XrActionSuggestedBinding> bindings{ {m_input.grabAction, squeezeValuePath[Side::LEFT]},
{m_input.grabAction, squeezeValuePath[Side::RIGHT]},
{m_input.poseAction, posePath[Side::LEFT]},
{m_input.poseAction, posePath[Side::RIGHT]},
{m_input.quitAction, menuClickPath[Side::LEFT]},
{m_input.vibrateAction, hapticPath[Side::LEFT]},
{m_input.vibrateAction, hapticPath[Side::RIGHT]} };
XrInteractionProfileSuggestedBinding suggestedBindings{XR_TYPE_INTERACTION_PROFILE_SUGGESTED_BINDING};
suggestedBindings.interactionProfile = oculusTouchInteractionProfilePath;
suggestedBindings.suggestedBindings = bindings.data();
suggestedBindings.countSuggestedBindings = (uint32_t)bindings.size();
CHECK_XRCMD(xrSuggestInteractionProfileBindings(m_instance, &suggestedBindings));
}
This app is built for and tested with devices from several other vendors, so further suggested bindings follow in the same pattern.
The runtime can use any of the interaction profiles that have suggested bindings. In practice, if the app suggests bindings for an interaction profile that matches the user's exact hardware, the runtime almost always picks that profile.
The following code initially declares an action set (actionSet
) and an action (grabAction
), setting both to XR_NULL_HANDLE
; the handles are filled in later when creation succeeds.
XrActionSet actionSet{XR_NULL_HANDLE};
XrAction grabAction{XR_NULL_HANDLE};
Here is how the xrCreateActionSet
function creates the action set gameplay
in the hello_xr sample app.
// Create an action set.
{
XrActionSetCreateInfo actionSetInfo{XR_TYPE_ACTION_SET_CREATE_INFO};
strcpy_s(actionSetInfo.actionSetName, "gameplay");
strcpy_s(actionSetInfo.localizedActionSetName, "Gameplay");
actionSetInfo.priority = 0;
CHECK_XRCMD(xrCreateActionSet(m_instance, &actionSetInfo, &m_input.actionSet));
}
XrActionSetCreateInfo
is a struct containing information such as the name and localized name of the action set.
The localizedActionSetName
value might be shown to the user in a system rebinding menu, and actionSetName
might be stored by the system in a configuration file. This is why both values exist, although the hello_xr sample app doesn't use them.
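If an app defines more than one action set, the priority field decides which set takes precedence when two active sets bind the same input. A hedged sketch of a second, hypothetical menu action set (not part of hello_xr) with a higher priority than the gameplay set:

```cpp
XrActionSetCreateInfo menuSetInfo{XR_TYPE_ACTION_SET_CREATE_INFO};
strcpy_s(menuSetInfo.actionSetName, "menu");           // hypothetical set name
strcpy_s(menuSetInfo.localizedActionSetName, "Menu");
menuSetInfo.priority = 1;  // higher than the "gameplay" set's priority of 0
XrActionSet menuActionSet{XR_NULL_HANDLE};
CHECK_XRCMD(xrCreateActionSet(m_instance, &menuSetInfo, &menuActionSet));
```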
The app then retrieves the XrPath
for two hands.
// Get the XrPath for the left and right hands - we will use them as subaction paths.
CHECK_XRCMD(xrStringToPath(m_instance, "/user/hand/left", &m_input.handSubactionPath[Side::LEFT]));
CHECK_XRCMD(xrStringToPath(m_instance, "/user/hand/right", &m_input.handSubactionPath[Side::RIGHT]));
Action Creation and Action State
You must create every app-specific action. For example, the following code creates the grab_object
action for the left and right hands by calling the xrCreateAction
function. The call again refers to a specific action set. Notice that either the left or right hand can perform this action:
// Create an input action for grabbing objects with the left and right hands.
XrActionCreateInfo actionInfo{XR_TYPE_ACTION_CREATE_INFO};
actionInfo.actionType = XR_ACTION_TYPE_FLOAT_INPUT;
strcpy_s(actionInfo.actionName, "grab_object");
strcpy_s(actionInfo.localizedActionName, "Grab Object");
actionInfo.countSubactionPaths = uint32_t(m_input.handSubactionPath.size());
actionInfo.subactionPaths = m_input.handSubactionPath.data();
CHECK_XRCMD(xrCreateAction(m_input.actionSet, &actionInfo, &m_input.grabAction));
To receive the action state through XrActionStateGetInfo
, the app uses a loop:
for (auto hand : {Side::LEFT, Side::RIGHT}) {
XrActionStateGetInfo getInfo{XR_TYPE_ACTION_STATE_GET_INFO};
getInfo.action = m_input.grabAction;
getInfo.subactionPath = m_input.handSubactionPath[hand];
...
}
The XrActionStateGetInfo
struct supplies the action and subaction path when calling one of the xrGetActionState*
functions.
This is how the app receives the state as a float while polling the grabAction
state:
for (auto hand : {Side::LEFT, Side::RIGHT}) {
XrActionStateGetInfo getInfo{XR_TYPE_ACTION_STATE_GET_INFO};
getInfo.action = m_input.grabAction;
getInfo.subactionPath = m_input.handSubactionPath[hand];
XrActionStateFloat grabValue{XR_TYPE_ACTION_STATE_FLOAT};
CHECK_XRCMD(xrGetActionStateFloat(m_session, &getInfo, &grabValue));
...
}
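Boolean actions are polled the same way through xrGetActionStateBoolean. The following sketch mirrors how hello_xr handles its quitAction (created with XR_ACTION_TYPE_BOOLEAN_INPUT and no subaction paths), reacting only when the state changes to pressed:

```cpp
XrActionStateGetInfo getInfo{XR_TYPE_ACTION_STATE_GET_INFO};
getInfo.action = m_input.quitAction;
getInfo.subactionPath = XR_NULL_PATH;  // quitAction has no subaction paths
XrActionStateBoolean quitValue{XR_TYPE_ACTION_STATE_BOOLEAN};
CHECK_XRCMD(xrGetActionStateBoolean(m_session, &getInfo, &quitValue));
// React only to a fresh press, not a held button.
if (quitValue.isActive && quitValue.changedSinceLastSync && quitValue.currentState) {
    CHECK_XRCMD(xrRequestExitSession(m_session));
}
```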
Apps create
XrSpace
handles based on pose actions. To define the position and orientation of the space origin, apps can provide an
XrPosef struct, which represents a position and orientation within the natural reference frame of the action space.
Note: The app supplies a reference space every time it asks for the location of a space.
The first step is to create the space by calling the xrCreateActionSpace
function. The action sets, and with them the action spaces, are then attached to the session by calling the xrAttachSessionActionSets
function.
XrActionSpaceCreateInfo actionSpaceInfo{XR_TYPE_ACTION_SPACE_CREATE_INFO};
actionSpaceInfo.action = m_input.poseAction;
actionSpaceInfo.poseInActionSpace.orientation.w = 1.f;
actionSpaceInfo.subactionPath = m_input.handSubactionPath[Side::LEFT];
CHECK_XRCMD(xrCreateActionSpace(m_session, &actionSpaceInfo, &m_input.handSpace[Side::LEFT]));
actionSpaceInfo.subactionPath = m_input.handSubactionPath[Side::RIGHT];
CHECK_XRCMD(xrCreateActionSpace(m_session, &actionSpaceInfo, &m_input.handSpace[Side::RIGHT]));
XrSessionActionSetsAttachInfo attachInfo{XR_TYPE_SESSION_ACTION_SETS_ATTACH_INFO};
attachInfo.countActionSets = 1;
attachInfo.actionSets = &m_input.actionSet;
CHECK_XRCMD(xrAttachSessionActionSets(m_session, &attachInfo));
Important: Calling xrAttachSessionActionSets
is essential to bind the actions and action sets to the session, but it doesn't directly affect action spaces. However, you must call this function before you can get any input. Creating action spaces is still possible after calling xrAttachSessionActionSets
.
Action sets have the instance as their parent handle, while action spaces have the session as theirs. Because there is no direct link between action sets and spaces, the reference to the session is required.
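Each frame, the app can then locate a hand's action space within a reference space. A sketch, assuming m_appSpace is a previously created reference XrSpace and predictedDisplayTime comes from xrWaitFrame (both names follow hello_xr):

```cpp
XrSpaceLocation location{XR_TYPE_SPACE_LOCATION};
CHECK_XRCMD(xrLocateSpace(m_input.handSpace[Side::LEFT], m_appSpace,
                          predictedDisplayTime, &location));
// Use the pose only if both position and orientation are valid.
if ((location.locationFlags & XR_SPACE_LOCATION_POSITION_VALID_BIT) &&
    (location.locationFlags & XR_SPACE_LOCATION_ORIENTATION_VALID_BIT)) {
    const XrPosef& handPose = location.pose;  // render the left hand at this pose
}
```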
Note: Action sets can be turned on or off through the
xrSyncActions
function.
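Action state only updates for action sets marked active in the most recent sync, so apps typically call xrSyncActions once per frame. A sketch in the hello_xr style (XR_NULL_PATH makes the set active for all subaction paths):

```cpp
const XrActiveActionSet activeActionSet{m_input.actionSet, XR_NULL_PATH};
XrActionsSyncInfo syncInfo{XR_TYPE_ACTIONS_SYNC_INFO};
syncInfo.countActiveActionSets = 1;
syncInfo.activeActionSets = &activeActionSet;
CHECK_XRCMD(xrSyncActions(m_session, &syncInfo));
```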
The input and haptics interaction profile for the Meta Quest Touch controller (/interaction_profiles/oculus/touch_controller) follows.
Paths for both /user/hand/left
and /user/hand/right
:
.../input/squeeze/value
.../input/trigger/value
.../input/trigger/touch
.../input/thumbstick/x
.../input/thumbstick/y
.../input/thumbstick/click
.../input/thumbstick/touch
.../input/thumbrest/touch (not available on Quest 1 or Rift S; limited to Rift CV1 and Quest 2)
.../input/grip/pose
.../input/aim/pose
Hand specific:
| Only on /user/hand/left | Only on /user/hand/right |
| --- | --- |
| .../input/x/click | .../input/a/click |
| .../input/x/touch | .../input/a/touch |
| .../input/y/click | .../input/b/click |
| .../input/y/touch | .../input/b/touch |
| .../input/menu/click | .../input/system/click (might be unavailable for app use) |
Path for haptic output is .../output/haptic
.
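Firing the haptic output bound to that path can be sketched as in hello_xr, using xrApplyHapticFeedback with an XrHapticVibration (hand is the Side index of the controller to vibrate):

```cpp
XrHapticVibration vibration{XR_TYPE_HAPTIC_VIBRATION};
vibration.amplitude = 0.5f;
vibration.duration = XR_MIN_HAPTIC_DURATION;     // shortest pulse the device supports
vibration.frequency = XR_FREQUENCY_UNSPECIFIED;  // let the runtime pick a frequency
XrHapticActionInfo hapticActionInfo{XR_TYPE_HAPTIC_ACTION_INFO};
hapticActionInfo.action = m_input.vibrateAction;
hapticActionInfo.subactionPath = m_input.handSubactionPath[hand];
CHECK_XRCMD(xrApplyHapticFeedback(m_session, &hapticActionInfo,
                                  reinterpret_cast<XrHapticBaseHeader*>(&vibration)));
```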
This interaction profile represents the input sources and haptics on the Meta Quest Touch Pro controller. This is a superset of the existing Meta Quest Touch Controller Profile.
Path: /interaction_profiles/facebook/touch_controller_pro
Paths for both /user/hand/left and /user/hand/right:
.../input/squeeze/value
.../input/trigger/value
.../input/trigger/touch
.../input/thumbstick
.../input/thumbstick/x
.../input/thumbstick/y
.../input/thumbstick/click
.../input/thumbstick/touch
.../input/thumbrest/touch
.../input/grip/pose
.../input/aim/pose
.../output/haptic
Additional supported paths enabled by this profile for both /user/hand/left
and /user/hand/right
:
.../input/thumbrest/force
.../input/stylus_fb/force
.../input/trigger/curl_fb
.../input/trigger/slide_fb
.../input/trigger/proximity_fb
.../input/thumb_fb/proximity_fb
.../output/trigger_haptic_fb
.../output/thumb_haptic_fb
Hand specific:
| Only on /user/hand/left | Only on /user/hand/right |
| --- | --- |
| .../input/x/click | .../input/a/click |
| .../input/x/touch | .../input/a/touch |
| .../input/y/click | .../input/b/click |
| .../input/y/touch | .../input/b/touch |
| .../input/menu/click | .../input/system/click (might be unavailable for app use) |