Add Haptics
By the end of this guide, you’ll be able to:
- Initialize the Haptics Native SDK and integrate it with OpenXR
- Load and play a .haptic clip exported from Meta Haptics Studio
Initializing the SDK and integrating with OpenXR
The SDK uses OpenXR to trigger vibrations, which requires setting up, binding, and attaching an OpenXR action. Follow these steps:
- When you create the OpenXR instance with xrCreateInstance(), include the extensions needed by the SDK. You can get the names of the required extensions with haptics_sdk_get_openxr_extension_count() and haptics_sdk_get_openxr_extension().
- After creating the OpenXR instance, pass it to the SDK by calling haptics_sdk_set_openxr_instance().
- After creating the OpenXR session, pass it to the SDK by calling haptics_sdk_set_openxr_session().
- When you set up your OpenXR actions, pass an action set to the SDK by calling haptics_sdk_set_openxr_action_set(). The SDK adds a new action to it, which it uses as an output action for vibrations. You can either use a separate action set that is only used by the SDK, or use an action set that also contains other application actions.
- Bind the output action by including the SDK’s suggested bindings in your call to xrSuggestInteractionProfileBindings(). Use haptics_sdk_get_openxr_suggested_binding_count() and haptics_sdk_get_openxr_suggested_binding() to get the suggested bindings.
- Attach the action set to the OpenXR session with xrAttachSessionActionSets().
- To save power when the headset is idle, call haptics_sdk_set_openxr_session_state() with the updated state whenever the OpenXR session state changes.
When your application shuts down, you should release the resources used by the SDK by calling haptics_sdk_uninitialize().
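For example, a minimal shutdown might look like this (assuming the SDK was initialized earlier):
// At shutdown, release the resources held by the SDK
haptics_sdk_uninitialize();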
You can look at the example app to see how to initialize the SDK. See the file example_app/app/src/main/cpp/program.cpp, in particular the Program::setupOpenXr() function. In summary, your code should include the following (error handling is omitted for brevity):
// Step 1: Include the extensions used by the SDK in the call to
// xrCreateInstance()
std::vector<const char*> extensions;
int32_t extension_count = 0;
haptics_sdk_get_openxr_extension_count(&extension_count);
for (int32_t i = 0; i < extension_count; i++) {
    extensions.push_back(haptics_sdk_get_openxr_extension(i));
}
XrInstanceCreateInfo instanceCreateInfo = { [..] };
instanceCreateInfo.enabledExtensionCount = extensions.size();
instanceCreateInfo.enabledExtensionNames = extensions.data();
XrInstance instance = XR_NULL_HANDLE;
xrCreateInstance(&instanceCreateInfo, &instance);

// Step 2: Let the SDK know about the OpenXR instance
haptics_sdk_set_openxr_instance(instance);

// Step 3: Let the SDK know about the OpenXR session
XrSession session = XR_NULL_HANDLE;
xrCreateSession(instance, [..], &session);
haptics_sdk_set_openxr_session(session);

// Step 4: Pass the action set in which the SDK will create its
// action
XrActionSet actionSet = XR_NULL_HANDLE;
xrCreateActionSet(instance, [..], &actionSet);
haptics_sdk_set_openxr_action_set(actionSet);

// Step 5: Include the SDK's suggested bindings in the call to
// xrSuggestInteractionProfileBindings()
std::vector<XrActionSuggestedBinding> bindings;
[..]
int32_t binding_count = 0;
haptics_sdk_get_openxr_suggested_binding_count(&binding_count);
for (int32_t i = 0; i < binding_count; i++) {
    bindings.push_back(haptics_sdk_get_openxr_suggested_binding(i));
}
XrInteractionProfileSuggestedBinding suggestedBindings = { [..] };
suggestedBindings.suggestedBindings = bindings.data();
suggestedBindings.countSuggestedBindings = bindings.size();
xrSuggestInteractionProfileBindings(instance, &suggestedBindings);

// Step 6: Include the action set that you passed to the SDK in the
// call to xrAttachSessionActionSets()
XrSessionActionSetsAttachInfo attachInfo = { [..] };
attachInfo.countActionSets = 1;
attachInfo.actionSets = &actionSet;
xrAttachSessionActionSets(session, &attachInfo);

// In the main loop
while (1) {
    XrEventDataBuffer eventDataBuffer = { [..] };
    xrPollEvent(instance, &eventDataBuffer);
    switch (eventDataBuffer.type) {
        [..]
        case XR_TYPE_EVENT_DATA_SESSION_STATE_CHANGED: {
            auto sessionStateChangedEvent =
                (XrEventDataSessionStateChanged*)(&eventDataBuffer);
            // Step 7: Let the SDK know about OpenXR session state
            // changes
            haptics_sdk_set_openxr_session_state(
                sessionStateChangedEvent->state);
            break;
        }
    }
    [..]
}
To play a haptic clip, you need to load it first. Include the .haptic file that contains the haptic clip you want to load in your application’s package. Then read the file content into a char array and pass it to haptics_sdk_load_clip(). The returned ID identifies the clip. A clip can be used by multiple clip players; for example, one player can play it on the left controller while another player plays it on the right controller.
// myReadFile() is a placeholder for the API you are using to
// read file content
const char* hapticClipData = myReadFile("myClip.haptic");
int32_t clipId = HAPTICS_SDK_INVALID_ID;
haptics_sdk_load_clip(hapticClipData, strlen(hapticClipData),
    &clipId);
Once you no longer need the clip, use haptics_sdk_release_clip() to release the clip’s resources.
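For example, continuing the snippet above (clipId comes from haptics_sdk_load_clip()):
// Free the clip's resources once no player needs it anymore
haptics_sdk_release_clip(clipId);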
Once you have loaded a clip, you can use haptics_sdk_create_player() to create a clip player. Afterwards, assign a clip to the player with haptics_sdk_player_set_clip() and start playback on a specific controller (left, right, or both) with haptics_sdk_player_play().
int32_t playerId = HAPTICS_SDK_INVALID_ID;
haptics_sdk_create_player(&playerId);
haptics_sdk_player_set_clip(playerId, clipId);
haptics_sdk_player_play(playerId, HAPTICS_SDK_CONTROLLER_RIGHT);
You can either let the playback finish on its own, or call haptics_sdk_player_stop() to end playback early.
To swap out the clip used by a player, call haptics_sdk_player_set_clip() and assign a new clip to be played back. If a clip is currently playing, it will be stopped by this call.
Once you no longer need the player, use haptics_sdk_release_player() to release the player’s resources.
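Continuing the earlier player snippet, these calls could look as follows; otherClipId is a hypothetical second clip loaded the same way as clipId:
// Stop playback before the clip finishes on its own
haptics_sdk_player_stop(playerId);
// Assign a different clip to the player; a clip that is still
// playing would be stopped by this call
haptics_sdk_player_set_clip(playerId, otherClipId);
// Release the player's resources once it is no longer needed
haptics_sdk_release_player(playerId);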
Only one player can trigger vibrations on each controller at a given moment. If you have multiple players set to play on the same controller at the same time, then only the player with the highest priority will trigger vibrations. If the players have the same priority, the player started last will trigger vibrations. You can set a player’s priority by calling haptics_sdk_player_set_priority().
When a player starts playback, it will interrupt a currently active player of the same or lower priority. Once the player finishes playback, the player with the next highest priority will resume triggering vibrations.
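As a sketch, the calls might look like the following; ambientPlayerId, uiPlayerId, and the priority values are placeholders, and the exact value range and ordering of priorities are defined in the API reference rather than here:
// Hypothetical example: make sure a short UI confirmation effect
// takes precedence over an ambient rumble on the same controller.
// ambientPriority and uiPriority are placeholder values; consult the
// API reference for the valid range and which values rank higher.
haptics_sdk_player_set_priority(ambientPlayerId, ambientPriority);
haptics_sdk_player_set_priority(uiPlayerId, uiPriority);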
Use haptics_sdk_player_set_looping_enabled() to enable or disable looping of clip playback. You can call this before or during playback. Looped playback continues indefinitely until you call haptics_sdk_player_stop() or disable looping again.
haptics_sdk_player_set_looping_enabled(playerId, true /* enabled */);
You can change the strength or frequency of a vibration during playback by calling haptics_sdk_player_set_amplitude() and haptics_sdk_player_set_frequency_shift(). You can call these methods before and during playback. By calling these methods multiple times during playback, you can change the amplitude and frequency continuously over time.
haptics_sdk_player_play(playerId, HAPTICS_SDK_CONTROLLER_RIGHT);
haptics_sdk_player_set_amplitude(playerId, 0.5f);
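A frequency shift is applied the same way. As an illustration (the valid range for the shift value is documented in the API reference; a small negative value is assumed here to lower the perceived frequency):
// Shift the clip's frequency content slightly downwards during playback
haptics_sdk_player_set_frequency_shift(playerId, -0.25f);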
A ‘spatial effect’ in haptics creates the illusion that sound or tactile sensations originate from a specific location in three-dimensional space. This technique enhances the realism of virtual environments or multimedia experiences by making it appear as though the feedback is emanating from a distinct point relative to the user.
Adjust the amplitude of a haptic clip based on the distance between the player character and the object or vibration source: as the distance increases, decrease the amplitude, mimicking the natural falloff in intensity.
Example: Imagine a player character walking away from a vibration source, such as a car with its engine idling. As the character moves farther from the car, reduce the playback amplitude so the rumble gradually fades out.
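A minimal sketch of this idea, assuming a looping engine-rumble clip assigned to playerId and a hypothetical distanceToCar() helper provided by your game code:
// Called once per frame while the engine-rumble clip is looping.
// maxRumbleDistance and distanceToCar() are placeholders; use
// whatever distance model fits your application.
const float maxRumbleDistance = 10.0f;
float distance = distanceToCar();
// Linear falloff: full strength up close, silent at or beyond
// maxRumbleDistance
float amplitude = 1.0f - (distance / maxRumbleDistance);
if (amplitude < 0.0f) amplitude = 0.0f;
if (amplitude > 1.0f) amplitude = 1.0f;
haptics_sdk_player_set_amplitude(playerId, amplitude);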
‘Panning’ is a method used to distribute a signal across a stereo or multi-channel sound field. It is widely used in audio production to manage the perceived location of a sound source within the stereo field—be it left, right, or anywhere in between. In haptics, it refers to distributing a haptic effect between the two controllers.
Achieve panning effects by playing the same haptic effect on both controllers and independently modulating the amplitude to simulate movement across the stereo field.
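One possible sketch, assuming two players (leftPlayerId and rightPlayerId) with the same clip assigned, and a pan value in [-1, 1] where -1 is fully left and +1 is fully right; the HAPTICS_SDK_CONTROLLER_LEFT constant name mirrors the right-hand constant used earlier and is assumed here:
// Start the same effect on each controller, one player per side
haptics_sdk_player_play(leftPlayerId, HAPTICS_SDK_CONTROLLER_LEFT);
haptics_sdk_player_play(rightPlayerId, HAPTICS_SDK_CONTROLLER_RIGHT);
// Whenever the pan position changes, rebalance the per-side
// amplitudes. A simple linear pan law is used; an equal-power law
// would work as well.
haptics_sdk_player_set_amplitude(leftPlayerId, 0.5f * (1.0f - pan));
haptics_sdk_player_set_amplitude(rightPlayerId, 0.5f * (1.0f + pan));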
Introduce slight variations in amplitude and frequency modulation for repetitive haptic events like footsteps or gunshots to enhance realism.
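For example, a footstep effect could be retriggered with small random variations each time; the jitter ranges below are arbitrary, and the snippet uses the standard <random> header:
// Per-footstep variation: scale the amplitude and nudge the frequency
// so that repeated steps do not feel identical
static std::mt19937 rng{std::random_device{}()};
std::uniform_real_distribution<float> amplitudeJitter(0.8f, 1.0f);
std::uniform_real_distribution<float> frequencyJitter(-0.1f, 0.1f);
haptics_sdk_player_set_amplitude(playerId, amplitudeJitter(rng));
haptics_sdk_player_set_frequency_shift(playerId, frequencyJitter(rng));
haptics_sdk_player_play(playerId, HAPTICS_SDK_CONTROLLER_RIGHT);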
‘Generative haptics’ involves creating haptic effects that are dynamically generated in real time based on algorithms or user interactions. These haptic patterns are adaptable and evolve according to the context or environment, offering a more responsive and immersive experience.
For dynamic scenarios requiring high variation, such as a paintbrush moving across a canvas or a foot pressing a car’s accelerator, generate haptic effects by looping a haptic clip with fixed amplitude and frequency, then modulating playback in real-time based on user input.
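As an illustration of the accelerator example, assuming a looping engine clip assigned to enginePlayerId and a throttle value in [0, 1] coming from user input; the mapping below is an arbitrary choice:
// Start an endlessly looping base clip once
haptics_sdk_player_set_looping_enabled(enginePlayerId, true);
haptics_sdk_player_play(enginePlayerId, HAPTICS_SDK_CONTROLLER_RIGHT);
// Then, every frame, derive amplitude and frequency shift from the
// current throttle input so the effect follows the user's action
haptics_sdk_player_set_amplitude(enginePlayerId, 0.2f + 0.8f * throttle);
haptics_sdk_player_set_frequency_shift(enginePlayerId, throttle - 0.5f);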
Make sure to call haptics_sdk_set_openxr_session_state() whenever the OpenXR session state changes. When the session is not focused, for example because the user took off their headset, the SDK ceases processing on the haptics engine thread, reducing power consumption. All haptic playback pauses while the session is not focused, and playback resumes at the same position when the session becomes focused again.
Only start playing haptic clips when the OpenXR session state is XR_SESSION_STATE_FOCUSED. In all other states, no vibrations will be triggered, and starting a new playback will not cause a vibration.
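One way to follow this advice is to track the most recent session state in the event loop shown earlier and check it before starting playback; sessionState is a hypothetical variable maintained by your application:
// Updated in the XR_TYPE_EVENT_DATA_SESSION_STATE_CHANGED handler,
// next to the haptics_sdk_set_openxr_session_state() call
XrSessionState sessionState = XR_SESSION_STATE_UNKNOWN;
// Elsewhere, only trigger playback while the session is focused
if (sessionState == XR_SESSION_STATE_FOCUSED) {
    haptics_sdk_player_play(playerId, HAPTICS_SDK_CONTROLLER_RIGHT);
}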
You can call all SDK functions from any thread; it doesn’t need to be the thread that contains your main loop, and it doesn’t need to be the thread that contains your other OpenXR calls. If you call multiple functions from more than one thread at the same time, subsequent functions will block and wait until the first function has finished.
Internally, the SDK starts two separate threads. You can observe these threads in a debugger. The threads are:
- A high-priority thread named “HapticsEngine”, used for smooth playback. You should call haptics_sdk_set_openxr_session_state() whenever the OpenXR session state changes to ensure the haptics engine thread pauses processing when the OpenXR session is not focused.
- A low-priority thread named “HapticsBackground”, used for occasional background work.
Most functions return a HapticsSdkResult, and you can use HAPTICS_SDK_FAILED() to check whether the returned value is an error.
To get more information about the error, you can check the specific error code, or get an error message by calling haptics_sdk_error_message(). Additionally, the SDK logs the error message.
HapticsSdkResult result = haptics_sdk_load_clip([..]);
if (HAPTICS_SDK_FAILED(result)) {
const char* errorMessage = haptics_sdk_error_message();
// MY_LOG is a placeholder for your logging macro
MY_LOG("Loading clip failed: %s", errorMessage);
}
If you come across any issues or bugs, or if you believe a particular feature would benefit other users, reach out to us at haptics-feedback@meta.com. Feel free to attach screenshots or log files (see below), but please avoid including personal information or other information you’re not comfortable sharing with us.
On the headset, the Haptics Native SDK logs messages to the Android log using the HapticsSDK log tag. On Windows, the SDK logs to standard output. Reference these logs to get more information when you experience issues.
To gather logs from the headset, follow these steps:
- Increase the log buffer size. Without doing this, the logs will often only contain the last few seconds, and important log messages like the one about the Haptics SDK initialization will be missing:
adb logcat -G 16M
- Reproduce the problem. It is important to reproduce the problem right before gathering the logs, so that the messages related to the problem are still contained in the limited log buffer.
- Pull the logs and save them to a file:
adb logcat -d > logs.txt
- Verify that the logs contain information related to the Haptics SDK. The following should show some Haptics SDK log messages:
cat logs.txt | grep HapticsSDK
- Ideally, the logs go back to the initialization of the Haptics SDK, which logs a message similar to this:
05-26 14:08:03.444 9451 9662 I HapticsSDK: haptics_sdk_c_api_core::c_api::helpers: Haptics SDK version <version> initialized