After completing this section, the developer should:
Understand how to set up their Unreal project to use the Movement SDK, including the use of Link.
Understand how to set up permissions for eye tracking, face tracking, or body tracking.
To set up a Movement SDK feature (body, face, or eye tracking) in your Unreal project for Meta Quest, you must first properly configure your project for Unreal development.
Unreal project setup
Prerequisites
To use Movement SDK for Unreal, the following are required:
A Meta Quest Pro headset for eye tracking and visual-based face tracking.
A Meta Quest 2, Meta Quest 3, or Meta Quest Pro headset for body tracking and audio-based face tracking.
Horizon OS v60.0 or higher.
Unreal Engine 5.0 or newer installed (Unreal Engine 5.4 recommended).
An installed version of the Meta XR plugin for Unreal.
If you are not familiar with setting up an Unreal project that builds and runs on your headset:
Follow the Creating Your First Meta Quest VR App in Unreal Engine tutorial. It walks you through creating, configuring, and building a basic Unreal project that runs on your headset from scratch. After establishing the project, confirm the following:
The Meta XR plugin for Unreal is installed.
The Meta XR Project Setup Tool has been run and all required and recommended rules have been applied.
Note: The OpenXR, OpenXREyeTracker, OpenXRHandTracking, OpenXRMsftHandInteraction, and OpenXRViveTracker plugins are not compatible with Movement SDK and should be disabled. The Meta XR plugin provides the OpenXR compatibility.
OculusXRMovement plugin
Finally, you must install the OculusXRMovement plugin, which provides support for the tracking services. You can read more about the details of this plugin in the OculusXRMovement Plugin Reference. Currently, the OculusXRMovement plugin is distributed as source, so you must recompile your project to include it. Follow these steps:
Locate the OculusXRMovement plugin within Unreal-Movement/Plugins/ and copy it over to your own project’s /Plugins/ folder.
Ensure that your Unreal project is a C++ project. You can convert a Blueprint project by adding a new C++ class via Tools > New C++ Class (a minimal example of such a class follows these steps).
Recompile your project with a C++ IDE (such as Visual Studio).
Rebuild and open your project.
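For reference, any C++ class is enough for Unreal to generate the C++ project files for a Blueprint-only project. The sketch below shows the kind of class Tools > New C++ Class produces; the class and file names are illustrative, and the editor-generated version will also prefix the class with your project's module export macro (for example MYPROJECT_API).

```cpp
// MovementSetupActor.h -- illustrative name; any class created via Tools > New C++ Class works.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "MovementSetupActor.generated.h"

// Empty Actor subclass: its only purpose is to make this a C++ project
// so the OculusXRMovement plugin can be compiled into it.
UCLASS()
class AMovementSetupActor : public AActor
{
	GENERATED_BODY()
};
```

The matching MovementSetupActor.cpp only needs to include the header.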
Movement SDK configuration
Step 1: Enable Tracking Features
To use the tracking features of the Meta XR plugin, you must enable them in the project settings.
Go to Edit > Project Settings.
Scroll down to the Plugins header and select the Meta XR subheading.
Under Mobile, scroll down and enable the tracking features you want to use:
(Screenshot: Meta XR plugin settings with body, eye, and face tracking selected.)
Under General, make sure that XR API is set to Oculus OVRPlugin + OpenXR backend.
Step 2: Set Up Android Permissions
Unreal Engine exposes a Blueprint API to manage permissions (see the Unreal Engine documentation). Your application must be granted the following permissions to use the tracking features in Movement SDK:
Body tracking: com.oculus.permission.BODY_TRACKING
Face tracking: com.oculus.permission.FACE_TRACKING and android.permission.RECORD_AUDIO
Eye tracking: com.oculus.permission.EYE_TRACKING
In your GameMode (or GameInstance), create a String variable and name it Permissions (a C++ equivalent of these steps is sketched after this list).
Make the variable type an Array container by selecting the icon to the right of the variable type in the Details panel.
Compile and add the relevant permissions to the Permissions variable.
After the BeginPlay event, add a Request Android Permissions node to trigger the permission request.
Compile and Save the GameMode (or GameInstance).
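If you prefer to request the permissions from C++ rather than Blueprint, a minimal sketch follows. It assumes a hypothetical AMyGameMode class whose header declares a BeginPlay override, and that your module depends on the engine's AndroidPermission plugin module in its Build.cs. The permissions only need to be requested at runtime on Android, hence the platform guard; request only the permissions for the tracking features your app actually uses.

```cpp
// MyGameMode.cpp -- hypothetical GameMode that requests tracking permissions on startup.
#include "MyGameMode.h"

#if PLATFORM_ANDROID
#include "AndroidPermissionFunctionLibrary.h"
#endif

void AMyGameMode::BeginPlay()
{
	Super::BeginPlay();

#if PLATFORM_ANDROID
	// All Movement SDK tracking permissions are listed here for illustration;
	// keep only the ones your application needs.
	const TArray<FString> Permissions = {
		TEXT("com.oculus.permission.BODY_TRACKING"),
		TEXT("com.oculus.permission.FACE_TRACKING"),
		TEXT("android.permission.RECORD_AUDIO"),
		TEXT("com.oculus.permission.EYE_TRACKING")
	};

	// Only ask for permissions that have not already been granted.
	TArray<FString> PermissionsToRequest;
	for (const FString& Permission : Permissions)
	{
		if (!UAndroidPermissionFunctionLibrary::CheckPermission(Permission))
		{
			PermissionsToRequest.Add(Permission);
		}
	}

	if (PermissionsToRequest.Num() > 0)
	{
		// Shows the system permission dialog(s); the user's response arrives asynchronously.
		UAndroidPermissionFunctionLibrary::AcquirePermissions(PermissionsToRequest);
	}
#endif
}
```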
If you are using PCVR and want to use Link for play-in-editor:
Open your Link app and go to Settings > Beta.
Enable Developer Runtime Features.
Enable Eye Tracking over Meta Quest Link.
Enable Natural Facial Expressions over Meta Quest Link.