The Interaction SDK is used to add specific interactions, such as pokes and grabs, to Meta Quest inputs like controllers and hands.
In Discover, the AppInteractionController script is added to each user’s camera rig. This class initializes interactors for Grab, Pinch, and Release from the Interaction SDK.
In DroneRage, the Interaction SDK is used in two major areas: the laser gun trigger pull and menus.
Grabbing the gun
The guns are grabbable objects that are oriented into the hands by the Interaction SDK.
As a design consideration for hand tracking, the guns could have been attached to the user’s hands automatically. However, this would mean that trying to pull the trigger would open a menu instead of firing a shot.
To work around this and get the trigger working, a currentGrabInteractor is defined based on whether the user is using hands or not. This logic lives in the WeaponInputHandler class in /Assets/Discover/DroneRage/Scripts/Weapons/WeaponInputHandler.cs.
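The exact implementation is in that file; as a minimal sketch of the idea (the m_handGrabInteractor and m_controllerGrabInteractor fields below are assumptions for illustration, not the actual Discover fields), switching the interactor could look like this:

// Illustrative sketch only: one grab interactor driven by tracked hands and one driven by
// controllers, with m_currentGrabInteractor pointing at whichever matches the input mode.
private HandGrabInteractor m_handGrabInteractor;       // assumed field
private HandGrabInteractor m_controllerGrabInteractor; // assumed field
private HandGrabInteractor m_currentGrabInteractor;
private bool m_usingHands;

private void UpdateInteractor(bool useHands)
{
    m_currentGrabInteractor = useHands ? m_handGrabInteractor : m_controllerGrabInteractor;
    m_usingHands = useHands;
}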
Calling the Interaction SDK’s ForceSelect() function on the current grab interactor makes the player automatically grab the object.
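As a hedged sketch of that call (the exact arguments in WeaponInputHandler.cs may differ):

// Sketch: if nothing is grabbed yet, force the current interactor to select the weapon.
if (!m_currentGrabInteractor.HasSelectedInteractable)
{
    m_currentGrabInteractor.ForceSelect(m_weapon.HandGrabInteractable);
}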
Then, in Update(), it checks every frame whether the user is using hands:
if (useHands != m_usingHands || activeStateChanged)
{
    UpdateInteractor(useHands);
}
If the weapon is not grabbed, the player automatically grabs it.
Pulling the gun’s trigger
For the user to pull the weapon trigger, the HandGrabAPI on the interactor is queried in Update() to capture the FingerPalmStrength of the index finger.
var triggerStrength = m_currentGrabInteractor.HandGrabApi.GetFingerPalmStrength(HandFinger.Index);
If the trigger strength is above a set threshold, the trigger is considered held and the gun fires: holding the trigger down fires continuously, while a quick press and release fires only once. This applies to both the machine gun and the pistol the player may be holding.
Because this goes through the HandGrabAPI, it works for both hand grabs and controller grabs, through different interactors.
if (m_weapon.HandGrabInteractable != null && m_currentGrabInteractor != null &&
    m_currentGrabInteractor.HasSelectedInteractable)
{
    var triggerStrength = m_currentGrabInteractor.HandGrabApi.GetFingerPalmStrength(HandFinger.Index);
    isTriggerHeldThisFrame = triggerStrength > m_triggerStrengthThreshold;
}
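The firing behavior itself lives in the weapon classes; as a rough sketch of how the held state can drive firing (TryFire and its cooldown are assumptions, not the actual Discover API):

// Sketch: the weapon is asked to fire every frame the trigger is held and applies its own
// fire-rate cooldown, so holding fires continuously and a quick press-and-release fires once.
if (isTriggerHeldThisFrame)
{
    m_weapon.TryFire(); // assumed method: fires only if the weapon's cooldown has elapsed
}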
UI and Menus
The UI and menus use direct touch and RayInteractors from the Interaction SDK.
The RayInteractor is set up on the left and right controllers or hands in the AppInteractionController class, defined in /Assets/Discover/Scripts/AppInteractionController.cs, where the interactors are initialized and the controller meshes are loaded.
The IconController class, defined in /Assets/Discover/Scripts/Icons/IconController.cs, manages icon highlighting and haptics when hovering over or selecting icons, based on whether the ray is colliding with those icons.
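As a rough illustration of this pattern (not the actual IconController code; the component and field names below are assumptions), an icon can react to the Interaction SDK’s interactable state changes to drive highlighting and haptics:

using Oculus.Interaction;
using UnityEngine;

// Illustrative sketch only: toggles a highlight when a ray hovers over or selects the icon.
public class IconHoverHighlight : MonoBehaviour
{
    [SerializeField] private RayInteractable m_interactable; // interactable on the icon
    [SerializeField] private GameObject m_highlight;         // highlight visual to toggle

    private void OnEnable() => m_interactable.WhenStateChanged += OnStateChanged;
    private void OnDisable() => m_interactable.WhenStateChanged -= OnStateChanged;

    private void OnStateChanged(InteractableStateChangeArgs args)
    {
        // Highlight while the ray hovers over or selects the icon.
        bool active = args.NewState == InteractableState.Hover ||
                      args.NewState == InteractableState.Select;
        m_highlight.SetActive(active);

        // Haptic feedback on hover or select would be triggered here;
        // Discover handles this in IconController.
    }
}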
For debugging purposes, the surfaces of all Scene objects, such as walls and desks, have Ray interactables, so a cursor appears when the input moves over them. This can be used to verify that the app knows where the real-life wall is.
App placement flow
When the user selects an app from the menu, the app must be placed in the player’s space. The AppIconPlacementController class, defined in /Assets/Discover/Scripts/AppIconPlacementController.cs, controls the placement flow.
App placement is managed here through the StartPlacement function.
This uses a RayInteractor to create a ray with a filter applied. It checks the Scene element to determine what type of surface the input is moving over.
For example, certain apps have to be placed on a desk. This filtering is disabled in DroneRage, but it is useful when you want to query what type of object a surface is.
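As a simplified sketch of that kind of filter (SceneElementTag and its Label are hypothetical stand-ins for however the project labels Scene surfaces, and this uses a plain Physics.Raycast rather than the actual RayInteractor filter):

using UnityEngine;

// Hypothetical component marking a Scene surface with its semantic label (e.g. "DESK", "WALL_FACE").
public class SceneElementTag : MonoBehaviour
{
    public string Label;
}

public static class PlacementFilter
{
    // Returns true when the pointer ray lands on a surface with the required label,
    // e.g. desk-only apps pass requiredLabel = "DESK".
    public static bool CanPlace(Ray pointerRay, string requiredLabel, float maxDistance = 10f)
    {
        if (!Physics.Raycast(pointerRay, out RaycastHit hit, maxDistance))
        {
            return false;
        }

        var element = hit.collider.GetComponentInParent<SceneElementTag>();
        return element != null && element.Label == requiredLabel;
    }
}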
Locomotion
Locomotion for both controllers and hands has been scaled back because mixed reality apps take place within the user’s physical space, unlike larger virtual reality scenes where there is more room to move around.
Discover uses standard controller locomotion, where the joystick lets the user move around and teleport. For hands, locomotion uses a gesture built from the Interaction SDK’s built-in prefabs.