Interaction SDK
With this v37 SDK release, we’re announcing the availability of Interaction SDK Experimental. The SDK makes it easy to build and integrate hand and controller interactions into your apps, through interaction components that are robust, modular and composable. Interaction SDK is more flexible than a traditional interaction framework—you can use just the pieces you need, and integrate them into your existing architecture. Of course, you can use it standalone too.
Some of the capabilities within Interaction SDK Experimental include:
Grab, Resize & Throw: Interact directly with virtual objects that can be resized, thrown, passed from hand to hand, and even summoned and grabbed at a distance. The grab capability also works with world-constrained objects like levers.
Hand Grab: Hands can be made to conform to virtual objects. Our tooling makes it easy to author these hand grab poses, which is often a labor-intensive effort.
Pose Detection: Easily build custom gestures using per-finger information like curl and flexion, and detect multiple gestures at once (see the first sketch after this list).
Direct Touch: Build virtual surfaces with buttons that feel great. The button poke mechanics are resilient to false and missed activations. Buttons also simulate haptics and physicality through “touch limiting”, which prevents your virtual hand from passing through buttons (sketched after this list). Our scrolling gesture allows you to easily create more sophisticated 2D UI, including on curved surfaces.
Targeting and Selection: Implement targeting and selection mechanics like those seen in the system UI, including on curved surfaces.
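To make the Pose Detection idea concrete, here is a minimal sketch of curl-based gesture detection in Unity. The `IHandSkeleton` interface and `FingerCurl` helper are hypothetical stand-ins for illustration only; the Interaction SDK ships its own per-finger feature components.

```csharp
using UnityEngine;

// Hypothetical hand-skeleton source, for illustration only; the Interaction
// SDK exposes per-finger features through its own components.
public interface IHandSkeleton
{
    // World-space positions of a finger's knuckle, middle joint, and tip.
    bool TryGetFingerJoints(int finger, out Vector3 knuckle, out Vector3 mid, out Vector3 tip);
}

public static class FingerCurl
{
    // Approximate curl as the bend angle at the middle joint:
    // near 0 degrees for a straight finger, growing as the finger curls.
    public static float CurlDegrees(IHandSkeleton hand, int finger)
    {
        if (!hand.TryGetFingerJoints(finger, out var knuckle, out var mid, out var tip))
            return 0f;

        Vector3 proximal = (mid - knuckle).normalized;
        Vector3 distal = (tip - mid).normalized;
        return Vector3.Angle(proximal, distal);
    }

    // Example custom gesture: a fist is "every finger past a curl threshold".
    public static bool IsFist(IHandSkeleton hand, float thresholdDegrees = 60f)
    {
        for (int finger = 0; finger < 5; finger++)
        {
            if (CurlDegrees(hand, finger) < thresholdDegrees)
                return false;
        }
        return true;
    }
}
```

Supporting multiple gestures then amounts to evaluating several such per-finger predicates each frame and acting on whichever matches.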
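And here is a rough sketch of the touch-limiting idea from the Direct Touch bullet: the rendered fingertip is pinned to the button's surface even when the tracked hand has pushed past it. `TouchLimitedFingertip` and its fields are illustrative names, not the SDK's actual API.

```csharp
using UnityEngine;

// Illustrative touch limiting: clamp the *visual* fingertip at the button
// face so the virtual hand never appears to pass through the button.
// These names are hypothetical; the Interaction SDK handles this internally.
public class TouchLimitedFingertip : MonoBehaviour
{
    [SerializeField] private Transform trackedTip;     // raw hand-tracking fingertip
    [SerializeField] private Transform buttonSurface;  // button face; forward = outward normal

    private void LateUpdate()
    {
        Vector3 tip = trackedTip.position;
        var surface = new Plane(buttonSurface.forward, buttonSurface.position);

        // Negative signed distance means the real fingertip is behind the
        // button face; in that case, render the tip on the surface instead.
        bool penetrating = surface.GetDistanceToPoint(tip) < 0f;
        transform.position = penetrating ? surface.ClosestPointOnPlane(tip) : tip;
    }
}
```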
Over the past few months, we’ve worked with several studios to integrate hands into their apps. Their feedback has helped us create an SDK that’s easy to use and that addresses many of the challenges of building hand interactions. Several of these studios have already integrated pre-release versions of the SDK. Here’s what a few of our partners have been doing:
Chess Club VR by Odders Lab is a multiplayer chess app. They use the Interaction SDK for their main grabbing mechanic:
“We are using the Interaction SDK to overhaul our chess piece grab gesture. The improvement to the grabbing experience is remarkable. Although the gesture detection looked a bit difficult to implement at the beginning, the speed and accuracy of the detection is outstanding.” -Odders Lab
ForeVR Darts by ForeVR Games is a multiplayer arcade darts experience. They use the Interaction SDK for several of their gestures, notably a custom locomotion pose built with Pose Detection.
“We leveraged the Interaction SDK for detecting hand poses in ForeVR Darts, which has unlocked the ability for players to not only play with their hands, but locomote and interact throughout the world and with other players. Our fan favorite fist bump and high five animations come alive, plus hand gestures power players’ ability to teleport, rotate, and interact with menus.” -ForeVR Games
Finger Guns by Miru Studio is a fun shooter experience where your hands double as guns. They use the Interaction SDK on several interactions, like picking up cards from the DocStation.
“The Interaction SDK has allowed us to spend less time fine-tuning hand interactions and deliver best-in-class mechanics out of the box. For us this has been particularly useful while implementing gestures like button presses and object grabbing.” -Miru Studio
With Interaction SDK, we hope to make it much easier for you to build great content and continue pushing the boundaries of VR interactions.
Tracked Keyboard
Today, we are releasing our Tracked Keyboard SDK. Our Tracked Keyboard technology allows users to bring physical keyboards with them into VR, offering a seamless typing experience. We use computer vision tracking to locate and render keyboards, Bluetooth to receive keyboard output (optional), and a combination of hand tracking and Passthrough to render hands on top of keyboards. We currently support two keyboard models: the Logitech K830 and the Apple Magic Keyboard, with more models coming soon. The SDK allows you to easily integrate this capability into Unity and Native apps.
The keyboard models provided by the API can be integrated into your app scene, allowing you to build rich immersive typing experiences, similar to the keyboard integration seen in Workrooms.
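As a rough illustration of what consuming this capability can look like in Unity, the sketch below re-anchors a rendered keyboard model to the physical keyboard's tracked pose each frame. `IKeyboardTracker` and `TrackedKeyboardView` are hypothetical names for illustration; the SDK provides its own components for this.

```csharp
using UnityEngine;

// Hypothetical tracking source, for illustration; the Tracked Keyboard SDK
// wraps the computer-vision tracking in its own Unity components.
public interface IKeyboardTracker
{
    // Latest pose of the physical keyboard in tracking space, if visible.
    bool TryGetKeyboardPose(out Pose pose);
}

public class TrackedKeyboardView : MonoBehaviour
{
    [SerializeField] private Transform keyboardModel; // rendered keyboard mesh
    private IKeyboardTracker tracker;                 // assigned by your tracking integration

    private void Update()
    {
        // Re-anchor the rendered model every frame so the virtual keys line
        // up with the physical keys the user's hands are actually touching.
        if (tracker != null && tracker.TryGetKeyboardPose(out Pose pose))
        {
            keyboardModel.SetPositionAndRotation(pose.position, pose.rotation);
            keyboardModel.gameObject.SetActive(true);
        }
        else
        {
            // Hide the model when tracking is lost rather than leaving a
            // stale keyboard floating in the scene.
            keyboardModel.gameObject.SetActive(false);
        }
    }
}
```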
We’ve been working with a handful of teams over the past few months on getting this API ready for prime time, like vSpatial, who have been adding keyboards to their workspace productivity app:
“When it comes to being truly productive, the keyboard is king. To finally bring the natural feeling of typing on a keyboard to a VR headset that doesn't require calibrating and shows the detail of all the keys and your hands is exactly what users need to be productive. vSpatial users can now just bring a headset and a keyboard with them to another location and be able to work like they're using a huge multi-monitor setup without the need to lug around their monitors or even a computer. We're extremely excited about the path forward on this as it means that third party developers won't need to create their own keyboard models and deploy their own solutions.” -Daniel Platt, vSpatial Co-Founder & Director of Product
This new SDK is an important step toward bringing a broad range of new use cases to VR, and we can’t wait to see all the amazing things you build with this new capability!
More Presence Platform capabilities coming soon
We’re excited to see the mixed reality experiences you build with Presence Platform for Quest devices. With these new capabilities, and even more coming soon, we can begin to explore what the metaverse might look like, and we’re committed to helping you create the connected, interoperable worlds that lie ahead.