Adding haptics to your VR game or app engages people’s sense of touch, bringing the depth and familiarity of the physical world and enabling truly natural interactions with nuanced notifications, realistic clicks, and lifelike textures. Historically, it’s been challenging for sound designers and developers to create haptic experiences for virtual, augmented, and mixed reality apps due to the lack of proper tools—until now.
Today, we’re excited to announce the experimental release of Meta Haptics Studio and Haptics SDK for Unity—two new tools enabling you to quickly design, test, and integrate best-in-class haptic experiences into VR games and apps. The Haptics SDK was built for VR developers, helping you solve common challenges when integrating high-quality haptics into your experiences, while Haptics Studio, a desktop application for Mac and Windows (with an accompanying VR app), allows sound designers and developers to create and audition haptics for Meta Quest.
Dive in below to learn about the key features of Haptics Studio and how you can get started integrating high-quality haptics into your VR experiences.
Fast, High-Quality Results
Both Haptics Studio and the Haptics SDK for Unity have been designed for quick results, without compromising on quality. To get started, explore the haptic library or learn the basics of haptic design with the built-in interactive tutorial. Next, convert your own audio instantly into haptic feedback. You can also bulk import audio files to work across multiple haptic clips at once and use creative analysis parameters to quickly iterate on different variations. Once finished, use bulk export to save all of your haptic files for integration.
Design and audition haptics easily with Haptics Studio
Reduce Iteration Time with Instant Testing
Haptics Studio can help you reduce iteration time by allowing you to immediately feel your edits on your controllers—there is no need to code or build your app in order to test haptics. All updates and changes to haptic clips made in Haptics Studio are synchronized over Wi-Fi with the companion VR app, so you can test your haptics without donning and doffing your headset with every change. Feel a response immediately through vibration on the skin that replicates immersive touch sensations, like clicking real buttons or running your fingers over textures.
Author Once for Current and Future Meta Quest Devices
Haptic experiences exported from Haptics Studio are saved in a hardware-agnostic file format (.haptic). When a clip is played back on any current (Meta Quest 2 or Meta Quest Pro) or future Meta Quest controller, the haptics are automatically adapted to the capabilities of that hardware. As a result, haptics never need to be redesigned or reimplemented for new devices, saving you time during development and helping ensure your users get the best experience on their Meta Quest devices.
You can export your haptic designs from Haptics Studio and use them with the Haptics SDK for Unity
Haptics SDK for Unity
Haptics SDK for Unity was designed specifically to solve the challenge of integrating haptics into Unity projects by providing a high-level, flexible, media-like API. Your Unity events can easily trigger haptics via the SDK APIs, such as Play, Stop, and Loop. Haptic playback can also be modulated in real time—for example, using amplitude control to dynamically change the intensity of a haptic clip. This allows you to easily create variations on haptics or adjust their intensity based on changing game context, such as a player’s distance to an object. No matter what level of experience you have in integrating haptics, the Haptics SDK will help you quickly create a sense of immersion and presence in your experiences.
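As a rough illustration of this media-like workflow, the sketch below shows how a clip exported from Haptics Studio might be played and modulated from a Unity script. It is a minimal example, not a definitive integration: the exact namespace, class, and member names (`Oculus.Haptics`, `HapticClip`, `HapticClipPlayer`, `Controller.Right`, `amplitude`) are assumptions that may differ in your SDK version, so check the SDK reference for the current API.

```csharp
using UnityEngine;
using Oculus.Haptics; // Haptics SDK for Unity namespace (assumed)

public class HapticClick : MonoBehaviour
{
    // A .haptic clip exported from Haptics Studio, assigned in the Inspector.
    [SerializeField] private HapticClip clickClip;

    private HapticClipPlayer _player;

    void Start()
    {
        // Wrap the clip in a player, the media-like object that exposes
        // Play/Stop/Loop-style controls and real-time modulation.
        _player = new HapticClipPlayer(clickClip);
    }

    // Hook this up to a Unity event, e.g. a button press.
    public void OnButtonPressed()
    {
        _player.Play(Controller.Right); // play on the right controller
    }

    // Example of real-time modulation: scale intensity with distance,
    // so haptics feel stronger as the player approaches an object.
    public void UpdateIntensity(float distanceToObject)
    {
        // Hypothetical falloff over 10 meters; amplitude is clamped to [0, 1].
        _player.amplitude = Mathf.Clamp01(1f - distanceToObject / 10f);
    }

    void OnDestroy()
    {
        // Release the player's native resources when this object goes away.
        _player?.Dispose();
    }
}
```

The point of the design is that a single `HapticClipPlayer` behaves like an audio source: you trigger it from gameplay events and tweak its parameters every frame, rather than hand-coding vibration patterns per device.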
Get Started
To get started, check out our documentation: