Haptics
Gain an overview of the essential tools for designing and integrating haptic feedback, along with an introduction to the fundamental principles of haptic design. Whether new to the field or looking to refine an approach, this documentation will provide the knowledge and resources needed to create engaging and effective tactile experiences.
- Meta Haptics Studio (Mac | Windows): a powerful tool for creators to effortlessly design and test high-quality haptic effects for immersive experiences, with no coding or programming expertise required.
- Meta Haptics SDK for Unity: documentation and resources for integrating haptic feedback into Unity projects.
- Meta Haptics SDK for Unreal: documentation and resources for integrating haptic feedback into Unreal projects.
Haptic feedback, also known as “haptics”, is a technology that simulates the sense of touch by applying forces, vibrations, or motions to the user’s skin through hardware devices.
It is commonly used in gaming controllers, mobile phones, and immersive experience devices to create more engaging and interactive experiences.
Haptics are a natural part of our everyday lives, and we experience them in various ways. For example, when driving a car or attending a music event, the sounds can be strong enough to make us feel vibrations on our skin. In other cases, we both hear and feel things, like when writing on paper or walking on gravel, which are known as audio-tactile events.
Human beings have five senses, but electronic devices and digital products mainly communicate with us using just two: sight and hearing.
Haptic feedback changes that by simulating the sense of touch to communicate with users, making human-device interactions easier to understand and creating more immersive, realistic, and magical experiences.
In immersive applications there are four main use cases for haptics:
People's attention
Use haptics for silent notifications: Different vibration patterns can signify different types of notifications, such as calls, messages, or app alerts.
Example:
Haptics help to draw immediate attention and convey urgency. The vibration can be adjusted in intensity, frequency, and rhythm to convey the urgency or severity of the notification or alert.
System feedback
Use haptic feedback for user input, such as gesture detection. Ensure that haptics complement visual and auditory feedback, and pay attention to synchronization between the three to create a seamless and immersive experience.
Example:
Creating tactile buttons: When using touch screens or virtual interfaces, haptics can be integrated to replicate the feeling of pressing physical buttons.
Accessibility
Haptics can compensate for a lack of visual or auditory feedback, making digital applications more accessible and engaging for users who are visually or hearing impaired, or when their attention is not focused on the screen.
Example:
Distinct vibrations can indicate different types of notifications or actions, helping users understand what's happening without relying on sight or hearing. With a phone in a pocket, we know whether it's a call, a message, or an alarm.
Enhance immersion
In virtual or augmented reality environments, haptics provide tactile feedback that enhances immersion and realism. We're so used to haptic feedback in our interactions that its absence can make the experience feel incomplete.
Example:
In immersive applications, haptic feedback becomes a tool for simulating the tactile experience of interacting with virtual objects like holding a cup or typing on a virtual keyboard, adding the sense of touch to the immersive environment.
Meta Haptics Studio (Mac | Windows) is a desktop application and a companion app on Meta Quest that allows creators and developers to design and audition haptic feedback. Designs can be exported as haptic clips and played in an app using the Meta Haptics SDK (Unity | Unreal | Native).
Key features and capabilities
- Design and audition: Easily design haptic clips and audition them in real time to ensure they match the desired outcome.
- Export and integration: Seamlessly export haptic designs and integrate them into immersive applications using the Meta Haptics SDK.
- Cross-device compatibility: Ensures that all haptic feedback is compatible across various Meta Quest devices, simplifying the development process.
- Import audio files into Meta Haptics Studio; audio-to-haptic analysis is performed automatically to generate haptics from the audio file.
- Design haptics using the visual editor by either changing the analysis parameters or manually editing the haptic clip(s).
- Audition clips in real time on a headset with the companion app on Meta Quest. Any design changes propagate in real time to the immersive app.
- Export the final clip(s) to a project folder.
- Integrate the .haptic files into an application using the Haptics SDK.
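As a toy illustration of the audio-to-haptic idea in the workflow above (this is not Meta Haptics Studio's actual analysis, which is far more sophisticated), one simple approach tracks the audio's loudness over time and uses it as a vibration-amplitude envelope:

```python
# Conceptual sketch: derive a haptic amplitude envelope from audio samples
# by taking the peak level of each short analysis window. Purely
# illustrative; window size and peak detection are arbitrary choices.

def amplitude_envelope(samples, sample_rate, window_ms=20):
    """Return one amplitude value (0..1) per analysis window."""
    window = max(1, int(sample_rate * window_ms / 1000))
    envelope = []
    for start in range(0, len(samples), window):
        chunk = samples[start:start + window]
        peak = max(abs(s) for s in chunk)  # loudest sample in this window
        envelope.append(min(1.0, peak))    # clamp to the actuator's 0..1 range
    return envelope

# Example: a short burst followed by silence
audio = [0.0, 0.8, -0.9, 0.4] * 50 + [0.0] * 200
env = amplitude_envelope(audio, sample_rate=1000, window_ms=20)
```

The burst windows produce high amplitude values and the silent tail produces zeros, which is roughly the shape a designer would then refine in the visual editor.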
Meta Haptics SDK provides a unified, high-level, media-based API for playing haptic clips authored in Haptics Studio on Meta Quest controllers. The SDK detects the controller at runtime and optimizes the haptic pattern, maximizing the controller's capabilities. This ensures haptic clips are both backward and forward compatible with Meta Quest devices.
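The runtime-adaptation idea can be sketched conceptually. The clip representation, actuator names, and reduction rule below are illustrative assumptions for explanation only, not the SDK's actual data model or algorithm:

```python
# Conceptual sketch of runtime adaptation: a clip is stored as
# device-independent (amplitude, frequency) points, and playback is
# reduced to what the detected actuator supports.

def adapt_clip(points, actuator):
    """points: list of (amplitude 0..1, frequency_hz) pairs."""
    if actuator == "wideband_vcm":       # e.g. Meta Quest Pro / Meta Quest 3
        return points                    # full amplitude + frequency control
    if actuator == "narrowband_lra":     # e.g. Meta Quest 2
        # An LRA vibrates at its fixed resonance: keep the amplitude
        # envelope, drop the per-point frequency information.
        return [(amp, None) for amp, _freq in points]
    raise ValueError(f"unknown actuator: {actuator}")

clip = [(0.2, 80.0), (0.9, 300.0), (0.5, 150.0)]
```

On a wideband device the clip plays as authored; on a narrowband device only its amplitude envelope survives, which is the essence of backward compatibility.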
Downloads and documentation
Designing the invisible: a creative approach to haptics
To design haptics, it’s essential to understand the capabilities and limitations of the underlying hardware.
Haptics rely heavily on the design and engineering of actuators—the core hardware components responsible for generating tactile feedback. Actuators convert electrical signals into vibrations or forces, allowing users to feel and interact with virtual environments or devices. The precision, responsiveness, and efficiency of these actuators are crucial in determining the quality and realism of the haptic experience.
Thanks to developments in actuator technology, haptics have evolved from simple vibration alerts to sophisticated, multi-dimensional feedback systems.
Let’s have a look at the most common haptic actuators and the evolution of haptics on Meta Quest devices:
VCM: voice coil motor
A voice coil motor (VCM) is a type of electromagnetic actuator that excels in providing precise and controlled tactile feedback, from gentle vibrations to sharp, defined impacts, thanks to its wide frequency range. As a result, VCMs can create haptic feedback that feels realistic and closely mimics the desired tactile sensation. VCMs are gradually replacing ERMs and LRAs in applications where high-quality feedback is desired, such as mixed reality devices, mobile phones, and gaming controllers.
LRA: linear resonant actuator
A linear resonant actuator (LRA) creates vibration around a single resonant frequency. Unlike ERMs, an LRA can change its amplitude independently from frequency, allowing for more dynamic effects. LRAs tend to be small and energy efficient. However, they typically don’t provide the vibration intensity that large ERMs can produce.
ERM: eccentric rotating mass actuator
An eccentric rotating mass (ERM) actuator is a simple and cost-effective technology used for creating simple vibrations and tactile sensations. When activated, the ERM causes an unbalanced mass (usually a small weight) to rotate around an eccentric (off-center) axis. This rotational motion generates vibrations that can be felt by the user. ERMs produce vibrations that are less controlled and nuanced compared to LRA or VCM. The technology may not be able to produce sharp impacts or convey complex tactile sensations.
Meta Quest 2 came with what is known as an LRA, or linear resonant actuator. LRAs can typically be found in devices like mobile phones and vibrate at a fixed frequency. This means only the amplitude, or strength, at which the motor vibrates can be controlled.
Meta Quest Pro and Meta Quest 3 introduce what are known as VCMs, or voice coil motors. VCMs are closer to speakers, in that they have wideband frequency control, usually up to 500 Hz. These new wideband actuators enable entirely new creative possibilities, moving from basic feedback, like a simple buzz turning on and off, to enhanced feedback, such as feeling the detents of a dial as it turns:
| Devices | Meta Quest 2 | Meta Quest Pro | Meta Quest 3 |
| --- | --- | --- | --- |
| Actuator | Narrowband LRA | Wideband VCM | Wideband VCM |
| Capabilities | Single frequency with amplitude control: simple signals | Frequency and amplitude control: sharp, precise clicks and complex signals | Frequency and amplitude control: sharp, precise clicks and complex signals |
| Use cases | Basic feedback: notifications, confirmations, event interactions | Basic and immersive feedback: navigation, character and environment interactions | Same as Meta Quest Pro with slightly lower output intensity |
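The difference between the two actuator classes can be sketched in code: a narrowband LRA is driven at a fixed resonant frequency with only a variable amplitude envelope, while a wideband VCM can vary amplitude and frequency together. This is a conceptual signal sketch, not actual driver code for any device:

```python
import math

def lra_signal(envelope, resonant_hz, sample_rate):
    """Narrowband LRA: the carrier frequency is fixed at resonance;
    only the amplitude envelope varies."""
    out = []
    for n, amp in enumerate(envelope):
        out.append(amp * math.sin(2 * math.pi * resonant_hz * n / sample_rate))
    return out

def vcm_signal(envelope, freqs_hz, sample_rate):
    """Wideband VCM: both amplitude AND frequency can change per sample.
    Phase is accumulated so frequency changes stay continuous."""
    out, phase = [], 0.0
    for amp, f in zip(envelope, freqs_hz):
        phase += 2 * math.pi * f / sample_rate
        out.append(amp * math.sin(phase))
    return out
```

For example, ramping the frequency list from 60 Hz up to 500 Hz produces a "rising" sensation on a VCM; an LRA can only ramp amplitude, which is why its effects are limited to simpler signals.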
Meta Haptics Studio provides hardware-agnostic haptic design. Custom haptic effects designed in Studio work best on Meta Quest Pro, Meta Quest 3 and up. The effects are backwards compatible with Meta Quest 2.
Whenever users interact with a system, they need confirmation that their actions have been successful. Did the system register that button press? Providing immediate and clear feedback prevents users from, for example, tapping the same button multiple times due to uncertainty.
Offering appropriate feedback is one of the most fundamental principles of user-interface design. When users can easily understand the system’s state, they feel in control and can trust that the system will respond as expected. This predictability is key to building trust in the interaction.
Feedback can be as simple as a button changing color after being clicked or a progress indicator appearing when a process takes time to complete. However, effective feedback goes beyond visual cues. To create a more holistic and engaging experience, it’s essential to stimulate different sensory modalities—sight, hearing, and touch.
Each type of sensory feedback has unique characteristics and serves different purposes:
- Visual feedback: Color changes, progress bars, and animations provide instant visual confirmation of user actions.
- Auditory feedback: Sounds and beeps can alert users to important events or confirm actions without needing to look at the screen.
- Haptic feedback: Vibrations and tactile responses can simulate the sensation of pressing a physical button or notify users through touch. This is especially important when the user's visual and auditory attention is elsewhere.
By integrating these sensory feedback mechanisms, we can design interfaces that not only function predictably but also feel intuitive and responsive, enhancing the overall user experience.
Key characteristics of haptic feedback
Before designing haptics, let’s consider a few key unique characteristics of haptic feedback.
Intimate interactions
Haptic feedback relies on the sense of touch, a deeply personal and intimate experience. It adds an emotional layer to interactions, as we inherently trust our sense of touch.
Limited information transferral
Compared to visuals and sound, haptics are constrained in how much information they can convey. Overloading the sense of touch with excessive information can negatively impact the user experience.
Short haptic memory
Our haptic memory is shorter than audio or visual memory. For haptic designers, A/B testing haptics is more difficult than comparing audio or visuals. For end-user experiences, we aim for recognition rather than recall to minimize the user’s memory load.
Less is more
Haptics add depth to any experience, but should not take center stage unless the overall intent of the experience is to focus on haptics. Some of the most effective haptics are the ones we only notice when they are missing.
Play in sync
Haptics work best in a multimodal context where they add an accent to an audio/visual experience. As designers, we need to consider synchronization between audio, visual, and haptic cues so that users perceive them as one event.
Below is a list of principles or recommended practices when designing haptic feedback:
Relate to action | Use haptic effects consistently and in a way that reinforces a clear causal relationship between the haptic and the action that causes it. Haptic feedback enhances user confidence in virtual interactions and keeps them informed about system status. Feedback should be tied directly to user actions and engage immediate attention, such as error prevention or notifications. It may also aid in providing guidance and orientation within a virtual space. |
Holistic feedback system | Use haptics in ways that complement other feedback in an app, such as visual and auditory feedback. Pay attention to synchronization between audio, video and haptics. |
Haptic balance | Avoid overusing haptics and instead aim for a balance that most people will appreciate. Carefully prioritize haptics in fast-paced action games and avoid long, overlapping haptic effects. Use whitespace: haptics are more impactful after a short pause that lets the skin rest. Create pauses and set accents to highlight small moments. |
Give users a choice | Make haptics optional and adjustable. Users should have the choice to mute haptics, and the app experience should remain enjoyable without them. Allow users to customize the intensity of haptic feedback, recognizing individual user preferences and sensitivity to touch. |
Dynamic effects | Design custom haptic effects that vary dynamically based on user input or audio-visual context. Softer sounds should correlate to softer haptics. |
Consistency and standards | Ensure consistency in haptic interactions across an application to facilitate user learning and association of haptic patterns with specific experiences. Focus on delivering the right amount of feedback at the right time, using easily recognizable and distinguishable sequences. |
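The "dynamic effects" and "give users a choice" principles can be combined in a small sketch. The function name, value ranges, and perceptibility threshold below are illustrative assumptions, not part of any SDK:

```python
# Sketch: derive haptic strength from the audio-visual context (here, a
# loudness value from the audio engine) and scale it by a user-adjustable
# intensity setting. Softer sounds produce softer haptics, and the user
# can dial haptics down or mute them entirely.

def haptic_amplitude(loudness, user_intensity=1.0, floor=0.05):
    """loudness: 0..1 from the audio engine; user_intensity: 0..1 setting."""
    if user_intensity == 0.0:           # haptics muted by the user
        return 0.0
    amp = loudness * user_intensity     # softer sounds -> softer haptics
    # Drop imperceptible ticks instead of playing haptic "noise",
    # and clamp to the actuator's valid range.
    return 0.0 if amp < floor else min(1.0, amp)
```

A quiet footstep at 50% user intensity might land below the floor and play nothing at all, which is often preferable to a barely perceptible buzz.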
A structured process for designing haptic experiences
Haptic design is a relatively new discipline that often encounters challenges such as being given low priority in project planning, treated as an afterthought in the design process, or overlooked due to creators’ unfamiliarity with the necessary tools. However, many digital products rely on haptics, making it essential for professionals across various fields—such as product design, prototyping, sound design, and game design—to consider haptics as a core component of their work. In this section, we offer a methodical approach and templates to help guide haptic design projects, ensuring that haptics are thoughtfully integrated from the start.
A haptic design journey may fall into one of two categories:
1. Introducing haptics: This pertains to projects starting from scratch, without any prior haptic elements.
2. Upgrading haptics: This involves enhancing or replacing primitive haptic feedback in existing projects.
Each category carries its unique challenges. While existing projects with primitive haptics may set some parameters, new projects offer greater creative freedom. Nevertheless, the process outlined here is applicable to both project types.
This phase sets the stage for informed decision making throughout the design journey.
1. Identify user needs | Start by pinpointing the user's genuine haptic requirements. Look for opportunities where haptics can compensate for limited or absent sensory cues. An example of user needs in an immersive fitness application: users want to feel powerful, and they want feedback on their form. |
2. Define design intent | Clearly define the areas of an experience that require haptic feedback and create a hierarchy around haptics. Avoid diving into the production of haptics without a well-defined purpose. |
3. Internalize interaction patterns | Get a clear understanding of the interaction patterns and feedback mechanisms. Test prototypes early to get a sense for the interactions in context. |
4. Design for multisensory experiences | Consider haptics as part of a broader feedback system. Haptic feedback is most effective when integrated with other sensory inputs. Multisensory experiences enhance user reactions, task completion, and learning. Ensure that haptic feedback complements visual and auditory cues, creating a cohesive and natural user experience. |
5. Consider hardware | Understand the device being designed for. Recognize its capabilities and limitations. |
6. Define design approach | When designing, start from physical world metaphors and expected behaviors. However, instead of replicating the exact sensations of the physical world, imagine metaphors to guide a design. How would actions and interactions unfold in the physical environment? How do elements interact with a user and their surroundings? Even in familiar experiences, don’t be constrained by physical limitations; create magical, unreal sensations. The decision whether to design haptics for expanding perception or building realism is tightly coupled with the overarching goal of the game’s user experience (UX) and the decisions made by the audio and visual teams. It is crucial to closely coordinate with a team to ensure that the haptic design direction aligns harmoniously with the overall design elements of the app.
Expanding perception Extend the natural sense of touch and use haptics to create digital illusions. This approach involves crafting unique sensations and emotions that go beyond the boundaries of the physical world.
Building realism Replicate the tactile sensations of the physical reality, bringing familiarity to immersive applications or games, that feel as close to the physical environment as possible. |
7. Consider body positioning | Align haptic interactions with the body's natural models. Map specific types of haptic information to corresponding areas of the body. Remember that haptic feedback may generate audible sounds, especially when the device is in contact with a surface. |
Our GitHub repository offers a range of example projects that demonstrate how to apply the concepts we've discussed. Additionally, utilize our design templates to effectively structure and deliver a haptic design project.