Avatars are how users represent themselves in the digital world. In fully immersive experiences like Virtual Reality (VR) and Mixed Reality (MR), avatars also represent a user’s body and their movements. Follow these best practices to display a user’s avatar and associated movements comfortably and believably in immersive environments.
A user’s avatar usually reflects their position, movement, and gestures. Users can view their virtual bodies and see how others perceive and interact with them.
Avatars can give users a strong sense of scale and of their body’s volume in the virtual world. For a comfortable experience, present the virtual avatar with realistic proportions relative to the scene.
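For illustration, here is a minimal sketch of fitting an avatar’s proportions to the user. The names (`AvatarRig`, `fitAvatarToUser`, `DEFAULT_EYE_HEIGHT_M`) and the fallback value are assumptions for this example, not part of any specific SDK; the idea is simply to scale the rig so its eyes sit at the user’s measured eye height.

```typescript
// Minimal sketch: scale an avatar rig so its proportions match the user's
// measured eye height. AvatarRig and DEFAULT_EYE_HEIGHT_M are illustrative
// names, not any specific engine or SDK API.

interface AvatarRig {
  /** Eye height of the unscaled avatar model, in meters. */
  modelEyeHeightM: number;
  /** Uniform scale applied to the whole rig. */
  scale: number;
}

/**
 * Compute a uniform scale so the avatar's eyes sit at the user's measured
 * eye height. A fallback default is used if tracking reports no value yet.
 */
function fitAvatarToUser(rig: AvatarRig, measuredEyeHeightM?: number): void {
  const DEFAULT_EYE_HEIGHT_M = 1.6; // assumed fallback; tune per application
  const target = measuredEyeHeightM && measuredEyeHeightM > 0
    ? measuredEyeHeightM
    : DEFAULT_EYE_HEIGHT_M;
  rig.scale = target / rig.modelEyeHeightM;
}

// Usage: a model with 1.68 m eye height fitted to a 1.55 m measured eye height.
const rig: AvatarRig = { modelEyeHeightM: 1.68, scale: 1 };
fitAvatarToUser(rig, 1.55);
console.log(rig.scale.toFixed(3)); // ~0.923 — the avatar shrinks to match the user
```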
Avatar movements should feel natural, plausible, and comfortable for both the mover and those observing the movements. Where lower-body movements are not tracked, enable avatar legs and include standing, walking, and jogging animations. For a first-person perspective, prioritize tracking accuracy so that users do not notice a discrepancy between their real-life movements and those of the avatar, which can make movements feel uncomfortable and unnatural; favor pose accuracy from headset, hand, and controller input over animation quality. For third-person poses, prioritize animation quality, emphasizing natural and expressive movement over tracking accuracy.
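One way to express this priority split in code is sketched below, assuming a simple pose representation of our own invention (`Pose`, `BodyPose`, `Perspective`); no real engine API is implied. Tracked headset and hand poses drive the upper body fully in first person, while the untracked lower body always plays a locomotion animation, and third-person views lean more heavily on the animation.

```typescript
// Minimal sketch of the first-person vs. third-person priority split.
// All types here are illustrative, not an actual SDK API.

interface Pose {
  position: [number, number, number];
  rotationQuat: [number, number, number, number];
}

interface BodyPose {
  head: Pose;
  leftHand: Pose;
  rightHand: Pose;
  lowerBody: Pose[]; // hip, legs, feet
}

type Perspective = "first-person" | "third-person";

function lerpPose(a: Pose, b: Pose, t: number): Pose {
  const mix = (x: number, y: number) => x + (y - x) * t;
  return {
    position: [
      mix(a.position[0], b.position[0]),
      mix(a.position[1], b.position[1]),
      mix(a.position[2], b.position[2]),
    ],
    // NOTE: real code should slerp and renormalize quaternions; lerp is kept for brevity.
    rotationQuat: [
      mix(a.rotationQuat[0], b.rotationQuat[0]),
      mix(a.rotationQuat[1], b.rotationQuat[1]),
      mix(a.rotationQuat[2], b.rotationQuat[2]),
      mix(a.rotationQuat[3], b.rotationQuat[3]),
    ],
  };
}

/**
 * Blend tracked input with the current animation frame. First person weights
 * the raw tracked pose fully so the user never sees their hands lag; third
 * person eases toward the animation for more expressive, polished motion.
 */
function composeBodyPose(tracked: BodyPose, animated: BodyPose, view: Perspective): BodyPose {
  const trackedWeight = view === "first-person" ? 1.0 : 0.4; // tunable
  return {
    head: lerpPose(animated.head, tracked.head, trackedWeight),
    leftHand: lerpPose(animated.leftHand, tracked.leftHand, trackedWeight),
    rightHand: lerpPose(animated.rightHand, tracked.rightHand, trackedWeight),
    lowerBody: animated.lowerBody, // legs are not tracked here: always animated
  };
}
```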
Presenting an avatar body that contradicts a user’s proprioception, such as showing a walking body while the user is seated, can be uncomfortable. Users generally react positively to seeing their virtual bodies conform to their physical pose. User testing and evaluation are crucial to determining whether and how avatars will work for your application.
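As a rough sketch of one way to avoid that conflict, the example below infers whether the user is seated from the current headset height and picks a matching lower-body animation. The ratio and speed thresholds are assumptions that should be tuned and validated through the user testing mentioned above.

```typescript
// Minimal sketch: choose a lower-body animation that matches the user's
// physical pose instead of contradicting it. Thresholds are illustrative.

type LowerBodyAnim = "seated-idle" | "standing-idle" | "walking";

function chooseLowerBodyAnim(
  headsetHeightM: number,
  calibratedStandingEyeHeightM: number,
  horizontalSpeedMps: number,
): LowerBodyAnim {
  const SEATED_RATIO = 0.75;  // assumed: below ~75% of standing eye height => seated
  const WALK_SPEED_MPS = 0.3; // assumed: above this the user is moving through the space

  if (headsetHeightM < calibratedStandingEyeHeightM * SEATED_RATIO) {
    return "seated-idle"; // never show walking legs to a seated user
  }
  return horizontalSpeedMps > WALK_SPEED_MPS ? "walking" : "standing-idle";
}

console.log(chooseLowerBodyAnim(1.15, 1.6, 0.0)); // "seated-idle"
console.log(chooseLowerBodyAnim(1.58, 1.6, 0.6)); // "walking"
```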
Any objects attached to the avatar, such as weapons or tools, should integrate seamlessly with it, so that both the user and observers perceive the avatar as actually holding those objects. Test weapons and tools against various body types, clothing options, and hairstyles, and ensure assets look and behave consistently across all of them.
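A common approach is to drive the held object from a hand socket with a per-body-type grip offset, as in the sketch below. `HandSocket`, `gripOffsetByBodyType`, and the body-type keys are hypothetical names for this example; the per-body-type offset is the knob to validate across body types, clothing, and hairstyles.

```typescript
// Minimal sketch: keep a held object glued to an avatar hand socket so it
// follows the hand for both the user and observers. Names are illustrative.

interface Vec3 { x: number; y: number; z: number; }

interface Transform { position: Vec3; /* rotation omitted for brevity */ }

interface HandSocket {
  worldTransform: Transform;               // driven by hand tracking each frame
  gripOffsetByBodyType: Map<string, Vec3>; // e.g. "slim", "broad" -> local grip offset
}

interface HeldObject { worldTransform: Transform; }

/** Copy the socket transform plus a body-type-specific grip offset every frame. */
function updateHeldObject(obj: HeldObject, socket: HandSocket, bodyType: string): void {
  const offset = socket.gripOffsetByBodyType.get(bodyType) ?? { x: 0, y: 0, z: 0 };
  obj.worldTransform.position = {
    x: socket.worldTransform.position.x + offset.x,
    y: socket.worldTransform.position.y + offset.y,
    z: socket.worldTransform.position.z + offset.z,
  };
}
```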
If you use input devices for body tracking, track the user’s hands or other body parts and update the avatar with minimal latency.
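The sketch below shows one shape such a per-frame update could take: read the latest tracked sample and apply it to the avatar joint, with a small velocity-based extrapolation to hide sensor-to-display latency. The tracking source and joint types are stand-ins, not a real device API; most real SDKs already expose predicted poses, in which case you should prefer those.

```typescript
// Minimal sketch of a low-latency per-frame hand update with simple
// extrapolation. TrackedHandSource and AvatarHandJoint are illustrative.

interface Sample { positionM: [number, number, number]; timestampMs: number; }

interface TrackedHandSource { latest(): Sample; }

interface AvatarHandJoint { positionM: [number, number, number]; }

function applyHandTracking(
  source: TrackedHandSource,
  joint: AvatarHandJoint,
  previous: Sample | null,
  displayLatencyMs: number,
): Sample {
  const sample = source.latest();
  if (previous && sample.timestampMs > previous.timestampMs) {
    const dt = sample.timestampMs - previous.timestampMs;
    // Extrapolate by the expected display latency so the avatar hand lands
    // roughly where the real hand will be when the frame is shown.
    joint.positionM = sample.positionM.map((p, i) => {
      const velocity = (p - previous.positionM[i]) / dt;
      return p + velocity * displayLatencyMs;
    }) as [number, number, number];
  } else {
    joint.positionM = sample.positionM;
  }
  return sample; // keep as "previous" for the next frame
}
```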
Research indicates that providing users with an avatar that predicts upcoming motion helps them prepare, reducing discomfort. This is especially beneficial in third-person games. If the avatar’s actions, such as a car turning or a character running, consistently precede camera movements, users can better anticipate changes in the virtual environment, leading to a more comfortable experience.
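One possible way to realize this is to let the avatar’s action lead the camera, as in the sketch below: the camera eases toward the avatar’s heading plus a small lead proportional to the current turn input, so the visible turn of the car or character precedes the camera motion. The constants and names are assumptions to be tuned in testing, not a prescribed implementation.

```typescript
// Minimal sketch: a third-person chase camera that lets the avatar's turn
// lead the camera. Constants are illustrative and should be tuned.

interface ThirdPersonCamera { yawRad: number; }

/**
 * Ease the camera toward the avatar heading plus a small lead proportional
 * to the turn input (-1..1). The avatar turns first; the camera follows.
 */
function updateChaseCamera(
  cam: ThirdPersonCamera,
  avatarYawRad: number,
  turnInput: number,
  dtSec: number,
): void {
  const LEAD_RAD = 0.15;   // assumed: how far the camera looks into the turn
  const FOLLOW_RATE = 3.0; // assumed: higher = snappier follow
  const target = avatarYawRad + turnInput * LEAD_RAD;
  cam.yawRad += (target - cam.yawRad) * Math.min(1, FOLLOW_RATE * dtSec);
}
```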