New Distribution Model
The Meta Avatars SDK is now available in the Unity Asset Store as a Unity Package Manager package. This will make future SDK upgrades easier than ever before.
Visual Enhancements
Avatars now look better with default settings, including more lifelike eye glints and more detailed hair and clothing with normal map textures.
We’ve also made performance optimizations, such as a new Compute Skinner, so that overall performance remains similar to before.
New Customizable Shaders
This release includes a ground-up rewrite of our Avatar shader, called “Avatar-Meta”. We recommend all developers use this shader for the best possible visuals and performance.
Avatar-Meta is more human readable and editable than previous shaders, making it easy to customize the shader for your app while still integrating future improvements.
Deprecations
With this release, the minimum supported version of Unity is now 2021 LTS.
Quest 1 is no longer officially supported. If your application still supports Quest 1, see more details below.
The Meta Avatars Unity SDK is now available in the Unity Asset Store as a Unity Package Manager package.
Prepackaged avatars are now distributed as a separate Asset Store package. If you use prepackaged avatars in your app, you will need to also install this second package.
Introducing a new Compute Shader skinning option, which uses a compute shader on the GPU to handle avatar skinning. This reduces avatar load times, reduces memory usage, and improves performance. It is recommended for all apps and is enabled by default.
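Compute Skinning is on by default, so most apps need no code changes. If you need to set a skinning type on a specific entity (for example, for debugging), a minimal sketch might look like the following; the SkinningType enum appears elsewhere in these notes, but the exact property name and member name here are assumptions to verify against OvrAvatarEntity in your SDK version:

```csharp
// Hypothetical sketch: selecting a skinning type per avatar entity.
// The SkinningType enum is referenced elsewhere in these notes; the exact
// property and member names below are assumptions.
using UnityEngine;

public class SkinningConfig : MonoBehaviour
{
    [SerializeField] private OvrAvatarEntity _entity;

    private void Awake()
    {
        // Compute skinning is the default in this release; set it explicitly
        // only if another component or prefab has changed it.
        _entity.SkinningType = OvrAvatarEntity.SkinningType.COMPUTE; // assumed member name
    }
}
```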
This release includes a new default shader, “Avatar-Meta”, that is more human readable and editable. It is used automatically when the AvatarSDKManagerMeta prefab is used. If legacy shaders are used, a warning is logged. The code for this shader is in Avatar-Meta.shader and MetaAvatarCore.hlsl.
Avatar streaming now includes a built-in adaptive jitter buffer algorithm. Previously, apps needed to call SetPlaybackTimeDelay to define a playback time buffer so that avatar animation could interpolate smoothly between network updates. This delay can now be calculated automatically. Automatic calculation can be enabled by calling SetAutoPlaybackTimeDelay, but it is enabled by default, so calling SetPlaybackTimeDelay is no longer necessary unless you wish to customize the delay calculation for your app.
We recommend removing any code that calls SetPlaybackTimeDelay unless your app has a specific need to control this value.
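As a concrete sketch of this migration for a remote (streamed) avatar, the change might look like the following; the method names come from the notes above, but the owning entity type and parameter units are assumptions:

```csharp
// Sketch of the v24 streaming playback-delay migration. Method names are from
// the release notes; the owning type and parameter units are assumptions.

// v20 and earlier: apps chose a fixed interpolation buffer manually.
// remoteAvatarEntity.SetPlaybackTimeDelay(0.1f); // remove calls like this in v24

// v24: the adaptive jitter buffer is enabled by default. Calling this is
// only needed if automatic calculation was previously turned off.
remoteAvatarEntity.SetAutoPlaybackTimeDelay();

// Keep SetPlaybackTimeDelay only if your app must control the delay itself.
```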
ovrAvatar2EntityFeature_Rendering_ObjectSpaceTransforms has been removed. Object Space Transforms are now always available when ovrAvatar2EntityFeature_Rendering_SkinningMatrices is enabled, as they are now free from a performance perspective.
A new LightingExample scene has been added. It displays avatars in a variety of environments and lighting setups, which can help when debugging or modifying the shader.
Refactored the Avatar Editor launch logic to use the new Oculus.Platform.CAPI.ovr_Avatar_LaunchAvatarEditor API. Consequently, the AvatarEditorDeeplink folder and the Newtonsoft JSON and IPC DLLs have been removed.
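If your app previously used the AvatarEditorDeeplink flow, launching the editor through the Platform SDK might look like this sketch. Only the underlying CAPI entry point (ovr_Avatar_LaunchAvatarEditor) is named in these notes; the C# wrapper types below are assumptions to verify against your Platform SDK version:

```csharp
// Hypothetical sketch: launching the Avatar Editor via the Oculus Platform SDK.
// The wrapper types below are assumptions; only the underlying CAPI entry
// point (ovr_Avatar_LaunchAvatarEditor) is named in the release notes.
using Oculus.Platform;

public static class AvatarEditorLauncher
{
    public static void Open()
    {
        var options = new AvatarEditorOptions();  // assumed options type
        options.SetSourceOverride("my_app");      // assumed option; identifies the launch source
        Avatar.LaunchAvatarEditor(options);       // assumed wrapper for ovr_Avatar_LaunchAvatarEditor
    }
}
```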
Avatars now use higher quality assets by default. A new quality setting is available in Avatar Entity > Creation Info > Render Filters > Quality.
Quality levels are Standard (the new highest quality), Light, and Ultralight.
“Standard” is the new default and is a higher quality level than the default in v20. We recommend all apps use this setting for a better user experience.
“Light” corresponds to the old default quality level in v20, and may be selected if there are performance regressions.
“Ultralight” is a very fast-loading lightweight avatar that includes no textures, only vertex coloring. This is primarily used by the Fast Load feature (see below) but can also be used manually if desired.
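Selecting a quality level in code might look like the sketch below. The field path mirrors the inspector path (Avatar Entity > Creation Info > Render Filters > Quality), but the exact member and enum names are assumptions to verify against OvrAvatarEntity:

```csharp
// Hypothetical sketch: choosing an avatar quality level before the entity
// loads. Member and enum names are assumptions mirroring the inspector path
// (Creation Info > Render Filters > Quality).
using UnityEngine;

public class AvatarQualityConfig : MonoBehaviour
{
    [SerializeField] private OvrAvatarEntity _entity;

    private void Awake()
    {
        var creationInfo = _entity.CreationInfo;  // assumed accessor
        creationInfo.renderFilters.quality =
            CAPI.ovrAvatar2EntityQuality.Standard; // or Light / Ultralight
        _entity.CreationInfo = creationInfo;
    }
}
```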
Avatar “Fast Load” is now enabled by default. Fast Load causes an Ultralight (vertex colored) avatar to load temporarily until the full avatar is loaded, which improves user perception of avatar load times.
Fixed an assertion related to a missing material when using AppSW without motion smoothing enabled.
NativeArray allocations in OvrAvatarPrimitive.cs are now verified to handle allocation failures more gracefully; errors are logged when they occur.
Fixed a bug where OvrAvatarEntity would switch to SkinningType.NONE (static) upon loading a static primitive but would not re-enable skinning if a skinned primitive was loaded later.
Fixed stereo instanced rendering in Unity.
Fixed broken references to tracking in MirrorScene.
Removed compile-time warnings in the Unity project.
Replaced the old "controller hand" animation clips.
Improved warnings for ET/FT in MirrorScene.
Removed the error spam from OvrAvatar2BodyTrackingContext that continuously logged the number of connected XR devices.
Fixed the Default Gaze Target System so that the avatar’s eyes no longer roll backwards when the avatar is moving quickly.
Increased the distance of the Default Gaze Target slightly, increasing it further when the avatar is moving quickly.
Forced an update of the default gaze target when it is too close.
Fixed a crash when multiple HTTP requests retry on the same tick.
The color of the default models is now properly determined by the parameter "Default Model Color" in the OvrAvatarManager.
For this mechanism to work, integrators should activate "UseColorParameterBaseColorFactor", and specify "NameColorParameterBaseColorFactor", in their ShaderConfiguration.
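As a sketch, enabling this on a ShaderConfiguration asset might look like the following. The two field names come from the note above; the variable name and the shader property value are assumptions for illustration:

```csharp
// Sketch: enabling the default-model color path on a ShaderConfiguration
// asset. The two field names come from the release notes; the shader
// property name below is an assumption specific to your shader.
var shaderConfig = myShaderConfiguration; // your project's ShaderConfiguration asset

shaderConfig.UseColorParameterBaseColorFactor = true;
shaderConfig.NameColorParameterBaseColorFactor = "_BaseColor"; // property name used by your shader
```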
The minimum supported version of Unity is now 2021 LTS.
GPU Skinning is now deprecated in favor of Compute Skinning, which significantly improves memory use, load times, and performance.
Shaders (see OvrAvatarShaderNameUtils.cs for details):
Recommended (in place of the deprecated shaders):
AvatarMeta
Deprecated:
AvatarLibrary
AvatarHorizon
AvatarHuman
AvatarKhronos
Defined as Reference Only (not suitable for production):
AvatarMobileBumpedSpecular
AvatarMobileCustom
AvatarMobileDiffuse
AvatarMobileVertexLit
AvatarStandard
Quest 1 is not supported in Meta Avatars SDK 24. If your application needs to support Quest 1, you may choose to remain on v20.x or, if you decide to upgrade to v24, follow these instructions to ensure v24 compatibility. Because v24 uses a new underlying rig technology to drive the avatar, network streaming data is incompatible between v24 and previous versions. If you have used network recording as a strategy to play back animations on avatars, those recordings will break and must be re-created in v24. However, our streaming data format should now be forwards-compatible for the foreseeable future.
The avatar's eyes snap to the corner when the scene is reloaded in the GazeTracking and LightingExample sample scenes.
Non-critical errors may appear in the console after switching shaders in the LightingExample sample scene:
[ovrAvatar2 native] RetargetLayer::body_to_runtime_rig riggraph wasn't found in the assets.
In the Mirror sample scene, the iris tends to drift towards the corners of the eyes when moving back and forth or left and right.
Mirror and Skinning Types sample scenes might crash when repeatedly reloading the scenes without letting the avatars load.
When running the Mirror sample scene on Quest Pro, the avatar has a static face if face tracking and eye tracking permissions are declined.
When running the Mirror sample scene with face tracking/eye tracking enabled on Quest Pro, the avatar’s eye closes when the head or headset moves upward.