Meta XR Audio SDK spatial audio sample

Updated: May 11, 2026

Overview

This sample demonstrates core spatial audio features of the Meta XR Audio SDK through seven interactive scenes on Quest headsets: a Main hub plus six feature demos. Each demo isolates a spatialization concept — directivity, point sources, ambisonics, room acoustics, volumetric sources, and HRTF intensity — so you can hear perceptual differences and inspect how each feature integrates with Unity’s audio system.

What you will learn

  • Project-level spatial audio configuration with Unity’s AudioManager and mixer
  • AudioSource directivity patterns that change sound based on emitter orientation
  • First-order ambisonic audio integration with head-rotation tracking
  • Real-time room acoustics using the Meta XR Audio Reflection mixer effect
  • Volumetric audio sources that emit from a spatial area instead of a single point
  • HRTF intensity control for localized versus diffuse sound perception

Requirements

  • Quest 2 headset with developer mode enabled
  • Unity 6 (6000.0.23f1 or later)
For required packages and setup instructions, see the Meta XR Audio SDK Unity setup guide.

Get started

Clone or download the sample from the GitHub repository. Open the MetaXRAudioSDK project in Unity, then open the Main scene from the project browser. For complete build and deployment instructions, see the sample’s README.

Explore the sample

The sample starts in a Main hub scene where you select a feature demo from a toggle list, read its description, and press the Launch button. Six scenes demonstrate individual audio features:
| Scene | What it demonstrates | Key concepts |
| --- | --- | --- |
| Directivity | Voice sources with directional patterns | AudioSource spatialization with directivity; sounds change based on whether the speaker faces toward or away from you |
| PointSource | Single point source with distance attenuation | Standard 3D audio positioning; sound volume decreases as you move away from the guitar |
| Ambisonic | First-order ambisonic environmental audio | Ambisonic decoder integration; the sound field counter-rotates with head movement so it stays fixed in world space, regardless of listener position |
| RoomAcoustics | Reflections and reverberation in an enclosed space | Meta XR Audio Reflection mixer effect; room parameters control early reflections and reverb tail |
| Volumetric | Helicopter audio emitting from a larger volume | Volumetric source configuration; sound originates from an area encompassing the helicopter model instead of a single point |
| HRTFIntensity | Comparison of high versus low HRTF intensity | HRTF intensity control; high intensity produces localized point sources, low intensity produces diffuse spatial audio |

Runtime behavior

When you launch the sample, you see the Main hub scene with a list of feature demos. Toggle a scene name to read its description, then press the Launch button. Each demo scene loads with an instructional dialog explaining what to listen for. Press A or X to dismiss the dialog, and press B or Y to return to the hub.
In the Directivity scene, three characters stand in a hallway with spatialized voice audio. Move around and rotate your head to notice how voices sound clearer when the speaker faces you and more muffled when they face away.
In the PointSource scene, a guitar plays in a small room. Walk toward and away from the guitar to hear volume change with distance. In the Ambisonic scene, ocean sounds surround you on a cliff. Rotate your head and notice that the soundscape stays anchored in the world as you turn, while walking does not change the audio position.
In the RoomAcoustics scene, three speakers stand in a large hall. Listen for reflections off walls, floor, and ceiling creating realistic reverb — closer walls produce more prominent reflections. In the Volumetric scene, a helicopter flies an automated path. Notice the sound comes from the helicopter’s spatial volume, not a single point.
In the HRTFIntensity scene, two characters speak with different HRTF settings. The blue character sounds precisely localized to a point, while the red character sounds more spread out and diffuse.

Key concepts

Project-level audio configuration

Notice how the sample configures Meta XR Audio at the project level. In Edit > Project Settings > Audio, the Spatializer Plugin and Ambisonic Decoder Plugin are both set to “Meta XR Audio”. The MetaXRAudioSettings asset specifies a voice limit of 64 concurrent spatialized sources. This global configuration applies to all scenes.
For detailed configuration options, see the Meta XR Audio SDK Unity setup guide.
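You can confirm the active spatializer from script as well. The following is a minimal sketch (the `SpatializerCheck` class name is invented for illustration); it uses Unity's `AudioSettings.GetSpatializerPluginName` API to log the plugin selected in Project Settings > Audio:

```csharp
using UnityEngine;

public class SpatializerCheck : MonoBehaviour
{
    void Start()
    {
        // Logs the spatializer plugin chosen in Project Settings > Audio.
        // In this sample it should report "Meta XR Audio".
        Debug.Log($"Active spatializer: {AudioSettings.GetSpatializerPluginName()}");
    }
}
```

This can be a useful sanity check when a build sounds non-spatialized: a missing or misnamed plugin shows up here immediately.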

AudioSource spatialization toggle

The sample demonstrates when to enable versus disable Unity’s Spatialize checkbox. Point sources (Directivity, PointSource, RoomAcoustics, and HRTFIntensity scenes) have Spatialize = true on their AudioSource components. Ambisonic sources (Ambisonic scene) have Spatialize = false because the ambisonic decoder handles spatialization instead. Check the AudioSource inspector in each scene to see the pattern.
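The two configurations above can be set from script as well as in the inspector. This is a minimal sketch, not code from the sample; the `pointSource` and `ambisonicSource` field names are hypothetical references you would assign in the inspector:

```csharp
using UnityEngine;

public class SpatializeSetup : MonoBehaviour
{
    public AudioSource pointSource;      // e.g. the guitar in PointSource
    public AudioSource ambisonicSource;  // e.g. the ocean bed in Ambisonic

    void Awake()
    {
        // Point sources: the spatializer plugin positions the sound in 3D.
        pointSource.spatialize = true;
        pointSource.spatialBlend = 1.0f; // fully 3D

        // Ambisonic sources: the decoder plugin handles spatialization,
        // so the per-source spatializer must stay off.
        ambisonicSource.spatialize = false;
    }
}
```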

Room acoustics with mixer effects

The RoomAcoustics scene shows how to configure reflections and reverb using the audio mixer. The SpatializerMixer asset contains a Meta XR Audio Reflection effect on the Master group. Room dimensions (X, Y, Z) and per-wall reflection coefficients control early reflections. The sample includes two mixer snapshots with different room parameters — you can compare these presets in SpatializerMixer (Audio Mixer window) to see how room dimensions and wall reflection coefficients affect reverb.
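One way to compare the two presets at runtime is Unity's `AudioMixerSnapshot.TransitionTo`, which interpolates mixer state (including effect parameters captured in each snapshot). This sketch assumes the two snapshots are assigned in the inspector; the class and field names are hypothetical:

```csharp
using UnityEngine;
using UnityEngine.Audio;

public class RoomPresetSwitcher : MonoBehaviour
{
    public AudioMixerSnapshot roomPresetA;
    public AudioMixerSnapshot roomPresetB;

    // Crossfade the mixer to the second room preset over half a second,
    // so you can hear the reverb character change without a hard cut.
    public void UsePresetB()
    {
        roomPresetB.TransitionTo(0.5f);
    }

    public void UsePresetA()
    {
        roomPresetA.TransitionTo(0.5f);
    }
}
```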

UI audio feedback positioning

The sample implements spatial UI audio feedback through UIAudio.cs and CanvasAudioMapper.cs. When you hover or press a UI button, the system positions an AudioSource at the interaction point before playing:
// From UIAudio.cs
public void PlayPressSound(Vector3 position) {
    // Move the source to the interaction point so the press sound
    // is spatialized where the user actually touched the UI.
    PressAudioSource.transform.position = position;
    PressAudioSource.Play();
}
View the complete implementation in Assets/Scripts/UIAudio.cs.

Ambisonic asset integration

The Ambisonic scene uses a first-order ambisonic audio file (SFX_OceanAmb_FirstOrder_01.wav) with the Meta XR Audio ambisonic decoder. The AudioSource has Spatialize = false and relies on the decoder plugin configured in Project Settings. This pattern allows pre-rendered spatial soundscapes to rotate with head movement without requiring multiple point sources.
For ambisonic format requirements, see the Meta XR Audio SDK Unity setup guide.
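A minimal ambisonic playback setup looks like the sketch below. It assumes the clip was imported with the Ambisonic checkbox enabled in its import settings; the `AmbisonicBed` class and `oceanClip` field are hypothetical names:

```csharp
using UnityEngine;

public class AmbisonicBed : MonoBehaviour
{
    public AudioClip oceanClip; // first-order ambisonic WAV

    void Start()
    {
        var source = gameObject.AddComponent<AudioSource>();
        source.clip = oceanClip;
        // The ambisonic decoder plugin (set in Project Settings) spatializes
        // the clip, so the per-source spatializer stays off.
        source.spatialize = false;
        source.loop = true;
        source.Play();
    }
}
```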

Extend the sample

  • Modify the RoomAcoustics scene to transition between the two mixer snapshots when the user crosses a threshold, demonstrating dynamic acoustic changes
  • Add a new scene demonstrating audio occlusion by placing AudioSources behind geometry with physics-based obstruction
  • Implement distance-based HRTF intensity in the HRTFIntensity scene so the effect changes as you move closer to or farther from the speakers