The Shared Activities in Mixed Reality motif shows how to create convincing shared activities in MR that encourage authentic, intuitive interactions. This project uses the Multiplayer Building Blocks to quickly and effortlessly set up a networked experience using the Networked Meta Avatars, and shows developers how to easily extend Building Blocks and build custom shared experiences, like chess and movie co-watching, on top of them.
ParrelSync (1.5.2)
(Optional) Creates and maintains multiple Unity editor instances of the same project for easier multiplayer testing.
This motif uses Photon Fusion, as the Player Voice Chat Building Block is only supported by Photon Fusion.
Note: The Building Blocks do not yet support the Avatars SDK v33 or later, so please stick to v31 for now. With newer versions, the avatar does not move with the OVR Rig.
Caution: If you plan to use the OpenXR plugin, be aware that the Avatar SDK does not yet support it; please keep using the Oculus XR Plugin for this sample.
The concept of Shared Activities in MR
The animation below illustrates the Shared Activities MR Motif. In MR, each user wants to place their own local board or screen without causing misalignment for others. Instead of moving the anchor (e.g., a chessboard), the solution is to adjust remote avatars’ position and rotation. This allows each player to position the board as needed while keeping others correctly aligned.
For example, if Client B is one meter behind the board, facing 45 degrees away, Client A can move their board freely while still seeing Client B in the correct relative position and rotation.
Shared Activities sample scenes
Chess Sample
Movie Cowatching Sample
The chess sample scene updates chess piece positions and rotations, similar to what the AvatarMovementHandlerMotif does. The ChessBoardHandlerMotif assigns State Authority to the player moving a piece and updates networked positions and rotations relative to the board. It switches the chess pieces’ Rigidbody to physics for the State Authority and kinematic for others, ensuring all clients sync with the State Authority’s simulation.
Notably, it integrates Photon Fusion’s IStateAuthorityChanged interface, allowing the system to wait for State Authority transfer before a player moves a piece. The scene includes four spawn locations, which can be expanded by adding more and assigning the SpawnPointMotif class.
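The authority hand-off described above can be sketched as follows. This is a simplified illustration, not the actual ChessBoardHandlerMotif source; class and method names are assumptions, but the Fusion APIs (`RequestStateAuthority`, `IStateAuthorityChanged`) are the ones the motif relies on:

```csharp
using Fusion;
using UnityEngine;

// Sketch: request State Authority before moving a chess piece, and only
// start simulating once Fusion confirms the transfer.
public class ChessPieceAuthoritySketch : NetworkBehaviour, IStateAuthorityChanged
{
    private Rigidbody m_rigidbody;
    private bool m_movePending;

    public override void Spawned()
    {
        m_rigidbody = GetComponent<Rigidbody>();
        // The State Authority simulates physics; all other clients follow
        // the networked pose kinematically.
        m_rigidbody.isKinematic = !Object.HasStateAuthority;
    }

    // Called from the grab interaction when a player picks up a piece.
    public void OnPieceGrabbed()
    {
        if (Object.HasStateAuthority)
        {
            m_rigidbody.isKinematic = false;
            return;
        }
        m_movePending = true;
        Object.RequestStateAuthority();
    }

    // Fusion invokes this on every client when the State Authority changes.
    public void StateAuthorityChanged()
    {
        m_rigidbody.isKinematic = !Object.HasStateAuthority;
        if (Object.HasStateAuthority && m_movePending)
        {
            m_movePending = false;
            // Safe to simulate the grabbed piece locally from here on.
        }
    }
}
```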
The movie cowatching logic in the MovieControlsHandlerMotif differs from the previous sample by synchronizing UI elements, such as button and toggle states, instead of transforms. It uses Networked Properties, such as NetworkBools and floats, to track slider values and toggle states.
The IStateAuthorityChanged interface ensures each action is executed by the correct player. Currently, four spawn locations are set in front of the chess board.
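The UI synchronization above can be sketched with Networked Properties and Fusion 2's change detection. The class and handler names are illustrative, not the actual MovieControlsHandlerMotif source:

```csharp
using Fusion;

// Sketch: sync UI state (play/pause, volume) via Networked Properties.
public class MovieControlsSketch : NetworkBehaviour
{
    [Networked] public NetworkBool IsPlaying { get; set; }
    [Networked] public float Volume { get; set; }

    private ChangeDetector m_changes;

    public override void Spawned()
    {
        m_changes = GetChangeDetector(ChangeDetector.Source.SimulationState);
    }

    public override void Render()
    {
        // Apply remote changes to the local UI each render frame.
        foreach (var propertyName in m_changes.DetectChanges(this))
        {
            switch (propertyName)
            {
                case nameof(IsPlaying): /* update the play/pause toggle */ break;
                case nameof(Volume):    /* update the volume slider */     break;
            }
        }
    }

    // Called from the local UI; only the State Authority writes the state.
    public void OnPlayPausePressed()
    {
        if (Object.HasStateAuthority)
        {
            IsPlaying = !IsPlaying;
        }
    }
}
```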
Building a multiplayer experience with the Meta XR SDKs takes only a few steps. Make sure to go through each one.
After installing all requirements from above, set up a multiplayer scene with the following Building Blocks:
[BuildingBlock] Camera Rig: On the Camera Rig feel free to add any additional Building Blocks such as grab and ray interaction or hands and controller tracking. Or simply use the OVRCameraRig prefab located at Assets/MRMotifs/Shared Assets/Prefabs.
[BuildingBlock] Passthrough: Make sure the CenterEyeAnchor’s Background Type is set to Solid Color and set the Background Color’s alpha to 0.
[BuildingBlock] Network Manager: Contains the heart of any multiplayer session: Network Runner and Network Events. The Multiplayer Building Blocks use FusionBBEvents instead of direct Network Events. FusionBBEvents wraps Photon Fusion’s INetworkRunnerCallbacks and exposes them as static events, allowing multiple classes to subscribe without implementing the interface directly. This simplifies event handling, promotes modularity, and keeps network logic separate from other game systems for cleaner code.
[BuildingBlock] Platform Init: This component is responsible for initializing the Meta platform, fetching the access token, and checking if the user is entitled to access the platform. This is later necessary to load the avatar, based on the user’s account and if they are entitled.
[BuildingBlock] Auto Matchmaking: Contains the component responsible for connecting the player to the right game mode (shared) and room. By default, every client is simply spawned into the same room.
[BuildingBlock] Networked Avatar: Responsible for spawning the local and remote Meta Avatars and syncing their movement over the network.
[BuildingBlock] Player Name Tag: Responsible for spawning a name tag above the avatar with the user’s name or, if the user is not logged in, a random name.
[BuildingBlock] Player Voice Chat: Responsible for setting up the voice chat and creating a speaker for each avatar.
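The FusionBBEvents pattern mentioned for the Network Manager above can be sketched like this. The exact set of static events comes from the Multiplayer Building Blocks package; the two shown here mirror their INetworkRunnerCallbacks counterparts, and the handler names are illustrative:

```csharp
using Fusion;
using Meta.XR.MultiplayerBlocks.Fusion;
using UnityEngine;

// Sketch: subscribe to FusionBBEvents instead of implementing
// INetworkRunnerCallbacks on your own class.
public class SessionListenerSketch : MonoBehaviour
{
    private void OnEnable()
    {
        FusionBBEvents.OnSceneLoadDone += OnSceneLoaded;
        FusionBBEvents.OnPlayerJoined += OnPlayerJoined;
    }

    private void OnDisable()
    {
        // Always unsubscribe from static events to avoid dangling handlers.
        FusionBBEvents.OnSceneLoadDone -= OnSceneLoaded;
        FusionBBEvents.OnPlayerJoined -= OnPlayerJoined;
    }

    private void OnSceneLoaded(NetworkRunner runner)
    {
        Debug.Log("Networked scene finished loading.");
    }

    private void OnPlayerJoined(NetworkRunner runner, PlayerRef player)
    {
        Debug.Log($"Player {player.PlayerId} joined the session.");
    }
}
```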
A crucial part of a multiplayer setup using Meta Avatars is the entitlement check, which makes it possible to use the Meta Platform features and see your own and your friends' avatars in the experience:
Create a developer account on the Meta Quest Dashboard, and either create or join an organization and create your app (Meta Horizon Store).
Retrieve its App ID by navigating to the Development > API section. Find the ID under “App ID - Used to initialize the Platform SDK”.
Add this App ID to the Unity project through the Unity Editor under Meta > Platform > Edit Settings, under Application ID.
Complete the Data Use Check Up for your app, which can be found under Requirements > Data Use Check Up in the Dashboard. To use Meta Avatars, request access for User ID, User Profile, and Avatars. For each of them, select Use Avatars under Usage and enter a short description, such as "Using Meta Avatars in my multiplayer experience", under Description.
To use other platform features, such as the Friends Invite feature included in this MR Motif, additional Data Use Check Ups have to be completed, such as Deep Linking, Friends, and Invites. In this case, do not forget to create one or more destinations first; their API name and Deep Link Message are needed later to invite friends to the experience.
If everything was filled out correctly, the request should be accepted in a few minutes in most cases.
Back in the Developer Dashboard, under Development > Test Users, create a new Test User. This lets developers test the app with an account other than their own, which can be helpful for debugging. Fill in at least the prefixes and the password, and note down the password. Then, in the Unity Editor under Meta > Platform > Edit Settings, under Unity Editor Settings, check the Use Standalone Platform box, fill in the Test User credentials, and log in.
Lastly, upload an APK of the app to a release channel, e.g. the Alpha channel. It is recommended to use the Meta Quest Developer Hub for this: under App Distribution, select the organization and the application to upload the APK to. Make sure everyone testing the app is part of the organization or is invited as an alpha tester. To be extra sure, the same APK can also be installed directly on the devices through the Device Manager in the Meta Quest Developer Hub.
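For reference, the entitlement check that the Platform Init Building Block performs boils down to something like the following minimal sketch using the Platform SDK. The Building Block handles this automatically; this only illustrates what "passing the entitlement check" means:

```csharp
using Oculus.Platform;
using UnityEngine;

// Sketch: initialize the Meta platform and verify the user is entitled
// to this app (requires the App ID configured in Meta > Platform > Edit Settings).
public class EntitlementCheckSketch : MonoBehaviour
{
    private void Awake()
    {
        try
        {
            Core.AsyncInitialize();
            Entitlements.IsUserEntitledToApplication().OnComplete(message =>
            {
                if (message.IsError)
                {
                    // Without entitlement, avatars and other platform features fail to load.
                    Debug.LogError("Entitlement check failed: user is not entitled to this app.");
                }
                else
                {
                    Debug.Log("Entitlement check passed.");
                }
            });
        }
        catch (UnityException e)
        {
            // Thrown e.g. when no App ID is configured.
            Debug.LogError($"Platform initialization failed: {e.Message}");
        }
    }
}
```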
After setting up the scene with the Multiplayer Building Blocks, and making sure that the entitlement check passes successfully, everything is ready to build truly shared experiences.
Multiplayer Troubleshooting Guide
Multiplayer development is not easy. However, following the setup guide should ensure the app runs as expected. Below is a list of issues developers commonly face:
An avatar appears blue/purple: This indicates that the avatar asset could not be loaded. Make sure there are either avatar sample assets assigned in Unity or the user is entitled to use this app (see setup above). The issue could also be caused by the number of avatar presets the app is trying to load. On the Avatar Spawner Fusion class, located on the Networked Avatar Building Block, make sure the Preloaded Sample Avatar Size is 17 or less for the Avatar SDK v28 or higher, or 32 or less for older versions.
My avatar is not moving: Most likely the issue here is how the Building Blocks are trying to assign the input of the OVR Rig to the Avatars. Keep in mind that the Building Blocks currently do not support the Avatars SDK v33 or later, so please stick to v31 for now.
I cannot see other avatars: This is a known issue. If this is happening in the editor, make sure to pan the camera around and move to where the other player is expected to be for its visuals to show up. On the headset, move around or simply wait a few seconds for the other clients to show up. If they still cannot be seen, it is most likely that the entitlement check failed on a client.
Can’t copy CoreAssets from package folder to Avatar2 Assets folder error: Simply delete the CoreAssets folders found in the Avatar2 folder under Assets > Oculus > Avatar2 and restart the editor for the package to correctly copy the right core assets into the project.
.zip assets missing error from the Meta Avatar SDK Sample Assets package: Copy the missing files from the packages (Meta Avatars SDK and Meta Avatars SDK Sample Assets) into the corresponding Assets folders, or reinstall the Meta Avatars SDK Sample Assets.
Missing Fusion namespaces: Close Unity, delete the Photon folder under Assets, open Unity and reimport Fusion 2 as well as Photon Voice.
Missing BuildingBlock namespaces, even though Meta Core SDK is installed: This means some of the scripts are throwing errors before the packages have had a chance to compile. This usually happens when adding and removing the Photon folder during the development process. It most likely means the build settings did not clear the Scripting Define Symbols in the Player Settings. In that case make sure there are no symbols such as FUSION_WEAVER, FUSION2, or PHOTON_VOICE_DEFINED, before Photon Fusion 2 is imported!
I cannot upload my APK through the Meta Quest Developer Hub: This can have various causes, indicated in the error message. More often than not, the upload fails because of a missing or unnecessary permission in the Android Manifest. To make the Android Manifest compatible with the store, use the editor extension under Meta > Tools > Create store-compatible AndroidManifest.xml. If unwanted Android permissions remain that cannot be removed manually or with the editor extension, they can be removed explicitly by adding a line to the AndroidManifest.xml file in Unity:
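For example, assuming the error message names `android.permission.READ_PHONE_STATE` as the unwanted permission (substitute the actual permission from your error message), the `tools:node="remove"` attribute tells the Android manifest merger to strip it from the final APK:

```xml
<!-- Requires xmlns:tools="http://schemas.android.com/tools" on the <manifest> element -->
<uses-permission android:name="android.permission.READ_PHONE_STATE" tools:node="remove" />
```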
How to move Avatars and other GameObjects in a Shared Mixed Reality Activity
In Shared Mixed Reality Activities, networking a single object (e.g., a chessboard) can cause misalignment when moved. Instead, each player maintains a local copy of the board and shares their avatar’s position relative to it. The State Authority manages networked arrays of positions and rotations, updated as players move their boards. This ensures all avatars stay correctly aligned by reading from these arrays each frame or as needed.
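The networked-array idea can be sketched as follows. The capacity, names, and helper methods are illustrative rather than the actual AvatarMovementHandlerMotif source, and the RPC that relays non-authority poses to the State Authority is omitted for brevity:

```csharp
using Fusion;
using UnityEngine;

// Sketch: share each avatar's pose *relative to its own local board copy*
// via networked arrays, instead of networking the board itself.
public class AvatarPositionsSketch : NetworkBehaviour
{
    private const int MaxPlayers = 4;

    [Networked, Capacity(MaxPlayers)]
    private NetworkArray<Vector3> AvatarPositions => default;

    [Networked, Capacity(MaxPlayers)]
    private NetworkArray<Quaternion> AvatarRotations => default;

    // The State Authority writes a player's rig pose in board-local space.
    public void WriteLocalPose(int playerIndex, Transform board, Transform rig)
    {
        if (!Object.HasStateAuthority) return;
        AvatarPositions.Set(playerIndex, board.InverseTransformPoint(rig.position));
        AvatarRotations.Set(playerIndex, Quaternion.Inverse(board.rotation) * rig.rotation);
    }

    // Every client places remote avatars relative to *its own* board copy,
    // so each player can move their board freely without breaking alignment.
    public void ApplyRemotePose(int playerIndex, Transform board, Transform remoteAvatar)
    {
        remoteAvatar.SetPositionAndRotation(
            board.TransformPoint(AvatarPositions.Get(playerIndex)),
            board.rotation * AvatarRotations.Get(playerIndex));
    }
}
```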
At the core of avatar movement is the AvatarEntity, which manages loading, configuration, and synchronization of Meta Avatars in multiplayer by streaming state data like position and rotation.
The AvatarBehaviourFusion class integrates this with Photon Fusion, networking avatar states across clients to keep remote avatars updated. It is attached to the FusionAvatar prefab, which is spawned into the Fusion scene by the AvatarSpawnerFusion class. This class handles spawning and entitlement checks, ensuring avatars are dynamically loaded and synchronized in real-time.
What components of the Avatars can and can’t be manipulated
Components That Can’t Be Manipulated:
AvatarEntity: Controls the avatar’s head and hands based on the main camera and body tracking via OVR Manager. Since this ensures natural movement, the remote avatar’s head or chest cannot be moved.
CenterEyeCamera: The main camera’s transform is controlled by the headset and cannot be set at runtime.
OVR Camera Rig: Moving this rig affects the local avatar. It is used in AvatarSpawnerHandlerMotif for initial positioning but is left untouched afterward, as direct access to remote players’ rigs is not possible, making updates difficult.
Components That Can Be Manipulated: The AvatarBehaviourFusion parent object can be moved safely without causing issues. In AvatarMovementHandlerMotif, this acts as the “remote version” of another player’s OVR Camera Rig. When a player moves, their position and rotation are sent to a networked list. Other players read from this list, ensuring each avatar appears in the correct relative position to their object while allowing personal placement of elements like a chessboard.
Since remote avatars are moved using networked arrays of positions and rotations, the Networked Transform component of Photon Fusion is no longer needed. In the AvatarMovementHandlerMotif class, this component is disabled for each remote avatar to prevent Fusion from applying additional offsets, which could misalign avatars from the local chess board.
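Disabling the component could look roughly like this (a minimal sketch; the actual AvatarMovementHandlerMotif logic runs when a remote avatar is registered):

```csharp
using Fusion;

// Sketch: turn off Fusion's NetworkTransform on remote avatars so the
// networked position/rotation arrays alone drive their placement.
public static class RemoteAvatarSetupSketch
{
    public static void DisableNetworkTransform(NetworkObject avatar)
    {
        if (avatar.HasStateAuthority) return; // leave the local avatar untouched

        if (avatar.TryGetComponent<NetworkTransform>(out var networkTransform))
        {
            // Prevents Fusion from applying extra offsets that would
            // misalign the avatar relative to the local chess board.
            networkTransform.enabled = false;
        }
    }
}
```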