Streamline Multi-user Setup with the Colocated Experiences MR Motif
With social, multi-user experiences on the rise and featured in many of the top games and apps in the Meta Horizon Store, getting people together seamlessly is more important than ever.
In mixed reality, one capability that powers many of these popular social experiences is colocation, which refers to games and apps that automatically detect and connect Meta Quest devices when users are in close physical proximity, typically within about 10 meters. With minimal setup, colocation is used to streamline multi-user experiences, making social and collaborative play more immersive and natural.
By using technologies such as the Bluetooth-based Colocation Discovery and Shared Spatial Anchors, developers can synchronize virtual content with the physical world and enable users to interact with the same digital objects in a shared physical space.
In this motif, we’ll walk you through key steps to support colocation, from the very basics of Spatial Anchors and sharing them over the network to setting up colocated experiences in no time using Colocation Discovery. You can also watch the accompanying tutorial video for a visual walkthrough and additional tips.
To expand your colocation use cases, you can visit the motif project on GitHub to find a guide with steps on how to set up the new Space Sharing API, which allows you to build colocated apps where all users can leverage detailed information about their physical surroundings with the Mixed Reality Utility Kit (MRUK). The project also showcases our new hand tracking microgestures, which let users interact with the project's shared whiteboard sample with controllers and hands alike.
If you’re ready to start getting users together in local mixed reality seamlessly, keep reading below to learn key steps on setting up colocated experiences.
Colocation facilitates local multi-user mixed reality by synchronizing virtual and physical content between users in close proximity.
Spatial Anchors Basics
An anchor is a world-locked frame of reference that gives a position and orientation to a virtual object in the physical world. Apps can use one anchor per virtual object, or choose to have multiple virtual objects use the same anchor as long as those objects are within its coverage area of three meters.
The Anchors API offers several key features:
Persistence across sessions: The pose of an anchor can be persisted.
Discovery across sessions: Anchors can be discovered and reused.
Sharing with other users: Anchors can be shared synchronously and asynchronously.
The GitHub project provides the SpatialAnchorManager, SpatialAnchorStorage, and SpatialAnchorLoader classes to help with managing anchors. These classes are samples of how the Anchors API could be used to enable basic operations like creating, saving, loading and erasing anchors.
Create anchors
You can create an anchor by instantiating a prefab that contains the OVRSpatialAnchor component, or taking an existing object in the scene and adding the OVRSpatialAnchor component to it. This assigns a UUID and creates the anchor asynchronously.
// C#
var anchor = gameObject.AddComponent<OVRSpatialAnchor>();
while (!anchor.Created)
{
    await Task.Yield();
}
The anchor exposes a property called ‘Created’ that indicates whether creation has completed successfully. As a best practice, we recommend waiting until the anchor is created before continuing.
The ColocatedExperiences project contains a static SpatialAnchorStorage class, which allows you to store anchor UUIDs in the Unity PlayerPrefs. You can call the sample's SaveUuidToPlayerPrefs method to persist the spatial anchor's UUID on the device.
This method adds the new UUID to the PlayerPrefs with a count. This way, you can easily keep track of multiple UUIDs and access them later.
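Putting this together, a minimal sketch of saving an anchor and persisting its UUID might look like the following (assuming the Anchors API’s SaveAnchorAsync method and that the sample’s SaveUuidToPlayerPrefs takes the anchor’s UUID):
// C#
var saveResult = await anchor.SaveAnchorAsync();
if (saveResult.Success)
{
    // Persist the UUID so the anchor can be loaded again in a later session
    SpatialAnchorStorage.SaveUuidToPlayerPrefs(anchor.Uuid);
}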
Load anchors
Loading anchors can be achieved with the SpatialAnchorLoader class and involves three separate steps: Loading, Localizing, and Binding.
Anchors are initially unbound when loaded, meaning the anchor is not yet connected to its intended GameObject’s OVRSpatialAnchor component. The anchor must be bound to manage its lifecycle and to provide access to other features such as save and erase.
To load an anchor, you’ll need its UUID. Since UUIDs are stored in the PlayerPrefs, you can query a list of UUIDs from your static SpatialAnchorStorage class.
// C#
var uuids = SpatialAnchorStorage.LoadAllUuidsFromPlayerPrefs();
Next, you can call the LoadUnboundAnchorsAsync method. At this step, it is a best practice to wait for a successful result before localizing the anchor by calling LocalizeAsync. When you localize an anchor, it causes the system to determine the anchor’s pose in the world.
// C#
var unboundAnchors = new List<OVRSpatialAnchor.UnboundAnchor>();
await OVRSpatialAnchor.LoadUnboundAnchorsAsync(uuids, unboundAnchors);
foreach (var unboundAnchor in unboundAnchors)
{
    if (await unboundAnchor.LocalizeAsync())
    {
        // Bind anchor here
    }
}
After successful localization, you can bind the unbound anchor to its OVRSpatialAnchor component by calling the built-in BindTo method.
// C#
if (!unboundAnchor.TryGetPose(out var pose))
{
    return;
}
var anchorObject = Instantiate(anchorPrefab.gameObject, pose.position, pose.rotation);
var spatialAnchor = anchorObject.GetComponent<OVRSpatialAnchor>();
unboundAnchor.BindTo(spatialAnchor);
Erase anchors
The OVRSpatialAnchor.EraseAnchorAsync method erases a spatial anchor from persistent storage. After that, it is a best practice to destroy the anchor GameObject to stop tracking it at runtime, freeing up resources for new anchors.
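As a sketch, assuming you hold a reference to a bound anchor:
// C#
var eraseResult = await anchor.EraseAnchorAsync();
if (eraseResult.Success)
{
    // Stop tracking the erased anchor at runtime by destroying its GameObject
    Destroy(anchor.gameObject);
}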
Using these simple steps, you can now easily create, save, load and erase anchors. Next, we’ll cover the key colocation functions of sharing anchors and aligning users to their pose.
Creating Colocated Experiences
To set up colocation, you can use Colocation Discovery in v71 and above of the Meta XR SDK in Unity. This feature, exposed through the OVRColocationSession class, allows nearby users to discover each other via Bluetooth. Typically, the advertising client acts as the host of a multi-user experience, while the discovering clients join a hosted experience.
Requirements for Colocation Discovery
To enable Colocation Discovery, the Colocation Session Support permission on the OVR Manager needs to be set to ‘Required’, which adds the corresponding permission to the Android Manifest file.
Colocation has few requirements, but one of the following three conditions must be met:
The user is a member of a verified developer organization.
The user is a test user from the developer organization owning the app.
The user is invited by a developer organization to a release channel (except for production).
For Shared Spatial Anchors, the Shared Spatial Anchor Support permission also needs to be set to ‘Required’. Furthermore, the device must be connected to the internet. Lastly, Enhanced Spatial Services must be enabled on the Meta Quest device. To turn on Enhanced Spatial Services, go to Settings > Privacy and Safety > Device Permissions, and select Enhanced Spatial Services. The app will detect when this setting is disabled and inform users to turn it on.
Understanding Group-Based Anchor Sharing and Loading
As of v71, Spatial Anchor sharing and loading is group-based instead of user-based. This removes the need for users to be entitled to the app and for developers to manage user IDs through their verified app on the Developer Dashboard. Instead, an arbitrary group UUID is used for sharing and loading anchors, making Group Sharing the recommended approach.
Before sharing a spatial anchor with a group, one of the participants, usually the host, must create a single UUID representing the group and communicate it to the others. This communication can be achieved either via an app-managed network connection, such as Unity Netcode or Photon Fusion, or via Colocation Discovery, which greatly reduces end-user friction around setting up colocated experiences.
The group ID is automatically generated as the result of advertising. In the code below, the host starts the advertisement, and after doing so successfully, you can read the group ID from the advertisement result.
// C#
var advertisementResult = await OVRColocationSession.StartAdvertisementAsync(null);
if (advertisementResult.Success)
{
    // The system generates the group ID as part of the advertisement result
    _groupId = advertisementResult.Value;
    // Create and save anchor here
}
Instead of sending null in the StartAdvertisementAsync method, you can also send some session information in the form of a string, such as a session name, or simply as a message to other users. You can send a maximum of 1024 bytes of data.
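For example (a sketch, assuming the advertisement data is passed as a byte array, such as a UTF-8 encoded session name):
// C#
var sessionData = System.Text.Encoding.UTF8.GetBytes("My colocated session");
var advertisementResult = await OVRColocationSession.StartAdvertisementAsync(sessionData);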
After creating and saving your spatial anchor using the steps we covered above, you can call the group-based function for sharing anchors.
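A minimal sketch of that call, assuming the static ShareAsync overload that accepts a list of anchors and a group UUID:
// C#
var anchorsToShare = new List<OVRSpatialAnchor> { anchor };
var shareResult = await OVRSpatialAnchor.ShareAsync(anchorsToShare, _groupId);
if (!shareResult.Success)
{
    Debug.LogError($"Sharing anchors with group {_groupId} failed with status: {shareResult.Status}");
}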
Colocation Discovery: Session discovery and anchor loading
After the host starts advertising the session and successfully shares the anchor (including the created group ID), all other users can discover the session. Below, you can see that the OnColocationSessionDiscovered event is subscribed to before starting the discovery with StartDiscoveryAsync.
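A sketch of that step (assuming OVRColocationSession exposes a static ColocationSessionDiscovered event and a parameterless StartDiscoveryAsync method):
// C#
OVRColocationSession.ColocationSessionDiscovered += OnColocationSessionDiscovered;
var discoveryResult = await OVRColocationSession.StartDiscoveryAsync();
if (!discoveryResult.Success)
{
    Debug.LogError("Starting colocation discovery failed.");
}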
Once the colocation session has been discovered, the OnColocationSessionDiscovered callback is activated. Next, you can read the group ID from the discovered session and use it to load the anchor.
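The callback might then look like this (assuming the discovered session data exposes the group ID as AdvertisementUuid):
// C#
private void OnColocationSessionDiscovered(OVRColocationSession.Data session)
{
    OVRColocationSession.ColocationSessionDiscovered -= OnColocationSessionDiscovered;
    // The advertisement UUID doubles as the group ID used for loading shared anchors
    _groupId = session.AdvertisementUuid;
    // Load the shared anchor here
}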
As performed previously, you can now load your anchor; however, this time the function is called LoadUnboundSharedAnchorsAsync and takes your group ID as input.
// C#
var unboundAnchors = new List<OVRSpatialAnchor.UnboundAnchor>();
await OVRSpatialAnchor.LoadUnboundSharedAnchorsAsync(_groupId, unboundAnchors);
After successfully loading the anchor, remember to localize and bind it to an OVRSpatialAnchor component by following the steps we covered above.
Anchor alignment
Now, the users are ready to be aligned to the anchor’s pose. Alignment is a critical step for creating a truly colocated experience because it enables all users to have the same tracking space as the host. To achieve this, you can adjust the position and rotation of users’ Camera Rig to the anchor’s pose.
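One common approach, shown here as a sketch with a hypothetical AlignUserToAnchor helper, is to invert the anchor’s pose so the anchor becomes the shared origin of the tracking space:
// C#
private void AlignUserToAnchor(OVRSpatialAnchor anchor, Transform cameraRigTransform)
{
    var anchorTransform = anchor.transform;
    // Position and rotate the camera rig so the anchor acts as the common origin for all users
    cameraRigTransform.position = anchorTransform.InverseTransformPoint(Vector3.zero);
    cameraRigTransform.eulerAngles = new Vector3(0, -anchorTransform.eulerAngles.y, 0);
}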
There you have it! These are all the steps you need to create a seamless colocated experience with just a few lines of code.
Using the Space Sharing API
After learning the basics of spatial anchors and reviewing slightly more advanced concepts such as anchor sharing and using Colocation Discovery, you can take things one step further with the Space Sharing API, available with Meta XR SDK v74 and above.
Hosted in the MRUK, the Space Sharing API enables room layouts to be shared across clients, delivering a seamless solution for a popular colocation use case.
Shared Spaces enables users to experience mixed reality that adapts to a shared physical space
Requirements and limitations
Only the host needs to scan the room beforehand or at the start of the experience.
The APK must be uploaded to a release channel on the Developer Dashboard and all users or test users must be invited to that channel or be part of the organization.
It is not possible to share a space between two devices with the same account. Therefore, there are two ways to test space sharing during development:
Use your device and someone else’s device and share a space between the two. Make sure the account on the other device is entitled to use your app.
Create a test user and log in with that test user on a second device. In this case, also make sure the test user’s email is invited to your release channel and entitled to use the app.
Space Sharing setup
To streamline the setup process, you can combine the Space Sharing API with Colocation Discovery. To share a room, all the host has to do is talk to the MRUK singleton instance, get a list of MRUK rooms, and call the built-in ShareRoomsAsync method.
// C#
// For sharing multiple rooms
var rooms = MRUK.Instance.Rooms;
MRUK.Instance.ShareRoomsAsync(rooms, _groupId);
// For sharing a single (current) room
var room = MRUK.Instance.GetCurrentRoom();
room.ShareRoomAsync(_groupId);
Similarly, it is possible for the other users to load all rooms shared with the group ID and align themselves to the room’s floor world pose.
For more detailed code and information on how to get all room information, such as the floor world pose, check out the sample code in the SpaceSharingManager class of the Colocated Experiences MR Motif.
Additional Resources and Tips
Now with just a few lines of code, you have everything you need to invite your friends to your room and seamlessly create unique and engaging experiences using Colocation Discovery and the new Space Sharing API.
Visit our GitHub repo for additional samples that can help you get started with colocation, including a shared Whiteboard in the Colocation sample scene and a simple bouncing ball spawner in the Shared Space sample scene. To find more tips and a visual walkthrough on getting started with colocation, be sure to follow along with our tutorial video.
Let us know if you found this motif helpful and stay up to date with the latest news by following us on X and Facebook, and don’t forget to subscribe to our developer newsletter in your Developer Dashboard settings for a monthly rundown of what’s happening around the Meta Horizon ecosystem.