Mixed Reality Utility Kit Features
Updated: Nov 15, 2024
You can access all the functionality provided by MR Utility Kit either via Blueprint or from C++.
The first thing you will want to do is load a scene. This can be done via an async Blueprint node or the `MRUKSubsystem`:

- `LoadSceneFromDeviceAsync`: This async node loads the Scene data stored on your device. The `Success` or `Failure` pin is executed depending on whether the Scene data could be loaded.
- `LoadSceneFromJsonString`: This function on `MRUKSubsystem` loads the Scene from a JSON string previously saved via `SaveSceneToJsonString`. This is useful if you want to capture your scene and then iterate on it in the editor without needing to run on your device. The operation is synchronous, so there is no need to wait before chaining further operations; it will still trigger the `OnSceneLoaded` event.
For reference, take a look at the level Blueprint in Demo.umap
, which first tries to load data from the device. If that fails, it falls back to loading a random room from JSON. The sample project has about 30 previously captured rooms in a Data Table where each row contains a different room.
Once the Scene is loaded, an instance of `AMRUKRoom` will be spawned for each room in your scene. This is usually only one, but multi-room support is coming soon. You can access it by calling `GetCurrentRoom` or by referencing the `Rooms` property on `MRUKSubsystem`. For each anchor in your room, an instance of `AMRUKAnchor` will be spawned as a child of the room; these are accessible via `AllAnchors`, `WallAnchors`, `FloorAnchor`, and `CeilingAnchor`. At this point nothing is visible in your world yet, but you can query the actors for information about the anchors, including their labels, positions, and plane/volume bounds. In addition to this basic data, there are a number of methods that help you reason about the scene and populate the room with renderable components. Below is a high-level overview; for more details about the API, refer to the documentation in the source code or the tooltips on the Blueprint nodes:
- `Raycast()`/`RaycastAll()`: Raycast against anchors in the scene. This is implemented independently of the Unreal Engine helpers, so it does not interfere with your physics setup.
- `GetBestPoseFromRaycast()`: Returns a suggested pose from a raycast. Useful for placing AR content with a controller.
- `IsPositionInRoom()`: Checks whether a position is within the room.
- `IsPositionInSceneVolume()`: Checks whether a position is within any volume in the room.
- `TryGetClosestSurfacePosition()`: Gets the position on a surface that is closest to the given position.
- `TryGetClosestSeatPose()`: Finds the closest seat given a ray.
- `GetLargestSurface()`: Returns the largest surface for a given label.
- `GetKeyWall()`: Returns the longest wall in the room that has no other walls behind it.
- `GetAnchorsByLabel()`: Gets a list of anchors by their label.
- `RoomBounds`: The world-aligned bounding box of the room.
- `ParentAnchor`/`ChildAnchors`: Use heuristics to determine the parent/child relationships between anchors (for example, a door has a wall as its parent, and a volume stacked on another volume is its child).
- `GenerateRandomPositionInRoom()`: Generates a uniformly distributed random position within the room.
- `LoadGlobalMesh()`: Loads the global mesh triangle data from the device and attaches it to the global mesh anchor. It can be visualized and used for collision.
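Several of these queries reduce to simple ray geometry. The sketch below (using hypothetical helper types, not the MRUK implementation) shows the core idea behind a raycast against an anchor plane and how a hit can yield a suggested placement pose whose forward axis points back along the surface normal:

```cpp
#include <cassert>
#include <cmath>
#include <optional>

struct Vec3 { double x, y, z; };

static Vec3 Sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static double Dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

struct Pose { Vec3 position; Vec3 forward; };

// Intersect a ray (origin, dir) with an anchor plane (planePoint, planeNormal)
// and return a pose sitting on the surface, facing away from it.
std::optional<Pose> BestPoseFromRaycast(Vec3 origin, Vec3 dir,
                                        Vec3 planePoint, Vec3 planeNormal) {
    double denom = Dot(dir, planeNormal);
    if (std::fabs(denom) < 1e-9) return std::nullopt;  // ray parallel to plane
    double t = Dot(Sub(planePoint, origin), planeNormal) / denom;
    if (t < 0.0) return std::nullopt;                  // plane behind the ray
    Vec3 hit = {origin.x + t * dir.x, origin.y + t * dir.y, origin.z + t * dir.z};
    // Face against the side of the plane the ray came from.
    Vec3 facing = (denom > 0.0)
        ? Vec3{-planeNormal.x, -planeNormal.y, -planeNormal.z}
        : planeNormal;
    return Pose{hit, facing};
}
```

The real API also handles volumes, filtering by label, and returning hit normals; this sketch only captures the plane case.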
World locking is a feature of MRUK that makes it easier for developers to keep the virtual world in sync with the real world. Virtual content can be placed arbitrarily in the scene without needing to be explicitly attached to anchors; under the hood, MRUK keeps the camera in sync with the real world using scene anchors. This is achieved by making small, imperceptible adjustments to the camera rig's tracking space transform, optimizing it so that nearby scene anchors appear in the right place.
Previously with SceneActor, the recommended method to keep the real and virtual world in sync was to ensure that every piece of virtual content is attached to an anchor. This meant that nothing could be considered static and would need to cope with being moved by small amounts every frame. This can lead to a number of issues with networking, physics, rendering, etc. When world locking is enabled, virtual content can remain static. The space close to the player will remain in sync; however, the trade-off is that space further away may diverge from the physical world by a small amount.
World locking is enabled by default, but can be disabled by deselecting Enable World Lock in the project settings under Plugins > Mixed Reality Utility Kit.
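The core idea can be illustrated with a minimal sketch (hypothetical helper, not the MRUK source): if a scene anchor is tracked at one position but its virtual counterpart sits at another, a small offset applied to the camera rig's tracking-space transform realigns them, leaving the virtual content itself static:

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { double x, y, z; };

// Offset to apply to the tracking-space transform so that the tracked anchor
// appears exactly where the static virtual content expects it. MRUK applies
// such corrections gradually and weights nearby anchors; this is only the
// one-anchor, translation-only case.
Vec3 TrackingSpaceCorrection(const Vec3& trackedAnchor, const Vec3& virtualAnchor) {
    return {virtualAnchor.x - trackedAnchor.x,
            virtualAnchor.y - trackedAnchor.y,
            virtualAnchor.z - trackedAnchor.z};
}
```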
These actors and components are designed to be dropped in your scene and used without extra code.
The `AttachProceduralMeshToWalls` function creates procedural meshes for the walls, ensuring that the UV coordinates are seamless around the room. This function can be called on the `AMRUKRoom` class. It takes as arguments a material, which is applied to the mesh, and an array of `WallTextureCoordinateModes`. A pair of UV coordinates is created for each item in the array, up to a maximum of 4. You may reference these UV coordinates in the material to create interesting effects.
`WallTextureCoordinateModes` consists of a mode for the U and V coordinates, which can be set independently from the following list:

- `Metric`: The texture coordinates start at 0 and increase by 1 unit every meter.
- `MetricSeamless`: Same as `Metric`, but the coordinates are scaled so they end on a whole number, avoiding seams.
- `MaintainAspectRatio`: The texture coordinates are adjusted relative to the other dimension so the aspect ratio is maintained.
- `MaintainAspectRatioSeamless`: Same as `MaintainAspectRatio`, but scaled to end on a whole number to avoid seams.
- `Stretch`: The texture coordinates range from 0 to 1.

The `Seamless` variants only apply to the U coordinate.
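The seamless rescaling can be sketched as follows (an illustrative helper under assumed behavior, not the MRUK implementation): metric U coordinates advance 1 unit per meter, then are rescaled so the total distance around the room lands on a whole number of texture tiles, so the texture meets itself without a seam where the walls close the loop:

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>

// Metric U for a point a given distance along the walls, rescaled so the
// full perimeter maps to a whole number of tiles (MetricSeamless).
double SeamlessU(double distanceAlongWalls, double totalPerimeter) {
    double metricU = distanceAlongWalls;                  // Metric: 1 unit per meter
    double wholeTiles = std::max(1.0, std::round(totalPerimeter));
    return metricU * (wholeTiles / totalPerimeter);       // snap the loop to an integer
}
```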
You may also call `AttachProceduralMesh` directly on the `AMRUKAnchor` components for more fine-grained control over how the UV coordinates are defined, and to create procedural meshes for other anchor types.
The Anchor Actor Spawner is used to spawn actors (for example, Blueprint classes) at the locations of the anchors, scaled to fit the size of the volumes/planes. This can be used to spawn virtual representations of the objects in your room. If you're coming from the Scene Actor workflow, you can use this instead as an easier and more flexible way to instantiate objects at anchor locations.
The `Spawn Groups` property allows you to define what to spawn based on the semantic label (e.g., BED, CEILING, COUCH). For each label you can define an array of `Actors`; when an anchor with that label is encountered, an actor is picked from the list. There are two `Selection Mode` options to choose from:

- `Random`: Picks an actor at random from the list. You can specify an `Anchor Random Spawn Seed` to get a deterministically random selection. This is useful for multiplayer scenarios where you want the same pseudo-random selection to occur on all clients, or simply to ensure the same selection is made between sessions.
- `Closest Size`: Picks the actor that most closely matches the size of the scene volume, where size is defined as the cube root of width * height * depth.
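The `Closest Size` rule is easy to sketch (hypothetical helper types, not the MRUK implementation): reduce each candidate actor and the scene volume to a single scalar size, then pick the candidate with the smallest difference:

```cpp
#include <cassert>
#include <cmath>
#include <cstddef>
#include <vector>

struct Extent { double w, h, d; };

// Size as described above: cube root of width * height * depth.
double SizeOf(const Extent& e) { return std::cbrt(e.w * e.h * e.d); }

// Index of the candidate whose size is nearest to the target volume's size.
size_t PickClosestSize(const std::vector<Extent>& candidates, const Extent& volume) {
    double target = SizeOf(volume);
    size_t best = 0;
    double bestDiff = std::fabs(SizeOf(candidates[0]) - target);
    for (size_t i = 1; i < candidates.size(); ++i) {
        double diff = std::fabs(SizeOf(candidates[i]) - target);
        if (diff < bestDiff) { bestDiff = diff; best = i; }
    }
    return best;
}
```

Collapsing the three dimensions into one scalar means a long thin couch and a cube of equal volume compare as the same size, which is why `Match Aspect Ratio` exists as a separate control.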
If the `Actors` array is empty, you have the option to `Fallback to Procedural`, both at the label level and the global level. If `Fallback to Procedural` is enabled, a procedural mesh is created instead; a material may be assigned to it via the `Procedural Material` property. If `Fallback to Procedural` is disabled, nothing spawns for the given anchor.
For each entry in the `Actors` array you can specify:

- `Actor`: The actor to spawn; this can be a Blueprint class.
- `Match Aspect Ratio`: When enabled, the actor is oriented to best match the aspect ratio of the volume in the X/Y plane. Scene volumes don't have a forward direction, so this is useful for objects such as couches, to make sure they don't get overly distorted. It is recommended to enable this for long, thin volumes and keep it disabled for objects with an aspect ratio close to 1:1. Only applies to volumes.
- `Calculate Facing Direction`: When enabled, the actor is rotated to face away from the closest wall. If `Match Aspect Ratio` is also enabled, that takes precedence, and the rotation is constrained to a choice between two directions. Only applies to volumes.
- `Scaling Mode`: Determines how the actor is scaled to fit the scene volume. By default the actor is stretched to fit the size of the plane/volume, but in some cases this may not be desirable and can be customized here:
  - `Stretch`: Stretch each axis to exactly match the size of the plane/volume.
  - `UniformScaling`: Scale each axis by the same amount to maintain the correct aspect ratio.
  - `UniformXYScale`: Scale the X and Y axes uniformly; the Z scale can be different.
  - `NoScaling`: Don't perform any scaling.
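The four scaling modes can be sketched as per-axis scale factors mapping an actor's bounds onto a target volume's bounds. This is an illustrative helper, not the MRUK implementation; in particular, the assumption that the uniform modes fit the actor inside the target (taking the minimum factor) rather than filling it is mine:

```cpp
#include <algorithm>
#include <cassert>
#include <string>

struct Scale3 { double x, y, z; };

// actor extents (ax, ay, az) -> target extents (tx, ty, tz)
Scale3 ComputeScale(const std::string& mode, double ax, double ay, double az,
                    double tx, double ty, double tz) {
    double sx = tx / ax, sy = ty / ay, sz = tz / az;   // per-axis stretch factors
    if (mode == "Stretch")        return {sx, sy, sz};
    if (mode == "UniformScaling") { double s = std::min({sx, sy, sz}); return {s, s, s}; }
    if (mode == "UniformXYScale") { double s = std::min(sx, sy); return {s, s, sz}; }
    return {1.0, 1.0, 1.0};                            // NoScaling
}
```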
The `GenerateRandomPositionInRoom` function on `AMRUKRoom` provides a convenient way to find spawn positions for actors. It generates a uniformly distributed random position in the room; you can optionally specify how far away from the nearest surface it should be, and whether it should avoid generating points inside other scene anchor volumes.
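One common way to implement this kind of constrained uniform sampling is rejection sampling, shown below for a simple axis-aligned rectangular room (an illustrative sketch, not the MRUK implementation): draw uniform points in the room's bounds and reject any closer than the clearance to a wall:

```cpp
#include <algorithm>
#include <cassert>
#include <random>

struct Point { double x, y; };

// Uniform point in a width x depth room, at least minDistance from any wall.
Point RandomPositionInRoom(double width, double depth, double minDistance,
                           std::mt19937& rng) {
    std::uniform_real_distribution<double> ux(0.0, width), uy(0.0, depth);
    for (;;) {
        Point p{ux(rng), uy(rng)};
        double wallDist = std::min({p.x, width - p.x, p.y, depth - p.y});
        if (wallDist >= minDistance) return p;  // accept; otherwise resample
    }
}
```

Rejection keeps the accepted points uniformly distributed over the valid region; the same idea extends to arbitrary room polygons and to rejecting points inside scene volumes.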
The Debug Component is a convenient way to be able to visualize anchors in your app. It can be configured such that pointing and clicking on an Anchor will display its label(s), scale, collision point and a suggested Pose for placing MR content.
To use it, add the component to your Pawn class and hook up the inputs in Blueprint. You will need to call `ShowAnchorAtRayHit` or `ShowAnchorSpaceAtRayHit`, passing in an origin and direction, which can be obtained from one of the motion controllers when, for example, a button is pressed. When the button is released, call `HideAnchor` or `HideAnchorSpace`. An example of this setup can be found in the `BP_VRDemoPawn` event graph Blueprint in the sample app.
Creates a Guardian-like protective mesh that renders as a player approaches it, eventually showing Passthrough directly. This is helpful for safety if your scene is intended to be fully virtual instead of using Passthrough. The mesh is created using the `GenerateProceduralAnchorMesh` function on the anchors. Refer to `Guardian.umap` for an example of it being used.
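A distance-driven fade of this kind typically ramps opacity between two thresholds. The sketch below is an illustrative helper (my own thresholds and ramp, not the MRUK implementation):

```cpp
#include <cassert>
#include <cmath>

// 0 when farther than fadeStart from a surface, 1 when closer than fadeEnd,
// with a linear ramp in between. Past fadeEnd an app could switch to full
// Passthrough.
double GuardianOpacity(double distanceToSurface, double fadeStart, double fadeEnd) {
    if (distanceToSurface >= fadeStart) return 0.0;
    if (distanceToSurface <= fadeEnd) return 1.0;
    return (fadeStart - distanceToSurface) / (fadeStart - fadeEnd);
}
```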
A common alternative to real-time shadows is the blob shadow: a simple blot of color that only takes the general shape of the actor into account and is more performant. Blob shadows can be attached to any actor to create a shadow below it. This works with Passthrough as well.
The data of the point lights in the scene is sent to `MPC_Highlights` via `MRUKLightDispatcher`. Each point light is represented by three vectors: one for the light's position, one for the light's parameters (`AttenuationRadius`, `LightBrightness`, `LightFalloffExponent`, and `UseInverseSquaredFallOff`), and one for the light's color. Finally, the number of lights is sent in the scalar parameter `TotalLights`.
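Material parameter collection vectors are 4-component, so the parameters above fit in one vector each. A sketch of such a packing follows; the exact component layout is an assumption for illustration, not necessarily MRUK's:

```cpp
#include <array>
#include <cassert>

struct PackedLight {
    std::array<double, 4> position;  // xyz + padding
    std::array<double, 4> params;    // attenuation radius, brightness, falloff exponent, inverse-squared flag
    std::array<double, 4> color;     // rgb + padding
};

PackedLight PackPointLight(double x, double y, double z,
                           double attenuationRadius, double brightness,
                           double falloffExponent, bool inverseSquared,
                           double r, double g, double b) {
    return {{{x, y, z, 0.0}},
            {{attenuationRadius, brightness, falloffExponent, inverseSquared ? 1.0 : 0.0}},
            {{r, g, b, 0.0}}};
}
```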
`AMRUKDistanceMapGenerator` generates a distance map. A distance map is a texture that lets you obtain distances to scene objects, for example in materials for various visual effects. It works by defining two zones in the room: Zone A (the room's free space) and Zone B (outside the room or inside scene volumes). The generator then creates a texture where each pixel's red and green components store the coordinates of the closest pixel in Zone B, and the blue and alpha components store the coordinates of the closest pixel in Zone A. For convenience, a material function called `MF_DistanceMap` is provided that takes the distance map as input and calculates, for each pixel in the final image, the distance to the opposite zone. Distances for pixels in Zone A are positive, whereas those for pixels in Zone B are negative. Take a look at the material `M_Rainbow`, used in `DistanceMapSample`, to understand how the distance map can be used in materials.
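The signed two-zone distance can be illustrated with a small brute-force sketch over a boolean grid (not the MRUK implementation, which encodes closest-pixel coordinates in a texture and resolves the distance in the material):

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>
#include <vector>

// zoneA[y][x] marks free space inside the room. Returns the distance to the
// nearest pixel of the opposite zone: positive inside Zone A, negative in
// Zone B, matching the sign convention described above.
double SignedDistance(const std::vector<std::vector<bool>>& zoneA, int px, int py) {
    bool inA = zoneA[py][px];
    double best = 1e30;
    for (int y = 0; y < (int)zoneA.size(); ++y)
        for (int x = 0; x < (int)zoneA[y].size(); ++x)
            if (zoneA[y][x] != inA)  // pixel in the opposite zone
                best = std::min(best, std::hypot(x - px, y - py));
    return inA ? best : -best;
}
```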
`AMRUKDestructibleMeshSpawner`, the Destructible Global Mesh Spawner, is a feature designed to automatically generate destructible global meshes when new rooms are created. A destructible global mesh is a version of the global mesh that can be broken apart at runtime. The core functionality is managed by the `UDestructibleMeshComponent`. Upon initialization, this component performs a one-time preprocessing step that segments the global mesh into smaller chunks. These chunks can then be manipulated during gameplay, allowing dynamic interactions such as removal through raycasts, which simulates the global mesh breaking apart. To enhance the visual quality during destruction, such as when mesh chunks are removed, a particle system can be used to create realistic effects. Additionally, the system lets you define specific areas within the mesh that should remain non-destructible; this can be useful, for example, to keep the floor indestructible.
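One simple way to segment a triangle mesh into removable chunks is to bucket triangles into a uniform grid by centroid; removing a chunk then just means dropping one bucket. This sketch is illustrative only (the actual preprocessing in `UDestructibleMeshComponent` is not documented here):

```cpp
#include <array>
#include <cassert>
#include <cmath>
#include <cstddef>
#include <map>
#include <vector>

struct Tri { double cx, cy, cz; };  // triangle represented by its centroid

using ChunkKey = std::array<int, 3>;

// Group triangle indices into grid cells of the given size.
std::map<ChunkKey, std::vector<size_t>>
SegmentIntoChunks(const std::vector<Tri>& tris, double cellSize) {
    std::map<ChunkKey, std::vector<size_t>> chunks;
    for (size_t i = 0; i < tris.size(); ++i) {
        ChunkKey key = {(int)std::floor(tris[i].cx / cellSize),
                        (int)std::floor(tris[i].cy / cellSize),
                        (int)std::floor(tris[i].cz / cellSize)};
        chunks[key].push_back(i);  // all triangles in one cell form one chunk
    }
    return chunks;
}
```

A non-destructible region (such as the floor) could be honored by marking its cells as exempt from removal.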