The new locomotion sample in the Unity Sample Framework provides a system for moving an avatar through an environment, with support for linear movement, teleports and related input, and visual effects. An extendable set of input, aim, targeting, orientation, transition, and movement components work together to provide a broad set of locomotion configurations. These components provide logic for each of the stages of the teleport sequence, including target selection, landing orientation, and teleport effects. The sample includes a new map for testing movement in many common situations such as doorways, tunnels, steps, slopes, and ceiling interactions.
The system supports both linear movement and teleportation, which can work independently or together. For the Rift, where positional tracking is available for the HMD, supporting some amount of linear movement is required even if the application primarily uses teleportation for moving through the environment. This is because a person can move their body independently of the teleports by simply moving the HMD. For applications targeting mobile devices, a system that uses teleports may not need to support linear movement at all due to the lack of positional tracking on these devices.
One goal for this sample is to demonstrate an approach for supporting a variety of locomotion behaviors to better support different designs and player preferences. To this end, two control panels are included in the sample: Locomotion Presets and Teleport Tuning. The Locomotion Presets panel is the easiest way to test typical locomotion configurations and allows you to switch between different presets at runtime without changing code. To access the Locomotion Presets menu, start the sample and press the menu button on the left Touch controller.
Selecting a preset enables some of the teleport framework components, disables others, and adjusts various component values as needed. Note that changes to the lower-level tuning values in the Teleport Tuning panel can affect the preset behaviors; it may be necessary to restart the sample after tweaking these values for the presets to behave as intended.
The Teleport Tuning panel provides a more detailed list of options that can be useful for experimenting with the framework.
The options shown in the Teleport Tuning panel correspond to different options on the components attached to the LocomotionController object. The following is a quick summary of what these options do. Keep in mind that all of these options can also be set using the Inspector panel for the components on PlayerController/LocomotionController; the convenience of the panel is that this can be done without taking off the HMD, and it ensures that only one component of each type is active at a time.
- Input Handler section - Controls which TeleportInputHandler component is active.
  - Avatar Touch - Use the avatar hands for teleport aiming and touch buttons for control. This is the likely choice when Oculus Touch controllers are available.
  - Generic HMD - Use the forward vector of the HMD for teleport aiming and generic buttons for control. This provides functional input when using anything other than the Oculus Touch controllers.
- Aim Handler section - Controls which TeleportAimHandler component is active.
  - Laser - This component uses a straight line for aiming.
  - Parabolic - This component provides a parabolic arc for aiming. The arc shape can be tuned in the Inspector.
- Target Handler section - Controls which TeleportTargetHandler component is active.
  - Nav Mesh - Targets are restricted to areas that can be reached via the NavMesh. This allows the application to limit access to different areas using the same NavMesh logic that NPCs use for their navigation.
  - Teleport Node - Targets are restricted to game objects that contain a TeleportPoint component. To be detected by the targeting system, the game object's layer must match the Teleport Layer Mask set in the target handler component.
  - Geometry - Targets anything that is detected by the collision system, as limited by the Aim Collision Layer Mask.
- Orientation Handler section - Controls which TeleportOrientationHandler component is active.
  - 360 - When teleporting, do not adjust the orientation of the player at all. The aiming system will not render a direction indicator when aiming a teleport.
  - HMD - When teleporting, use the HMD to aim at the ground near the destination to define the landing orientation. This mode is not recommended; however, if the application does not require a gamepad or Oculus Touch controllers, it may be the only practical option.
  - Thumbstick Head Relative - When aiming the teleport, a thumbstick controls the direction of the landing using Head Relative mode, which means the arrow points to where the camera will face relative to the current HMD orientation.
  - Thumbstick Forward Facing - When aiming the teleport, a thumbstick controls the direction of the landing using Forward Facing mode, which means the arrow points in the forward direction as determined by the Oculus sensor setup. This allows you to point at and then teleport to a landmark; when the player orients to the landmark, they will be facing the direction they faced when they last told the Oculus Services to reset the view.
- Transition section - Controls which TeleportTransition component is active.
  - Instant - There is no effect during the teleport; the player instantly appears at the destination.
  - Blink - The screen fades to black, the teleport occurs, and the screen then fades back up. The timing can be adjusted in the Inspector.
  - Warp (no orientation) - The player translates to the destination over a period of time. The default time is very short, short enough to avoid discomfort for many people. Orientation changes are disabled for this transition because rotating during the movement is extremely uncomfortable and strongly discouraged.
- Enable Linear Motion During Teleport States and Enable Rotation During Teleport States - Toggles controlling when, or if, the player can move or rotate using the active input device. The purpose is to support different movement behaviors seen in various applications: some titles limit movement entirely to teleportation, some allow linear movement during aiming, and so on. These options allow linear movement to be enabled or disabled during any of the states of the teleport system. This is likely to be useful when controls must serve dual purposes; for instance, if the thumbsticks are sometimes used to control a context-specific feature or user interface, it would be useful to disable movement during that time.
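To make the idea concrete, the sketch below models per-state movement toggles as a simple component. The field and type names are illustrative stand-ins for the options exposed by the teleport system, not the sample's actual API.

```csharp
using UnityEngine;

// Illustrative per-state movement settings. The field names are stand-ins
// for the real component's options; the point is that movement can be
// allowed or blocked independently for each teleport state.
public class MovementDuringTeleport : MonoBehaviour
{
    // The teleport states, named after the sequence described later in this article.
    public enum MoveState { Ready, Aim, PreTeleport, Teleporting, PostTeleport }

    [Header("Enable linear motion during teleport states")]
    public bool duringReady = true;
    public bool duringAim = true;
    public bool duringPreTeleport = false;
    public bool duringTeleporting = false; // Usually off so the player lands exactly on target.
    public bool duringPostTeleport = true;

    // Example query a movement controller might call each frame.
    public bool IsMovementAllowed(MoveState state)
    {
        switch (state)
        {
            case MoveState.Ready:        return duringReady;
            case MoveState.Aim:          return duringAim;
            case MoveState.PreTeleport:  return duringPreTeleport;
            case MoveState.Teleporting:  return duringTeleporting;
            case MoveState.PostTeleport: return duringPostTeleport;
            default:                     return false;
        }
    }
}
```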
After exploring all the presets and the tuning panel, it can be useful to observe how these panel actions affect the teleport system components. Use the Unity Inspector to examine the PlayerController/LocomotionController game object and pay close attention to which components are enabled and which values change when selecting new presets. Most of the options have more information in their tooltips. The Locomotion Presets panel is created at runtime, and the logic that configures each preset can be found in LocomotionPresetsPanel.cs, which may be extended with new preset logic specific to your application if desired. If the existing framework components don't provide the necessary functionality, it should be fairly straightforward to modify one of the existing components while retaining the bulk of the functionality provided by the framework.
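For illustration, the sketch below mimics the core of what a preset does: enable exactly one component of each handler type and disable the alternatives. The class name and serialized fields are hypothetical; in the sample, the equivalent logic lives in LocomotionPresetsPanel.cs.

```csharp
using UnityEngine;

// Hypothetical helper that mimics what the preset panel does: enable one
// component of each handler type and disable its alternatives, ensuring
// only one component of each type is active at a time.
public class TeleportPresetApplier : MonoBehaviour
{
    // Assign these in the Inspector from the LocomotionController object.
    [SerializeField] private MonoBehaviour[] inputHandlers; // e.g. Avatar Touch and Generic HMD handlers.
    [SerializeField] private MonoBehaviour[] aimHandlers;   // e.g. Laser and Parabolic handlers.

    // Apply a preset by choosing which handler of each type should be active.
    public void ApplyPreset(int inputIndex, int aimIndex)
    {
        SelectOne(inputHandlers, inputIndex);
        SelectOne(aimHandlers, aimIndex);
    }

    // Enable the handler at the chosen index and disable the rest.
    private static void SelectOne(MonoBehaviour[] handlers, int index)
    {
        for (int i = 0; i < handlers.Length; i++)
        {
            handlers[i].enabled = (i == index);
        }
    }
}
```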
Teleport System State Machine
The OVRLocomotionTeleport component is a core piece of the teleport system framework. Nearly all teleport-related components interact with it to respond to events and access shared functionality. It also manages the teleport state machine, a specialized state machine tuned to the needs of sequencing a person's teleport through the environment.
The diagram below illustrates the teleport system states and the states each one can transition to when the necessary triggers occur:
Each teleport state specifies whether linear movement is enabled while that state is active, and the application configures these settings as the design requires. For instance, player-controlled movement is usually disabled during the Teleporting state to ensure the player doesn't move during the transition and appears at the correct location when the teleport completes.
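To make the pattern concrete, here is a toy coroutine-driven version of the sequencing described in this section. It omits the cancel states and events of the real system, but the control flow has the same shape.

```csharp
using System.Collections;
using UnityEngine;

// Toy coroutine state machine illustrating the teleport sequencing. Each
// phase is a loop inside a single coroutine; the real system is richer
// (cancel states, events, handlers), but the control flow is similar.
public class MiniTeleportSequence : MonoBehaviour
{
    private void Start()
    {
        StartCoroutine(Run());
    }

    private IEnumerator Run()
    {
        while (true)
        {
            // Ready: wait for the "begin aiming" trigger.
            while (!Input.GetButtonDown("Fire1"))
                yield return null;

            // Aim: keep aiming while the button is held.
            while (Input.GetButton("Fire1"))
                yield return null;

            // Teleporting: the transition (instant, blink, or warp) runs here.
            yield return new WaitForSeconds(0.1f);

            // PostTeleport: trigger side effects, then return to Ready.
            yield return null;
        }
    }
}
```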
State Machine Walkthrough
The teleport system changes from the Ready state to the Aim state when the correct input event occurs. When the Aim state activates, the Target Handler begins using data provided by the Aim Handler to determine where the user intends to teleport. The sample includes variations of Target Handlers that allow the application to limit teleport targets to physical locations defined by collision bounds, navigation meshes, or game objects configured with TeleportPoint components. Teleport targeting can also verify that the player can actually fit at the targeted location.
The sample includes two types of Aim Handlers that allow the application to choose between a line-of-sight "laser" and a parabolic arc for target selection. Because these are modular components, it is possible to create new Aim Handlers (or variants of any of the other components) that behave in ways that meet the needs of the design. For instance, an Aim Handler could use the position of a thrown object so players always teleport to wherever they happened to throw it. Another possibility is an Aim Handler that always selects the next node in a fixed sequence, resulting in a carefully scripted progression; a sketch of this idea follows.
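The component below sketches the scripted-progression idea: it always aims at the next node in a fixed list. It assumes the targeting system pulls aim points from the active aim handler, as the laser and parabolic handlers do; the GetPoints and AdvanceSequence names are illustrative rather than the sample's exact API.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Illustrative custom aim handler: always aims at the next node in a fixed
// sequence, producing a scripted progression through the level.
public class SequenceAimHandler : MonoBehaviour
{
    [SerializeField] private Transform[] nodes; // Teleport nodes, in order.

    private int nextNode;

    // Called by the targeting system to fetch the aim points for this frame.
    public void GetPoints(List<Vector3> points)
    {
        if (nodes == null || nodes.Length == 0) return;

        // A straight segment from the player to the next scripted node.
        points.Add(transform.position);
        points.Add(nodes[nextNode].position);
    }

    // Call after a successful teleport to advance the scripted sequence.
    public void AdvanceSequence()
    {
        nextNode = (nextNode + 1) % nodes.Length;
    }
}
```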
While the teleport system is in the Aim state, the user needs to see where they are aiming. The Aim Visual (OVRTeleportAimVisual) object provides this functionality and is responsible for rendering all the visual and sound feedback related to pointing at a target. The Aim Visual monitors the state of the Target Handler in order to indicate when the target is valid and changes the effects accordingly. In the sample, it simply changes the color of the targeting beam.
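In the same spirit as the sample's color change, a minimal aim visual might tint a LineRenderer according to target validity. This is a sketch; how validity is obtained from the active target handler is framework-specific, so it is exposed here as a simple property.

```csharp
using UnityEngine;

// Minimal aim visual sketch: tint the targeting beam by target validity,
// as the sample does. How validity is supplied is framework-specific; here
// it is a property set each frame by the targeting code.
[RequireComponent(typeof(LineRenderer))]
public class SimpleAimVisual : MonoBehaviour
{
    [SerializeField] private Color validColor = Color.green;
    [SerializeField] private Color invalidColor = Color.red;

    public bool IsValidTarget { get; set; } // Set each frame by the targeting code.

    private LineRenderer beam;

    private void Awake()
    {
        beam = GetComponent<LineRenderer>();
    }

    private void Update()
    {
        Color c = IsValidTarget ? validColor : invalidColor;
        beam.startColor = c;
        beam.endColor = c;
    }
}
```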
When a teleport target is available, the teleport system activates a prefab with an OVRTeleportDestination component to indicate where the target is and whether the destination is valid. For targeting types that support arbitrary locations, the system updates a single teleport destination prefab, repositioning it to show the current target as the player scans across the environment. When using teleport point targeting, the system activates a teleport destination prefab when the player targets a node. When a teleport destination is no longer required (because the target has become invalid, the target was deselected, or the teleport has occurred), the destination is deactivated. It is important to understand, however, that the teleport destination component controls when it actually deactivates; this allows it to trigger animations specific to the deactivation event, and it is then responsible for recycling the game object when the effects have finished. To support this behavior efficiently, the OVRLocomotionTeleport component provides object-recycling virtual methods that the application should override to use an object pooling system, because the default implementation simply creates new instances as needed.
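The sketch below shows what an object-pooled version of those create/recycle hooks could look like. The class and method names are stand-ins; the sample's actual virtual method signatures may differ.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Illustrative destination pool: instead of instantiating a new destination
// marker for every teleport target, reuse deactivated instances.
public class PooledDestinationFactory : MonoBehaviour
{
    [SerializeField] private GameObject destinationPrefab;

    private readonly Stack<GameObject> pool = new Stack<GameObject>();

    // Override-point analog: fetch a destination, reusing a pooled one if possible.
    public GameObject CreateDestination()
    {
        GameObject go = pool.Count > 0 ? pool.Pop() : Instantiate(destinationPrefab);
        go.SetActive(true);
        return go;
    }

    // Override-point analog: called when the destination's deactivation
    // effects have finished and the object can be reused.
    public void RecycleDestination(GameObject go)
    {
        go.SetActive(false);
        pool.Push(go);
    }
}
```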
The Orientation Handler is responsible for choosing the post-teleport landing orientation and providing this data to the teleport destination prior to the teleport, so the user can see where they will be facing after the teleport. Options include aligning to a thumbstick direction, aligning to a location indicated by the HMD, or making no orientation adjustment at all. The Orientation Handler can operate during both the Aim and Pre-Teleport states to allow the most control during teleports. However, if the design requires choosing the landing orientation after selecting the target location, it is possible to restrict the orientation logic to the Pre-Teleport state.
If the user cancels the pending teleport while in the Aim or Pre-Teleport state, the teleport system switches to the Cancel Aim or Cancel Teleport state, respectively. Both serve the same purpose: to provide an opportunity to clean up any side effects of the Aim or Pre-Teleport states, and to trigger feedback to the user such as a "cancel" sound or visual effect.
When the teleport activates, the teleport system enters the Teleporting state, which can trigger visual effects related to the transition, such as fading the screen or quickly warping the camera to the destination. Linear motion is usually disabled during this state so that when the teleport completes, the camera is at the intended location regardless of player input during the elapsed time. The active OVRTeleportTransition is responsible for controlling how long the system remains in the Teleporting state.
When the transition completes, the system will enter the Post-Teleport state. The purpose of this state is to trigger any side effects of the teleport that may be necessary, such as scripted events and visual effects. If the application supports linear movement, this is when it will usually be re-enabled. Typically, the teleport system will quickly leave the Post-Teleport state and return to the Ready state.
Teleport Framework Components
The LocomotionTeleport class is responsible for managing the teleport system components, the state machine, and related settings. It is best described as an event-driven system that implements the state machine with coroutines. The various teleport support components attach handlers to the events they care about. This is a core design feature of the system that allows different teleport behaviors to be activated at any time based on design needs or user preferences.
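As a sketch of this event-driven pattern, the pair of classes below shows a support component attaching and detaching handlers as it is enabled and disabled. The event source (TeleportEvents) and event names are stand-ins for the real teleport component's events.

```csharp
using System;
using UnityEngine;

// Stand-in for the teleport component's state events; the real system
// raises similar events as the state machine advances.
public class TeleportEvents : MonoBehaviour
{
    public event Action EnterStateAim;
    public event Action ExitStateAim;

    public void RaiseEnterAim() { EnterStateAim?.Invoke(); }
    public void RaiseExitAim()  { ExitStateAim?.Invoke(); }
}

// Illustrative support component: attaches handlers to the events it cares
// about while enabled, so behaviors can be swapped on and off at runtime.
public class AimSoundPlayer : MonoBehaviour
{
    [SerializeField] private TeleportEvents teleport; // The event source.
    [SerializeField] private AudioSource aimLoop;     // Looping "aiming" sound.

    private void OnEnable()
    {
        teleport.EnterStateAim += HandleEnterAim;
        teleport.ExitStateAim += HandleExitAim;
    }

    private void OnDisable()
    {
        teleport.EnterStateAim -= HandleEnterAim;
        teleport.ExitStateAim -= HandleExitAim;
    }

    private void HandleEnterAim() { aimLoop.Play(); }
    private void HandleExitAim()  { aimLoop.Stop(); }
}
```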
The diagram below shows the relationships between the teleport support components. If your application needs behavior that is not supported by these existing components, you may find it useful to create a new component that fits within this graph somewhere.
Character Controller and Camera Improvements
The OVRPlayerController now supports moving the character controller to match the position of the HMD, which substantially improves the accuracy of the physical simulation of the player moving through the world. Without position matching, the character controller capsule usually will not match the camera's position, which causes problems for any application that requires the character to collide with world geometry. For example, a person might think they are about to walk across a narrow bridge because their camera is in the correct place, but the invisible character controller might be offset far enough that it walks right off the edge. This problem doesn't occur when the character controller moves in response to the movement of the HMD.
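A minimal sketch of the position-matching idea follows: each frame, move the character controller horizontally toward the camera. The component and field names are illustrative, not the OVRPlayerController implementation.

```csharp
using UnityEngine;

// Minimal sketch of HMD position matching: each frame, try to move the
// CharacterController horizontally so it lines up with the camera (HMD).
// Using Move() lets the physics system stop the capsule at walls, which is
// what makes the divergence detection described above possible.
[RequireComponent(typeof(CharacterController))]
public class HmdPositionMatcher : MonoBehaviour
{
    [SerializeField] private Transform hmd; // Typically the center eye anchor.

    private CharacterController controller;

    private void Awake()
    {
        controller = GetComponent<CharacterController>();
    }

    private void Update()
    {
        // Horizontal offset from the capsule to the HMD.
        Vector3 delta = hmd.position - transform.position;
        delta.y = 0f;

        // Move() respects collisions, so the capsule stops at geometry even
        // if the HMD keeps moving through a virtual wall.
        controller.Move(delta);
    }
}
```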
Since the character controller now matches the HMD's position, this creates a new problem: what happens when the character attempts to move into a wall because the player moved the HMD into a virtual wall? The answer lies in the fact that the controller is only attempting to match positions; in this case it fails to reach the goal, and the character controller no longer matches the HMD position. The system can detect this disparity: the OVRCharacterCameraConstraint component attached to the OVRPlayerController determines how far the character is from the goal position and fades the display using rules that hide the scene before the camera clips the world geometry. The result is that the camera simply fades out when the player puts the HMD through a virtual wall. The character controller remains at the last valid position, and the controls operate as normal, allowing the player to back out of the geometry. The character camera constraint also supports a Camera Collision mode, which prevents the camera from ever penetrating colliding geometry. This is not the recommended behavior, because most people find this approach less comfortable than leaving the camera under the user's control and fading out when collisions occur; however, there may be situations that benefit from this mode.
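As a sketch of such a fade rule, the component below drives a fade level from the horizontal divergence between the camera and the character controller. SetFadeLevel is a hypothetical hook for whatever screen-fade mechanism the application uses.

```csharp
using UnityEngine;

// Illustrative fade rule: fade the view in proportion to how far the camera
// has diverged from the character controller, reaching full black before
// the camera could clip into geometry.
public class CameraDivergenceFader : MonoBehaviour
{
    [SerializeField] private Transform hmd;          // Camera / HMD transform.
    [SerializeField] private Transform controller;   // Character controller transform.
    [SerializeField] private float fadeStart = 0.1f; // Divergence where fading begins (meters).
    [SerializeField] private float fadeEnd = 0.3f;   // Divergence where the view is fully black.

    private void LateUpdate()
    {
        Vector3 offset = hmd.position - controller.position;
        offset.y = 0f; // Only horizontal divergence matters here.

        float fade = Mathf.InverseLerp(fadeStart, fadeEnd, offset.magnitude);
        SetFadeLevel(fade);
    }

    // Hypothetical hook: route this to your screen-fade implementation,
    // e.g. the alpha of a full-screen black quad or a post effect.
    private void SetFadeLevel(float level)
    {
    }
}
```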
Another optional feature is dynamic adjustment of the character controller's height based on the height of the HMD above the physical ground. This enables designs that require the player to duck in order to move below low ceilings. If the player stands up while geometry is above them, the system detects this and does not increase the height of the character controller; instead, it fades out the view or prevents camera movement in the same way as when the user moves the HMD into other geometry. Because the character controller never grows taller than the available space, the player can continue to move freely. Importantly, the character controller will not grow into a state where it overlaps the world geometry; allowing that would cause instability in the physics system while also preventing the player from moving as intended.
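A simplified sketch of the height-adjustment rule follows. It assumes the player object's origin sits at the feet, and the names are illustrative rather than the sample's implementation.

```csharp
using UnityEngine;

// Simplified dynamic height adjustment: grow the capsule toward the HMD
// height, but clamp it to the headroom above the player so the capsule
// never overlaps geometry. Assumes the player object's origin is at the feet.
[RequireComponent(typeof(CharacterController))]
public class ControllerHeightAdjuster : MonoBehaviour
{
    [SerializeField] private Transform hmd;         // HMD / center eye transform.
    [SerializeField] private LayerMask ceilingMask; // Geometry the capsule must not grow into.

    private CharacterController controller;

    private void Awake()
    {
        controller = GetComponent<CharacterController>();
    }

    private void Update()
    {
        // Desired capsule height: put the top of the capsule at the HMD.
        float desired = hmd.position.y - transform.position.y;

        // Clamp to the headroom above the feet so the capsule never grows
        // into overlapping geometry above the player.
        RaycastHit hit;
        if (Physics.Raycast(transform.position, Vector3.up, out hit, desired, ceilingMask))
        {
            desired = Mathf.Min(desired, hit.distance);
        }

        float height = Mathf.Max(desired, controller.radius * 2f);
        controller.height = height;
        controller.center = new Vector3(0f, height * 0.5f, 0f); // Keep the feet planted.
    }
}
```

You can experiment with the camera constraint modes at runtime in the Camera Constraints menu panel: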
Input Handlers and Platforms
While this sample is focused on the Rift with Touch controllers, it should be possible to support Gear VR and Oculus Go controllers in the future because most (if not all) logic specific to each platform is handled by a component derived from OVRTeleportInputHandler. The sample primarily uses OVRTeleportInputHandlerAvatarTouch for input because most people today interact with the Oculus Rift using the Touch controllers. The OVRTeleportInputHandlerHMD component provides support for gamepad- and trackpad-based input schemes by using the HMD to provide aim data and mapping input to standard buttons.
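To make the division of responsibilities concrete, here is a minimal sketch of an HMD-based input source in the spirit of OVRTeleportInputHandlerHMD. The class and method names (HmdAimInput, GetAimRay, GetTeleportPressed) are illustrative, not the sample's API.

```csharp
using UnityEngine;

// Illustrative HMD-based input source: aim along the HMD's forward vector
// and map a generic button to the teleport action. The method names mirror
// the responsibilities described above, not the sample's exact API.
public class HmdAimInput : MonoBehaviour
{
    [SerializeField] private Transform hmd; // Center eye / HMD transform.

    // Supply the aim ray: from the HMD, along its forward vector.
    public Ray GetAimRay()
    {
        return new Ray(hmd.position, hmd.forward);
    }

    // Map a generic button to "begin/confirm teleport".
    public bool GetTeleportPressed()
    {
        return Input.GetButtonDown("Fire1"); // Any standard mapped button works.
    }
}
```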
Integrating the Teleport Framework
We hope that this framework will provide a useful starting point for developers who need a teleport system for their project. For developers already using the Oculus Unity Sample Framework, it should be a straightforward process to migrate the PlayerController hierarchy found in the TeleportAvatar scene into their own project in order to gain all the functionality provided by this framework. User preferences for locomotion can be implemented by adjusting component settings of a single copy of the LocomotionController that exists within this sample, or by duplicating the LocomotionController once for each locomotion behavior the user will be able to choose from. With the exception of the placeholder art and effects, the provided framework will meet the locomotion needs of many projects. We look forward to seeing your feedback!