TL;DR: We’ve been exploring experimental techniques for comfortable locomotion in VR. This has involved research in a number of different directions - some are departures from existing ideas, some are new and unusual. To this end, we have shipped the source code for a sample application exploring some of these techniques with the Oculus SDK for Windows, under Samples (Experimental) → Locomotion, so you can experience the ideas for yourself. Most illustrated techniques provide adjustable parameters, allowing you to experiment with key values, assess these new avenues, and explore how you might use them, or variants of them, to pursue more comfortable VR locomotion. In this post, we'll take a quick look at some of the approaches explored by the sample application. For additional research into VR locomotion, check out Part 1 and Part 2 of the VR Locomotion Developer Perspective with Crytek.
WHY RESEARCH LOCOMOTION
Our brains are programmed to react strongly to mismatches between what our eyes see and what our vestibular system feels. This can manifest as discomfort, even a sense of being off-balance, and varies from one individual to another. That is problematic for VR apps that seek to allow the kinds of movement and locomotion that are commonplace in non-VR apps. Thus, we are interested in exploring techniques that mitigate or eliminate the undesirable effects of linear acceleration or rotation perceived in a VR app.
PRESENTATION OF THE RESEARCH
When it comes to VR, there is a benefit in actually experiencing the subject matter. To that end, our research is presented to developers as a functional VR app, so devs can quickly and easily gain insight into the techniques and ascertain what they might use in their products. Having experienced a technique, a developer may also wish to immediately modify or augment what they see – again, an efficient way of exploring the solution space of comfortable locomotion more deeply. In our application we provide the full source code to allow developers to step through the techniques, with explanatory notes and useful variations. The source code is laid out in an intuitive fashion to allow developers to easily make changes. The PC SDK ships with a number of samples, notably a suite of “Oculus Room Tiny” samples.
You can find this research within the ‘Samples (Experimental)’ section, entitled "Locomotion".
THE GENERAL THINKING BEHIND THE EXPERIMENTAL TECHNIQUES
With one exception, the general theory in VR locomotion is to avoid the conflict by convincing the brain that the user is not in fact moving - that, essentially, their senses are correct. Sufficient visual cues need to be present to reinforce the senses' perception of the user's actual motion. These cues take a variety of forms.
Conversely, one additional technique seeks to convince your vestibular sense that motion is taking place, by providing visuals consistent with head-tilting.
It should be noted that many of the constructs are somewhat contrived, and might undermine immersion by their very inclusion - though a decrease in immersion sometimes promotes a more comfortable experience. Additionally, it might be helpful for developers to bear in mind that any ‘intrusive’ element introduced to provide comfort may be gradually scaled back as the player becomes more accustomed, tolerant, or acclimatized - this would need to be the subject of further testing. Most techniques are very readily and naturally ‘phased out’, and so they may well serve as ‘training wheels’ that do not ultimately preclude the full immersion of the VR experience. For brevity, this blog contains only an introduction to the techniques.
EMERGING STATIC IN THE PERIPHERY
A head-locked, emerging scene that suggests no motion
This supplies the user with a graphical representation of a static VR world, one that has no motion other than the user's actual physical movement. This static world is revealed at the periphery, and only during locomotion that would otherwise create a mismatch between visuals and senses. The user is able to register a confirmation of their senses while simultaneously perceiving the full locomotion of the app. This is an enhancement to the fading-to-black of the periphery that has been seen in some applications. The reason this is an improvement is that the peripheral environment gives your brain evidence that your body is in a stable, non-moving world. When you fade to black, your brain has no reason to believe your body is in a stable environment (i.e., you're just seeing the world through an aperture), so the interpretation of what that black area represents is ambiguous. Seeing the stable environment disambiguates the stimulus and tells your brain that you're in a stable environment with this moving area in the middle of it.
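As a rough illustration of how such an effect might be driven, here is a minimal C++ sketch with entirely illustrative names and thresholds (not the sample's actual code) that shrinks the central 'moving world' window as the speed of artificial locomotion rises, revealing the head-locked static scene in the periphery:

```cpp
#include <algorithm>
#include <cmath>

// Returns the radius (0..1, in normalized view coordinates) of the central
// "moving world" region; everything outside it shows the static scene.
// comfortableSpeed and maxSpeed are illustrative tuning parameters.
float MovingWorldRadius(float virtualSpeed,      // m/s of artificial locomotion
                        float comfortableSpeed,  // below this, no static periphery is shown
                        float maxSpeed)          // at this speed the static periphery is widest
{
    if (virtualSpeed <= comfortableSpeed)
        return 1.0f;                             // the moving world fills the view

    float t = (virtualSpeed - comfortableSpeed) / (maxSpeed - comfortableSpeed);
    t = std::clamp(t, 0.0f, 1.0f);

    // Smoothstep easing avoids a sudden pop; at full speed a generous
    // central window (radius 0.4) is still left for the moving world.
    return 1.0f - 0.6f * (t * t * (3.0f - 2.0f * t));
}
```

Thresholds and easing curves like these are exactly the kind of adjustable parameters worth experimenting with in the sample.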
WINDOW INTO THE MOVING WORLD
Leveraging a familiar setting, constant visuals of static world
This illustration continues with the theme of visually confirming the user’s lack of actual motion through the virtual space. The content of interest is displayed in the application in a suspended overlay or quad set against an apparently static environment or background - in effect, one or more 'TVs'! This technique seeks to leverage the already familiar experience of watching moving action on a screen, with the 'actual' world visible behind. It seems to be particularly important to make the depth of the ‘in-screen’ world coincident with the screen, so a real-world parallel is achieved, promoting the familiar comfort one would expect from such a setup.
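One straightforward way to keep the in-screen world coincident with the screen is to render it monoscopically and show the same image to both eyes on the quad. Below is a structural C++ sketch of that per-frame flow; the types and helpers (RenderMovingWorldMono, RenderStaticRoom, RenderScreenQuad) are hypothetical placeholders, not the sample's API:

```cpp
// Per-frame structure for the "window into the moving world" idea.
struct Texture;                                      // render target holding the "TV" content
struct Pose { float px, py, pz, qx, qy, qz, qw; };   // position + orientation

Texture* RenderMovingWorldMono(const Pose& virtualCamera);   // one shared, monoscopic view
void     RenderStaticRoom(int eye, const Pose& headPose);    // stereo, head-tracked room
void     RenderScreenQuad(int eye, const Pose& headPose, Texture* tv);

void RenderFrame(const Pose& headPose, const Pose& virtualCamera)
{
    // Rendering the moving content once, with no stereo disparity of its own,
    // keeps its apparent depth coincident with the quad it is shown on.
    Texture* tv = RenderMovingWorldMono(virtualCamera);

    for (int eye = 0; eye < 2; ++eye) {
        RenderStaticRoom(eye, headPose);       // stable surroundings: the "living room"
        RenderScreenQuad(eye, headPose, tv);   // the "TV" showing the locomotion
    }
}
```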
PORTALS INTO THE STATIC WORLD
Planting constant visuals in the application’s 3D world itself
Many manifestations of static backgrounds exist outside of the VR world itself. In order to create a more cohesive and immersive representation, the static world is glimpsed only through portals that actually exist and are plotted in the application’s 3D world itself. Immersion is less affected, and the application can now control just how intrusive or grounding the static world is. Furthermore, as the user moves through the world, the interface does, too. This concept is somewhat difficult to describe (I know, I’ve tried!) and I highly recommend trying it for yourself - then it will be immediately apparent how it works.
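A common way to realize such portals is with a stencil buffer. The following is a minimal OpenGL-flavored C++ sketch, illustrative only: DrawPortalQuad, DrawStaticWorld, and DrawMovingWorld are hypothetical helpers, and the sample's own implementation may differ. It assumes an active context and buffers (including stencil) cleared at the start of the frame:

```cpp
#include <GL/gl.h>

void DrawPortalQuad();   // portal geometry placed within the moving world
void DrawStaticWorld();  // the stationary, grounding environment
void DrawMovingWorld();  // the normal application scene

void RenderWithPortals()
{
    // 1) Mark the portal's pixels in the stencil buffer (no color/depth writes).
    glEnable(GL_STENCIL_TEST);
    glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);
    glDepthMask(GL_FALSE);
    glStencilFunc(GL_ALWAYS, 1, 0xFF);
    glStencilOp(GL_KEEP, GL_KEEP, GL_REPLACE);
    DrawPortalQuad();

    // 2) Draw the static world only where the portal was marked.
    glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);
    glDepthMask(GL_TRUE);
    glStencilFunc(GL_EQUAL, 1, 0xFF);
    glStencilOp(GL_KEEP, GL_KEEP, GL_KEEP);
    DrawStaticWorld();

    // 3) Draw the moving application world everywhere else.
    glStencilFunc(GL_NOTEQUAL, 1, 0xFF);
    DrawMovingWorld();

    glDisable(GL_STENCIL_TEST);
}
```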
SKI-POLE WORLD MANIPULATION
Adjusting the physical perception of the 3D world to promote a sense of stasis
Here we again promote the illusion that you aren’t moving - this time by reinforcing the sense that the world is moving under you, and you are not moving atop the world! The user effectively ‘poles’ or grapples through the world, forcing it to move beneath them, as if it were mobile. Similar effects are seen in products such as 'The Climb' or 'Lone Echo', which instead use hands to seemingly 'pull' the world under you. In our research, rotation of the world is also achieved by twisting the controllers.
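The core bookkeeping of 'poling' is small. Here is a minimal, self-contained C++ sketch with illustrative names (translation only; the sample also rotates the world when the controllers are twisted): while the grip is held, the world is dragged by the controller's motion, so the hand appears anchored to the world and the world slides beneath you.

```cpp
struct Vec3 { float x, y, z; };

static Vec3 Sub(Vec3 a, Vec3 b) { return { a.x - b.x, a.y - b.y, a.z - b.z }; }
static Vec3 Add(Vec3 a, Vec3 b) { return { a.x + b.x, a.y + b.y, a.z + b.z }; }

struct PoleState {
    bool wasGripping = false;
    Vec3 lastHandPos = {0, 0, 0};   // controller position in tracking space
    Vec3 worldOffset = {0, 0, 0};   // accumulated translation applied to the world
};

// Call once per frame with the current controller position and grip state.
void UpdatePoling(PoleState& s, Vec3 handPos, bool gripping)
{
    if (gripping && s.wasGripping) {
        // Translate the world by the hand's motion since last frame, so the
        // grabbed point of the world stays pinned to the hand ("grappling").
        Vec3 delta = Sub(handPos, s.lastHandPos);
        s.worldOffset = Add(s.worldOffset, delta);
    }
    s.lastHandPos = handPos;
    s.wasGripping = gripping;
}
```

Pulling the hand toward you therefore drags the world toward you, which reads as you advancing through it, with the world, rather than your body, doing the moving.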
PROCESSES REDUCING RELEVANCE OF MISMATCH
Whereby we encourage the brain to disregard, or diminish, its trusted link between eyes and ears
Another approach, admittedly in its infancy, is the concept of diminishing the link between sense of motion and visuals, such that the mismatch has less impact. If we (temporarily) supply scenarios and visuals such that the user experiences no conflict from locomotion, then we might expand the range of possible apps. We might even be able to associate an art style, or an app’s graphics, with a more established and enduring sense of comfort, even with subsequent locomotion, so that users feel less discomfort simply because they're used to certain visual cues correlating with a more comfortable experience. We have attempted some simple early experiments in desensitizing the player with chaotic-yet-hopefully-comfortable images, with the hope that the scenario is too unbelievable to yield discomfort. We also explore whether the extreme movement of counter images might enable the user to consider the motion less relevant, even non-existent.
UNREAL WORLD BEYOND STATIC COCKPIT
Attempting to make a foreground promote no motion
The presence of a static VR world is most effective when it is in the background, i.e., its depth is beyond that of the main VR visuals. This promotes our sense that the most distant graphics are those of the real world, with any foreground graphics being deemed to be mobile, and less important as a reference point for judging true movement. Hence, when the foreground is static, we are inclined to believe that the VR world is static, and we are moving relative to it. In the general case, the ‘static’ foreground is simply deemed to be moving with us. This experiment seeks to promote the foreground to reality, and thus a source of grounding, by reducing how real the distant, moving world feels. There are a number of ways of achieving this diminishing of reality - here we artificially create the wrong distortion for the moving VR world while applying the correct distortion for the foreground cockpit.
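As one simple, purely illustrative interpretation of that idea (not necessarily how the sample implements it), the cockpit could be rendered with the correct per-eye projection while the moving world beyond is given a deliberately mismatched one:

```cpp
#include <cmath>

// Scale factors of a simple symmetric perspective projection.
struct Proj { float xScale, yScale; };

Proj MakeProjection(float verticalFovRadians, float aspect)
{
    float yScale = 1.0f / std::tan(verticalFovRadians * 0.5f);
    return { yScale / aspect, yScale };
}

// Hypothetical helper: build the two projections used for the two render passes.
void BuildLayerProjections(float correctVFov, float aspect,
                           Proj& cockpitProj, Proj& movingWorldProj)
{
    cockpitProj     = MakeProjection(correctVFov, aspect);          // true HMD optics
    movingWorldProj = MakeProjection(correctVFov * 0.85f, aspect);  // deliberately "wrong"
}
```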
COUNTER OPTIC FLOW
Promoting a sense of stasis with visuals that counter the movement of the 3D world
Another approach to eliminating the apparent locomotion is to provide equal and opposite visuals to those generated by the locomotion. Hence, while the VR world is moving, a second version of that world is overlaid and forced to move in the opposite direction. It is tinted a different color to avoid confusion over which world the user is operating in.
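A minimal, self-contained C++ sketch of the bookkeeping (illustrative names only): if the counter copy is translated by twice the player's accumulated virtual displacement, then relative to the player it flows in the opposite direction at the same speed as the main world.

```cpp
struct Vec3 { float x, y, z; };

struct CounterFlowState {
    Vec3 playerVirtualPos = {0, 0, 0};  // accumulated artificial locomotion
    Vec3 counterWorldPos  = {0, 0, 0};  // translation applied to the tinted copy
};

// Call once per frame with this frame's artificial locomotion delta.
void UpdateCounterFlow(CounterFlowState& s, Vec3 locomotionDelta)
{
    s.playerVirtualPos.x += locomotionDelta.x;
    s.playerVirtualPos.y += locomotionDelta.y;
    s.playerVirtualPos.z += locomotionDelta.z;

    // Main world: flows backward past the player as they advance.
    // Counter copy: translated by 2x the player's displacement, so relative to
    // the player (geometry + 2P - P = geometry + P) it flows forward at the
    // same speed, opposing the main world's optic flow.
    s.counterWorldPos.x = 2.0f * s.playerVirtualPos.x;
    s.counterWorldPos.y = 2.0f * s.playerVirtualPos.y;
    s.counterWorldPos.z = 2.0f * s.playerVirtualPos.z;
}
```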
ARTIFICIAL TILT
Implying actual motion, rather than promoting a sense of stasis
And finally, we try for an all-encompassing solution that departs from any notion of stasis. This one embraces the locomotion perceived by the visuals, but seeks to artificially align a component of the force of gravity to that of the acceleration of the locomotion. It isn’t exact, but it’s a representation that may well be just enough to convince the brain that true acceleration is being experienced. It does, however, pretend that the head has realigned rotationally, which it hasn’t, and that unrepresented component lends a degree of discomfort. Another prime example of trying and seeing for yourself.
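The underlying arithmetic is the same trick used by motion-platform simulators: pitch the view so that a component of real gravity lines up with the virtual forward acceleration. Here is a minimal C++ sketch, with an illustrative clamp value (not the sample's exact math):

```cpp
#include <algorithm>
#include <cmath>

// Returns the artificial pitch (radians) to apply to the rendered view, given
// the current forward acceleration of artificial locomotion in m/s^2.
float ArtificialTiltPitch(float forwardAccel)
{
    const float g = 9.81f;                       // gravitational acceleration, m/s^2
    float pitch = std::atan2(forwardAccel, g);   // tilt whose gravity component "explains" the acceleration
    const float maxPitch = 0.35f;                // ~20 degrees: keep the cheat subtle
    return std::clamp(pitch, -maxPitch, maxPitch);
}
```

As the text notes, the head has not really rotated, so the unrepresented rotational component is itself a potential source of discomfort; clamping the tilt is one way to keep that in check.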
All of these examples are available with the Oculus SDK for Windows, under Samples (Experimental) → Locomotion. Try them out, and if you have any comments, let us know what you think on our Developer Forum.
More research into locomotion is on the horizon!