At OC5, attendees got to try out prototype technologies under development at Oculus—arena-scale tracking, mixed reality (MR), and co-location—with the new Dead and Buried Arena demo.
Players teamed up with five others in a shared game environment through co-location, which lets multiple players occupy the same physical space. They played in an environment where the real world blends seamlessly with the virtual world through MR. And they did it on a massive 80x60-foot play area through arena-scale tracking.
If you couldn't make it to OC5, here's a bit more about how these technologies work in the Dead and Buried Arena demo and on some of our newest hardware.
Arena-Scale
Earlier today, we announced Oculus Quest, featuring our best-in-class inside-out tracking solution called Oculus Insight. This technology works by detecting thousands of points of interest in the environment to deliver a greater sense of immersion, presence, and mobility. For the Dead and Buried Arena demo, we prototyped arena-scale technology on top of Oculus Insight, pushing inside-out tracking beyond its current limitations.
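To make "points of interest" concrete, here's a minimal sketch of feature detection using OpenCV's ORB detector as a stand-in. The actual Oculus Insight feature pipeline isn't public, so this only illustrates the general idea:

```python
# Sketch: detecting trackable points of interest in one camera frame.
# ORB is a stand-in for whatever features Oculus Insight actually uses.
import cv2

def detect_features(frame_bgr, max_features=2000):
    """Return keypoints and descriptors for one tracking-camera frame."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    orb = cv2.ORB_create(nfeatures=max_features)
    keypoints, descriptors = orb.detectAndCompute(gray, None)
    return keypoints, descriptors
```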
We started by incorporating physical props to enhance tracking and create a physically compelling gameplay experience. We mapped the environment to ensure real obstacles aligned with their in-game counterparts and that each headset localized in the same map. Taking advantage of the ultra-wide sensors on Oculus Quest, we even optimized the carpet design and hung physical clouds from the ceiling to provide additional tracking features.
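Conceptually, localizing each headset in that shared map looks something like the sketch below. The map format, descriptors, and camera intrinsics here are hypothetical stand-ins; feature matching plus PnP is one standard approach to this problem, not necessarily how Insight implements it:

```python
# Sketch: localizing one headset against a shared, pre-built map.
# map_points_3d / map_descriptors and the intrinsics K are hypothetical
# stand-ins for Insight's internal map data.
import cv2
import numpy as np

def localize(frame_keypoints, frame_descriptors,
             map_points_3d, map_descriptors, K):
    """Estimate the headset pose (rotation, translation) in the map frame."""
    # Match features seen by the camera to landmarks stored in the map.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(frame_descriptors, map_descriptors)

    image_pts = np.float32([frame_keypoints[m.queryIdx].pt for m in matches])
    object_pts = np.float32([map_points_3d[m.trainIdx] for m in matches])

    # Robustly solve for camera pose from the 2D-3D correspondences.
    ok, rvec, tvec, inliers = cv2.solvePnPRansac(object_pts, image_pts, K, None)
    return (rvec, tvec) if ok else None
```

Because every headset solves its pose against the same map, all six players end up in one consistent frame of reference.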
Mixed Reality
In the Dead and Buried Arena demo, we showcase an early application of MR to enable self-presence (seeing your body in virtual space) and spatial awareness (seeing your environment while in VR). As players bring the headset over their eyes, MR lets them see their environment and fellow players—no bumping into obstacles or each other—as they gradually "enter" the western-themed virtual world of Dead and Buried. Players see their bodies in real time, anchoring them to the fictional environment and making the experience more immersive.
We leverage the existing tracking cameras used for Oculus Insight. First, we run a real-time stereo depth-matching algorithm on the camera images to extract lines and contours of the environment. We then reconstruct these views from the eye position to make everything perceptually comfortable. All of these algorithms run on-device and in real time.
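As a rough illustration of that pipeline, here's a sketch using OpenCV's block-matching stereo and Canny edge detection as stand-ins for the production algorithms. Q here is the disparity-to-depth matrix you'd get from a hypothetical stereo calibration:

```python
# Sketch: stereo depth + contour extraction for MR passthrough.
# StereoBM and Canny stand in for Oculus's production algorithms;
# Q is the 4x4 disparity-to-depth matrix from stereo calibration.
import cv2

def passthrough_frame(left_gray, right_gray, Q):
    # Dense disparity from the stereo tracking-camera pair
    # (StereoBM returns fixed-point disparities scaled by 16).
    stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
    disparity = stereo.compute(left_gray, right_gray).astype('float32') / 16.0

    # Lines and contours of the environment, for the stylized view.
    edges = cv2.Canny(left_gray, threshold1=50, threshold2=150)

    # Lift pixels to 3D so they can be re-rendered from the eye position:
    # the cameras sit away from the eyes, so showing raw camera images
    # would feel perceptually wrong.
    points_3d = cv2.reprojectImageTo3D(disparity, Q)
    return edges, points_3d
```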
Co-location Multiplayer
Today, the only way for multiple players to share VR experiences is over the internet, with each person moving in their own physical space. This is great for connecting with friends at any distance.
But what if your friend is standing in your living room? Maybe you want to play together or watch a movie in shared virtual space. This introduces a unique set of challenges. A shared physical space means individual headsets need to share a common play area and need to be aware of each other's location in real-time.
Oculus Insight can build and store a "spatial map" of any environment, then retrieve that map and use live camera data to "see" where it's located. For the current demo, we created a master map of the entire space and made it accessible over the network to every device in the room. This way, devices know where they are in relation to one another, allowing them to co-locate.
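Once every device can express its pose in the same map frame, co-location reduces to simple transform arithmetic. A toy sketch, assuming each pose is a 4x4 rigid transform (networking and map format omitted):

```python
# Sketch: co-locating two devices that share one spatial map.
# Each pose is a 4x4 rigid transform: device frame -> shared map frame.
import numpy as np

def relative_pose(pose_a_in_map, pose_b_in_map):
    """Where device B is, expressed in device A's own frame."""
    return np.linalg.inv(pose_a_in_map) @ pose_b_in_map

if __name__ == "__main__":
    a = np.eye(4)                 # device A at the map origin
    b = np.eye(4); b[0, 3] = 2.0  # device B two meters along x
    print(relative_pose(a, b))    # B appears 2 m away in A's frame
```

With this transform, headset A can render headset B's avatar in the right spot, so both players agree on where everyone is standing.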
Asymmetric Co-location
Want to join the fun but don't have your own headset? Our prototype asymmetric co-location technology lets you grab your mobile device, walk around the play space, and see what your friends are doing in VR. This is a proof-of-concept example of how we could bridge the gap across different platforms and bring people together in a single immersive experience.
We leverage Oculus Insight and the shared spatial map used by each headset to localize a tablet into the same frame of reference. As the participant walks around, they're treated to a 1-to-1 mapping of the tablet's movement in the shared space, so they can see the action in real time, from their exact vantage point.
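In the same spirit, here's a hedged sketch of the tablet view: once the tablet is localized in the shared map, virtual content can be projected through a standard pinhole camera model at the tablet's pose. The pose and intrinsics below are hypothetical:

```python
# Sketch: rendering a virtual point from the tablet's vantage point.
# tablet_pose is a 4x4 transform (tablet frame -> shared map frame);
# K is a hypothetical 3x3 pinhole intrinsics matrix for the tablet camera.
import numpy as np

def project_to_tablet(point_in_map, tablet_pose, K):
    """Project a 3D point in the shared map onto the tablet screen."""
    # Map-frame point -> tablet camera frame (invert the localized pose).
    view = np.linalg.inv(tablet_pose)
    p = view @ np.append(point_in_map, 1.0)

    # Pinhole projection; points behind the camera aren't visible.
    if p[2] <= 0:
        return None
    uv = K @ (p[:3] / p[2])
    return uv[:2]
```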
We'd like to reiterate that these technologies are proofs of concept meant to demonstrate the potential of headsets like Oculus Quest. We expect location-based entertainment developers will benefit from these technologies, and more broadly, we see the developer ecosystem taking advantage of these experiences to enhance social applications in VR. It's a future we believe in, and one we can't build without the support of creators and developers.
It takes a global team of computer vision and software experts across Oculus to build world-class tracking and reconstruction technology. A diverse group of professionals across Menlo Park, Seattle, Dallas, and Zurich worked to build and integrate these technologies into a compelling VR experience. And we're just getting started. We hope you can join us on our journey!