We hosted our first-ever Developer Summit at GDC and spent the day with developers, discussing everything they needed to know about our products and tools. It was an exciting opportunity to offer deeper dives and a wide range of sessions. We know that not everyone has the opportunity to attend GDC, so we’re sharing some highlights for everyone in the Meta Quest for Developers community.
Build Next Generation XR Experiences with Meta Quest and Presence Platform
We kicked off the day with our Presence Platform team’s Rangaprabhu Parthasarathy, Mike Lamprinos and Giray Ozil taking attendees from project setup through to a simple XR app in less than one hour! This crash-course, which moved from concept understanding to practical implementation, covered important Presence Platform capabilities and SDKs that enable you to create truly immersive experiences. If you’re not familiar with Presence Platform, start here.
We look forward to sharing the recording of this session with you all soon!
Panel: Building a Business on Meta Quest
Our very own Kimberly Unger moderated a conversation between Nanea Reeves of Tripp Inc., Brett Taylor of My Dog Zorro, Paul Brady of Resolution Games, and Emre Tanirgan of Paradiddle about their experiences publishing and managing their titles on Meta Quest. Each developer spoke about the strategies they use to build an audience and the indicators of success they value most, with an eye towards what effect including MR has had on driving engagement for their titles.
Topics ranged from the stat each developer views as their “canary in the coal mine” for success to how they handle roadmap planning for new tech features they might want to incorporate. We offer tools to support your business across the app lifecycle: tools to kickstart your marketing, build hype and community, launch, and grow your audience. Test and learn with our A/B testing tool, and offer trials of your app with ‘Try Before You Buy’ to drive conversions. Learn more about Meta tools that support building a business on Meta Quest and help you grow and engage your audience here.
Mixing Realities in Meta’s First Encounters
When building a mixed reality game, harmoniously blending two worlds into a single space is a fundamental challenge. Designing a game that incorporates the player’s room as part of the level means losing out on a lot of control that fully pre-authored level design grants developers. Game designer Zachary Dawson and game programmer Alexander Dawson discussed the design, technical challenges, and solutions developed for turning every player’s room into their own game world.
As a pack-in launch title for Quest 3, Meta’s First Encounters introduced players to little creatures called Puffians, which invade the player’s room, smash through their walls, and open up a virtual world. For the game to feel seamless and engaging, it was very important for any new virtual elements to become part of the player’s physical world.
The team gained so many insights as they worked on First Encounters, and we hope that sharing some of these will help you as you develop your own games in mixed reality. We believe that MR has the potential to redefine the player’s experience, with more direct interactions and a level of physicality that we’ve never seen before. Here are some top tips the team uncovered:
- Design for the physical world: it’s important to remember that the player is constrained by physics and different players may move at different speeds. Augments should coexist with physical objects, and making them feel grounded goes a long way in suspending a player’s disbelief.
- Test in many rooms: every room is different, so avoid assumptions or generalizations about the play space and design your game to adapt to each player’s room.
- Prototype, prototype, prototype: while this isn’t news to any game developer, it's even more crucial in MR. As mixed reality is new, many previous experiences in building for VR and 2D games may not carry over. We encourage you to play your game in-headset early and often, and be flexible as you work to find the fun.
Building for Immersion: Haptics and Interactions with Presence Platform
Andrew Lazdins, Haptics Designer, and David Nelson, Software Engineer, shared valuable insights on how to best develop XR interactions and haptics. We covered the latest updates to Interaction SDK and Meta Haptics Studio that make it easier to integrate interactions and haptics into your apps and games.
In the first half of the session, we covered the features and recent updates to Interaction SDK, including new capabilities like multimodal, capsense hands, and wide motion mode, as well as usability improvements such as quick actions in Unity and the comprehensive sample. For more details, check out our v62 announcement here.
Then we delved into haptics design. Haptic feedback simulates the sense of touch to communicate with users, delivering more immersive experiences and improving usability for interactions. Making haptics can be time-consuming, code-heavy, and difficult to design and integrate. Meta Haptics Studio and Haptics SDK let you design haptics by importing your audio effects and letting the ‘audio to haptic analysis’ automatically extract the haptic signal. Haptics Studio also lets you ‘audition’ your haptic clips in the headset to test and validate your designs. Best of all, they’re hardware-agnostic. Integrate your files into Unity or Unreal with Haptics SDK.
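To make the ‘audio to haptic analysis’ idea concrete, here is a minimal, engine-agnostic sketch of one way such an analysis can work: reduce the audio waveform to a windowed amplitude envelope and normalize it into the 0–1 intensity range an actuator expects. This illustrates the concept only; it is not Meta Haptics Studio’s actual algorithm, and the function name is ours.

```python
def audio_to_haptic_envelope(samples, window=256):
    """Reduce an audio waveform (floats in -1..1) to a coarse
    amplitude envelope usable as a haptic intensity signal.

    Illustrative sketch only -- not the Haptics Studio algorithm.
    """
    envelope = []
    # Take the peak absolute amplitude of each fixed-size window.
    for start in range(0, len(samples), window):
        chunk = samples[start:start + window]
        envelope.append(max(abs(s) for s in chunk))
    # Normalize to 0..1 so the strongest moment maps to full intensity.
    peak = max(envelope) or 1.0  # guard against all-silent input
    return [v / peak for v in envelope]
```

Running this on audio that is silent for one window and then loud for the next yields a low-then-high envelope, e.g. `audio_to_haptic_envelope([0.0] * 256 + [0.5] * 256)` returns `[0.0, 1.0]`.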
We detailed best-in-class haptics examples from various titles. Stay tuned for the session video to see them all. In the meantime, let’s talk about Asgard’s Wrath 2, developed by Sanzaru Games and published by Oculus Studios. The team focused on adding immersion and realism, bringing the player closer to the action with haptics. The introduction battle with Heru is a great example of how haptics can be used for cinematics out of the box: the player feels Heru crash onto the ice. Haptics also add depth to character interactions such as jumping over objects, squeezing through crevices, and performing hill slides, where audio parameters drive the haptics as you steer and change speed, all of which contributes to immersion.
When thinking about haptics for your experiences, we suggest applying the following framework:
Haptic Balance
- Less is more design approach
- Keep the core interaction designs simple and effective
Identify Users’ Needs
- Design with existing players in mind by keeping a familiar experience
- Make sure haptics don’t interfere with fast-paced gameplay
Showcase Moments
- Design effects for ‘showcase’ moments that immerse the player, such as arcs, obstacles, and bombs
- Improve the UI experience - often the first port of call for players to experience HD haptics
We can’t wait to share the recording of this session with you soon!
Optimizing for Gameplay with Body Tracking on Meta Quest
Body tracking on Meta Quest 3 can elevate your gameplay and unlock new player experiences, but have you ever tried to apply body tracking to a character and it just didn’t look right? Maybe the mesh tore, exposing the underlying skin, or perhaps the character looked okay with basic movements but couldn’t interact well with other characters or even clap its hands. Kirk Barker, Product Manager for Presence Platform’s Movement SDK, walked through valuable insights and different solutions on how to best integrate this capability.
Movement SDK provides eye tracking, face tracking, and body tracking, enabling apps to embody this motion in characters for social, fitness, or gaming experiences. These tracking services are exposed over OpenXR interfaces, with bindings and samples provided for both Unity and Unreal.
A challenge in gameplay is the varying sizes and proportions of characters and avatars. You might have a very large person mapped to a very small character, or the opposite. Beyond size, body proportions can also differ widely: your character might have very long or very short arms, a torso that is tall compared to its legs, or an enormous or tiny head. The following are some design considerations when building for body tracking in games:
- What size do you want your characters in their environment? The closer the actual avatar is to the size of the human wearing the headset, the easier the challenge will be.
- What proportions of your character are most important? In certain cases, you will need to make a tradeoff because human proportions and avatar proportions are simply too different.
- How accurate do the body components need to be in the scene? For instance, you might really care about foot or head placement in a soccer game but care less about hand placement, or really care about hip placement in a dance application.
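One common way to handle the proportion mismatch described above is to retarget by bone direction rather than raw joint position: keep each bone’s tracked direction, but use the character’s own bone lengths, so a long-armed or short-armed character still reaches the right way without the mesh tearing. The sketch below is a minimal, engine-agnostic illustration of that idea; it is not part of the Movement SDK API, and the function and parameter names are ours.

```python
import math

def retarget_chain(tracked_joints, character_bone_lengths, root):
    """Retarget one joint chain (e.g. shoulder -> elbow -> wrist).

    tracked_joints: (x, y, z) positions from body tracking, root -> tip.
    character_bone_lengths: the character rig's bone lengths on the
        same chain (may differ from the player's real proportions).
    root: where the chain starts on the character.

    Illustrative sketch only -- not a Movement SDK helper.
    """
    out = [root]
    for i, length in enumerate(character_bone_lengths):
        ax, ay, az = tracked_joints[i]
        bx, by, bz = tracked_joints[i + 1]
        # Tracked bone direction, normalized.
        dx, dy, dz = bx - ax, by - ay, bz - az
        norm = math.sqrt(dx * dx + dy * dy + dz * dz) or 1.0
        # Extend from the previous retargeted joint using the
        # character's bone length, not the player's.
        px, py, pz = out[-1]
        out.append((px + dx / norm * length,
                    py + dy / norm * length,
                    pz + dz / norm * length))
    return out
```

For example, a player arm tracked at joints `(0,0,0)`, `(1,0,0)`, `(2,0,0)` retargeted onto a character whose upper and lower arm are each 0.5 units long yields joints at `(0,0,0)`, `(0.5,0,0)`, `(1.0,0,0)`: same pointing direction, character-correct reach.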
Meta Quest for Game Developers
David Borel, Neel Bedekar, and Alexandre Thivierge on Meta’s Developer Technology team gave an overview of the Quest build and optimize flow, geared towards console and PC developers, with an eye toward graphically-demanding, immersive games. Mixed reality enables new creative possibilities, but it puts unique demands on programmers, artists, and designers.
While Quest 3 has a GPU to rival a Rift min-spec machine, it’s untethered and low-profile, making it comfortable to wear, even for vigorous exercise or play. Many games can render at 120FPS and the system is full of optimizations to make it more immersive, comfortable, and responsive, contributing to the feeling of authentic presence.
Building for Meta Quest does, however, present challenges to adjust to: donning and doffing the headset (VR requires you to re-situate yourself each time while iterating), fitting within system resources, and navigating unmapped design space as a first mover with our new features. We’ve developed resources to mitigate some of the challenges of developing for XR and to help you author your project faster, like XR Simulator, which lets you easily test your VR/MR app without donning and doffing your headset. XR Simulator also makes it easy to test mixed reality: with synthetic environments, you can load maps that simulate Presence Platform features like Scene, Passthrough, Spatial Anchors, and Shared Spatial Anchors.
To support the performance optimization flow and help you figure out whether the CPU or GPU is the bottleneck in your application, we offer performance monitoring and profiling tools. We’ve also created samples like Phanto and Cryptic Cabinet that offer ways to explore these features and discover the potential of mixed reality on Meta Quest.
Cryptic Cabinet is a new open source mixed reality showcase built in Unity that demonstrates how Presence Platform features can turn any room into a unique gameplay experience, and how to integrate various MR features. The full source code is now available on GitHub.
Maximum Fun: How to Build More Playful Experiences in Meta Horizon Worlds
To end our day, we discussed the advantages, challenges, trends, and strategies for building immersive experiences in Meta Horizon Worlds that spark joy and keep users coming back for more. We covered important insights from Meta producers Averie Timm Chávez and Ginger Larsen, along with developers Caio Fernandes (Kluge Interactive) and Sol Rogers (Magnopus). The conversation touched on some of the biggest changes the teams have seen in Horizon Worlds over the past year, highlighting the progress the team has made in bringing advanced tools to the platform that builders of this level really respond to. This included Horizon Worlds’ expansion to mobile and desktop, truly expanding the reach of who will be able to experience what developers build, and how this is making it even more interesting for brands to be a part of these worlds.
And there’s even more to come!
We had a jam-packed day at GDC and we’re excited to showcase even more innovations for Meta Quest over the next few days. We’ll be talking with our Meta experts and the developers of some of Meta Quest’s most popular titles as we showcase what’s new across our products and tools. We’ll be recapping all of our sessions at GDC here for the Meta Quest for Developers community, so stay tuned for even more tips, tricks, and best practices.