Each year,
Oculus Launch Pad supports promising VR content creators from diverse backgrounds with hands-on training and support, so they can iterate on their unique ideas and bring them to market.
2019 Launch Pad grant recipients Jasmine Bulin, Dave Dorn, and Austin Krauss spoke with us about their involvement with Oculus Launch Pad and how it helped shape their career and the development of
Neverboard, an app that mixes the magic of social VR with the fun of a board game party.
If you’re interested in applying for our next Oculus Launch Pad session, watch for our open application announcement on the blog this Spring.
Congrats on receiving an Oculus Launch Pad grant! What was your Launch Pad experience like and how has your involvement made an impact on your career?
Jasmine: Launch Pad for me is all about the community. When our cohort came together, I didn’t know what to expect or how I would be received. While I had a career in the games industry before Launch Pad, I’d had plenty of experience with feeling a sense of otherness. What I found was other creators with similar challenges and diverse ideas. OLP fellows from all cohorts were welcoming, and even after graduation I’ve been able to have one-on-one meetups to discuss wild ideas and shared problems. Having a community like this to support me was what gave me the confidence to pursue my vision for virtual reality. I was also really lucky to have found Dave and Austin, who encouraged me to apply for OLP and who every day give me the space to be my authentic self as a creator.
What are your top tips for devs hoping to be more inclusive and reach a broader audience?
Jasmine: Personally, I discourage developers from making a game with an audience of one (themselves), but I also urge them never to make a game where they have contempt for the people who play it. You must have empathy for your audience, so I recommend that creators:
Challenge your preconceived notions of who plays. Everyone plays something.
Cultivate your curiosity of strangers and their relationships with play.
Start your design process with accessibility in mind and play test early and often.
At Launch Pad we were lucky enough to get to experiment in design sessions with AbleGamers and discovered a wealth of techniques to incorporate accessible design into our experiences. It wasn’t about checking off some accessibility boxes but about how to include all types of players in design decisions that made sense for our game. This directly influenced our UX and UI, especially in the ways our game is generous with accepting player intent. We found that making the experience more accessible made it more fun for more people to play because there was less friction in the experience.
Could you tell us about the core gameplay and what inspired you to create Neverboard?
Jasmine: Everyone gravitates to the idea of Neverboard as board games in VR. You have a library of games to choose from, like Crazy 8s, Chess, and a whole lot more; you never have to set up or clean up; the rules are enforced, so there is no reading a manual before you start; and beyond games there are things to share, like hats to wear and pizzas to eat. It is inspired by our best memories of playing games together with our friends and families, but it also came from the belief that, culturally, we are shifting how we socialize to revolve around games. I want that to be a reality for all types of players.
Were there any important themes or messages you wanted to get across in your game?
Jasmine: We hope that players see that Neverboard is more than board games: it is about social connection. It is more about how the overall experience makes being with friends fun than about any specific game, but everything we have added to the library, and everything on the roadmap, was carefully chosen to give players the script and mechanics to be fun for each other. And not just the games: the space between the seats, the silly treats, the menus, the table, and some surprises you’ll have to play to find were all designed and iterated through play testing for their social fun. Regular fun just wasn’t enough; it had to provide something special for players together.
Did you run into any major technical challenges? If so, how did you overcome those challenges? Feel free to be as specific and detailed as you’d like.
Austin: Developing a multiplayer game certainly comes with a plethora of challenges. Add in a mobile VR device with the Oculus Quest and a life-changing global pandemic, and you have enough problems to fill a book!
In the case of Neverboard, our goals were to create a "cozy" environment along with interactions and UI that felt playful and purposeful. Much of our early prototyping revolved around the initial game experience: how were players expected to interact with the world? What sort of UI furthered our goals of clear intent with playfulness?
When it came to exploring technical solutions around UI, we turned to Unreal's UMG and Widget components to render our 2D art onto our 3D world objects. We quickly found this technique to be subpar on Quest, especially compared to other titles in the Oculus Store. Our art was marred with jagged edges and "dancing line" problems once deployed to the Quest.
After a bit of trial and error, we turned our attention to Unreal's StereoLayer component, and more specifically the Oculus implementation found in the OculusHMD_Layer code. We ran several more tests here with some extensive changes to this system, and the results were promising: our 2D art is now rendered directly to the VR compositor and uses "fake" depth testing to maintain a feeling of being anchored in the 3D world. We’ve made our solution public along with links to our UE4 GitHub repository.
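For readers who haven't used the component before, here is a minimal sketch of the stock approach in UE4 C++: attach a UStereoLayerComponent to an actor and hand it a 2D texture so the VR compositor samples the art directly instead of it passing through the eye buffers. The class and asset names are placeholders, and this illustrates the general technique only; it is not the modified OculusHMD_Layer code the team describes, which they have published separately.

```cpp
// MinimalStereoLayerActor.h
// Minimal sketch (placeholder names) of rendering pre-made 2D UI art through
// Unreal's stereo layer path so the HMD compositor samples it directly.

#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Components/StereoLayerComponent.h"
#include "Engine/Texture.h"
#include "MinimalStereoLayerActor.generated.h"

UCLASS()
class AMinimalStereoLayerActor : public AActor
{
    GENERATED_BODY()

public:
    AMinimalStereoLayerActor()
    {
        RootComponent = CreateDefaultSubobject<USceneComponent>(TEXT("Root"));

        // The stereo layer component submits its texture straight to the
        // HMD compositor; layer type and shape are configured on the
        // component (e.g. in the editor details panel).
        StereoLayer = CreateDefaultSubobject<UStereoLayerComponent>(TEXT("UILayer"));
        StereoLayer->SetupAttachment(RootComponent);
    }

    virtual void BeginPlay() override
    {
        Super::BeginPlay();

        if (UITexture)
        {
            // Assign the 2D UI texture and size the quad in world units.
            StereoLayer->SetTexture(UITexture);
            StereoLayer->SetQuadSize(FVector2D(100.f, 60.f));
            StereoLayer->SetPriority(1);
        }
    }

    // The UI texture (or render target) to display; assigned in the editor.
    UPROPERTY(EditAnywhere, Category = "UI")
    UTexture* UITexture = nullptr;

private:
    UPROPERTY(VisibleAnywhere, Category = "UI")
    UStereoLayerComponent* StereoLayer = nullptr;
};
```

Because a compositor layer is resampled at display resolution after lens distortion rather than being rendered into the eye buffers, 2D text and line art tend to stay much sharper on Quest, which is what makes this path attractive for UI; the trade-off is the extra work (such as the "fake" depth testing mentioned above) needed to make the layer feel anchored in the scene.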
What was the main inspiration for the art direction in Neverboard?
Jasmine: It all comes back to our obsession with giving players the best social experience. It was also important that we make this an approachable experience, so we experimented with finding a balance between anchoring players to something familiar, the magical things that can only happen in VR, and the technical limitations of developing for mobile VR. The environment feels the most “real,” while the art is more whimsical where players interact. This is a game about board games, after all, so we had to have a rainy day, and around the room are lots of board game details that are representative of all the people who made the game.
How important was sound and music to your game?
Dave: Sound and music are a core piece of the Neverboard experience because we want every board game to have its own identity, and a lot of effort goes into ensuring sounds and audiovisual pairings have the right “intention” behind them.
Early on, we talked about how sonic gestures could serve as learning elements and our early testing confirmed that. We’ve found that good audio cues keep players engaged and amused—creating more opportunities for discovery and joy. But we also have to think hard about accessibility, so for example, we’re experimenting with speech-to-text options as an alternative to direct controller input for some games.
Of course, technical implementation is 50% of our audio pipeline. Because Neverboard is a seated experience in one location, things like occlusion and effects like reverbs and delays are not as important. However, spatialization is extremely important in our game for understanding turn-based actions, in addition to distance and presence. Casting multiple sounds to several players depending on their move is tricky, so using a soundcaster or simulating gameplay in the engine isn’t really an option for us. Testing in real life defines for us what works and what doesn’t.
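To make the per-player routing concrete, here is a generic sketch in UE4 C++ (hypothetical names, not Neverboard's actual audio code) of how a turn-specific cue could be triggered only on the client of the player whose move it is, spatialized at the relevant spot on the table:

```cpp
// TurnAudioPlayerController.h
// Generic sketch (hypothetical names) of playing a spatialized,
// turn-specific sound on only one player's client.

#pragma once

#include "CoreMinimal.h"
#include "GameFramework/PlayerController.h"
#include "Sound/SoundBase.h"
#include "TurnAudioPlayerController.generated.h"

UCLASS()
class ATurnAudioPlayerController : public APlayerController
{
    GENERATED_BODY()

public:
    // Called by the server when it becomes this player's turn; the body
    // below runs only on this player's client, so only they hear the cue.
    UFUNCTION(Client, Reliable)
    void ClientPlayTurnCue(USoundBase* Cue, FVector BoardLocation);
};
```

```cpp
// TurnAudioPlayerController.cpp
#include "TurnAudioPlayerController.h"
#include "Kismet/GameplayStatics.h"

void ATurnAudioPlayerController::ClientPlayTurnCue_Implementation(USoundBase* Cue, FVector BoardLocation)
{
    if (Cue)
    {
        // Spatialize the one-shot at the board position so the sound also
        // communicates where the action happened, not just that it happened.
        UGameplayStatics::PlaySoundAtLocation(this, Cue, BoardLocation);
    }
}
```

Because which sounds reach which headset depends on networked turn state like this, auditioning the mix in isolation only goes so far, which is why real playtests end up being the deciding factor.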
What did you learn from your experience playtesting the game?
Dave: The unanticipated number of things play testers will break! In all seriousness, play testing was one of the hardest things to accomplish while working on a game during a global pandemic. We needed more than core gamers to play the experience. Since we could not just gather people together to put them through the experience and collect data, we had to design a safe plan and become experts in running remote VR play tests. Some of the key elements for us were:
We recruited pairs of willing participants from our network who fit our player demographic and grouped them into players who already had a headset and players who did not yet have one.
We developed a hardware inventory, cleaning procedures, and two-way shipping plans.
We had a host in VR for every play test session who recorded the qualitative experience, with a script to make sure we hit all the major test and safety points.
We had a post play test survey to gather quantitative data both about our players and each part of their experience, which came out to about 15 questions total.
We reviewed our play test plan after every test and updated as needed.
We thought we could do short, half-hour play tests where we tested just one part of the experience at a time, but even with only small parts of the experience to test, our play testers wanted to play it over and over again and give us more feedback! So we found that running 4-6 players through each 1.5-hour test, with tests spaced about a month apart, gave us the best results.
Each test included observer notes and the answers to the survey we created, which often had a unique set of questions for the scope of the test. We used these to generate a report for the team, which allowed us to evaluate what was working and where we could improve.
We knew we had nailed the stickiness of the social experience when play testers ended their sessions asking for more and emailed us days after their test to ask if it was okay for them to play again with their friends outside of the play tests.
What advice would you give to a developer looking to start building for VR?
Austin: Get some sort of working test level or rough prototype up and running on the VR hardware as early as possible. Doing so will give you a feel for the build pipeline, allow you to establish some best practices and processes for team development, and start to expose you to the uniqueness and constraints of VR devices.
Iterating on the experience in the engine of your choice on a development machine is great, but it's no substitute for actually deploying to the VR hardware itself. You'll likely find that the world scale, rendering, and timing of your code execution have significantly more variance on the VR device than in your development environment.
Jasmine: Don’t stop experimenting with simple ideas and pushing on what can be a great experience. Look at other apps, sketch things out in VR when you can, and be prepared to iterate.