Developers know there’s always more than one way to tackle a problem—and sometimes this freedom can create more questions than answers. When it comes to building features that serve a wide range of abilities, the ambiguity can be challenging. But it can also open the door to breakthrough solutions and accessibility innovation that help raise the bar for the VR ecosystem.
That’s why Owlchemy Labs’ recent Vision Accessibility Update for Cosmonious High is worth getting hyped about. Working with the goal of providing a better in-game experience for people with low vision, the Owlchemy Labs team added accessibility features that let players receive game information through audio descriptions using Text-to-Speech (TTS). The game’s Vision Assistance Mode also enables high contrast and object-highlighting outlines for players with vision needs. Together, these features provide a gold standard for any developer looking to make their apps more accessible and enjoyable for a wider audience.
We caught up with Accessibility Product Manager Jazmin Cano and Senior Accessibility Engineer Peter Galbraith to learn more about Cosmonious High’s recent update and how their team approached implementing a solution for low vision in VR.
Why do you feel that accessibility is important, both for VR and overall game and app development?
Peter Galbraith: Accessibility is something that benefits everyone. Many people still view accessibility as something that’s only for people with permanent disabilities, but the truth is that a large percentage of players use one or more accessibility features due to having temporary or situational impairments or because they simply enjoy the convenience and comfort.
A player’s backpack opened to the Vision Accessibility section in Cosmonious High.
VR seems like a medium that inherently sets a high bar for vision, so this update feels especially important. Given the previous milestones your team has achieved around subtitles, physical accessibility, and height accessibility, what inspired your team to start working on the Vision Accessibility Update for Cosmonious High?
Jazmin Cano: Though VR does seem to be a highly visual medium, gamers who are blind or have low vision can still have the opportunity to play in VR too. There’s much more to gaming than just the visuals—there are stories, mechanics, and environments that are all fun to be immersed in! While our solution isn’t comprehensive for everyone with vision-related disabilities, we feel our work in vision accessibility will make huge strides toward making XR more inclusive.
Owlchemy has been consistently exploring unsolved areas of gameplay and accessibility in XR. This time, our mission of “VR for Everyone” led us to ask the question, “Could a person with low or no vision play our VR games?”
PG: Much of our design process already included considerations for people with different vision needs. Even back during the development of Job Simulator, the team opted to use large, clear text and brightly colored object highlighting to help our community play the game. Since then, we’ve continued to include these features and expanded our considerations to account for other impairments, such as color blindness.
When your team first started working on this update, what were your overall goals and how did you decide which features to integrate or modify?
JC: The goal of this team was to make Cosmonious High more accessible to blind and low-vision players by allowing information to be accessed through audio descriptions using Voice SDK’s Text-to-Speech (TTS).
To accomplish this, we had a couple of discrete objectives:
- Every object in the game should have descriptive text that players can access and hear during gameplay.
- Players should receive auditory feedback that can help them construct a mental image of their space.
The teleportation beam in Cosmonious High pointed ahead to a location in front of a bus entrance.
With these primary goals guiding our initial designs, we decided which features to integrate into the game for this update by prioritizing the ones that would get a player from the start of the game to the end of the main storyline.
Given the size of the game, we had to prioritize some features over others, but we didn’t cut anything a player would need to complete the main story. It’s important to remember that when it comes to accessibility, there will always be additional work to do and more improvements to make, and that was something we had to remind ourselves of often. But we know that any step forward is good, and including this level of accessibility still invites many people to play our game and, hopefully, other VR games in the future.
Can you share some insight on the process of integrating TTS for this use case and how the “screen reader” functionality was adapted for VR?
PG: As we began thinking about integrating TTS for our players, we realized we had already built a system called World Items that contained a name and a short description of every interactive object in the game. Our first proof-of-concept leveraged this existing data from the World Items system to provide the text that would be spoken.
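Owlchemy hasn’t published the World Items code, but a minimal Unity-style sketch of the idea might look like the following. The class and field names here are hypothetical illustrations of “a name plus a short description per interactive object,” not the studio’s actual API:

```csharp
using UnityEngine;

// Hypothetical sketch of a "World Items"-style component: each interactive
// object carries a name and a short description that a TTS system can read.
public class WorldItemDescriptor : MonoBehaviour
{
    [SerializeField] private string itemName = "Paint Brush";
    [SerializeField, TextArea] private string shortDescription =
        "A brush for painting. Dip it in a color to use it.";

    // The text an environment-reader system would hand to the TTS engine.
    public string SpokenText => $"{itemName}. {shortDescription}";
}
```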
However, we still had to overcome a significant design challenge: turning the “screen reader” into an “environment reader.” Traditionally, “screen readers” read text on a 2D plane that’s easily navigable. VR adds another whole dimension, surrounding the player and involving them in the world physically. This led us to find creative ways for players to trigger TTS to learn about their surroundings and objects they were touching or holding in their hands.
Other design and technical challenges did arise, like dealing with overlapping audio from using both hands simultaneously, but we identified and tackled many of these issues early on.
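One common way to handle that overlapping-audio problem is to route every description request through a single queue so only one line plays at a time. A minimal sketch, assuming a hypothetical `SpeechQueue` component rather than the game’s actual solution:

```csharp
using System.Collections.Generic;
using UnityEngine;

// Hypothetical speech queue: both hands route TTS requests here so
// descriptions play one at a time instead of overlapping.
public class SpeechQueue : MonoBehaviour
{
    private readonly Queue<string> pending = new Queue<string>();
    private bool speaking;

    public void Enqueue(string text, bool interrupt = false)
    {
        if (interrupt)
        {
            pending.Clear();
            StopCurrentSpeech();
        }
        pending.Enqueue(text);
    }

    private void Update()
    {
        if (!speaking && pending.Count > 0)
            Speak(pending.Dequeue());
    }

    private void Speak(string text)
    {
        speaking = true;
        // Hand off to whichever TTS backend the platform provides; the
        // backend should set `speaking = false` when playback finishes.
    }

    private void StopCurrentSpeech()
    {
        speaking = false;
        // Stop the platform TTS playback here.
    }
}
```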
Let’s dive into that a little more. What were some other challenges regarding the technical integration of TTS and Vision Assistance Mode, and how were you able to overcome them?
PG: The biggest technical challenge we faced was multi-platform support. Because the amount of content in the game is vast, we used real-time TTS so audio descriptions would be generated the moment a player requested them. This let us iterate quickly, add new content, and immediately hear the changes. It also meant we had to leverage the speech synthesizers built into the platforms, and each platform has different methods and restrictions when interfacing with its synthesizer. It was a lot of work but ultimately better for the player.
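The interview doesn’t detail how those per-platform synthesizers were wrapped, but a typical pattern is a small interface with one implementation per platform, selected at build time. Everything below is an illustrative assumption, with a no-op fallback standing in for the real platform backends:

```csharp
using System;

// Hypothetical abstraction over per-platform speech synthesizers.
public interface ITextToSpeech
{
    // Synthesize and play `text`, invoking `onFinished` when playback ends.
    void Speak(string text, Action onFinished);
    void Stop();
}

// No-op fallback so platforms without a synthesizer still build and run.
public class SilentTextToSpeech : ITextToSpeech
{
    public void Speak(string text, Action onFinished) => onFinished?.Invoke();
    public void Stop() { }
}

public static class TextToSpeechFactory
{
    // Pick a backend at build time; each real implementation would wrap
    // that platform's synthesizer (e.g. Voice SDK on Quest, SAPI on Windows).
    public static ITextToSpeech Create()
    {
#if UNITY_ANDROID
        return new SilentTextToSpeech(); // placeholder: swap in a Voice SDK-backed class
#elif UNITY_STANDALONE_WIN
        return new SilentTextToSpeech(); // placeholder: swap in a SAPI-backed class
#else
        return new SilentTextToSpeech();
#endif
    }
}
```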
Cosmonious High’s update has many other key features, including haptic feedback when an in-game object is highlighted. Can you describe your approach and strategy in including additional features like this one to provide players with a more comprehensive and holistic experience with regard to accessibility?
PG: Accessibility isn’t a one-size-fits-all solution. And as with most aspects of game design, the more ways you can signal something to a player, the more likely the player is to recognize and understand what the design is trying to tell them. Since we were developing the Vision Accessibility Update with blind and low-vision players in mind, we knew that many of the players using these features might not be able to see the highlights we had already been using to indicate interactivity. We needed a way to signal that a player could interact with an object without relying on visual cues.
To do this, there were only two senses left that we could leverage: hearing and touch. Given the density of objects in our scenes, playing additional audio over the existing soundscape every time the player could select or interact with a new object would have been annoying and confusing.
Players can also access tutorials with audio descriptions that offer visual and auditory guidance.
So, taking a lesson from the use of white canes in the blind community, we added haptic feedback whenever the player gets close enough to interact with a given object. This gentle vibration allowed us to simulate the feeling of lightly touching a physical object, which turned out to be discreet but effective at communicating the necessary information to the player.
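As a rough illustration, proximity-triggered haptics like this can be built on Unity’s XR input API, where `InputDevice.SendHapticImpulse` is the real call for pulsing a controller; the component, thresholds, and setup below are hypothetical, and `WorldItemDescriptor` is the sketch from earlier:

```csharp
using UnityEngine;
using UnityEngine.XR;

// Hypothetical sketch: pulse the controller gently whenever the hand's
// trigger collider comes within reach of an interactive object.
// Assumes the hand has a trigger collider and a kinematic Rigidbody
// so OnTriggerEnter fires.
public class HandProximityHaptics : MonoBehaviour
{
    [SerializeField] private XRNode hand = XRNode.RightHand;
    [SerializeField] private float amplitude = 0.2f; // gentle, white-cane-like tap
    [SerializeField] private float duration = 0.05f; // seconds

    private void OnTriggerEnter(Collider other)
    {
        // Only pulse for objects that carry a description, i.e. are interactive.
        if (other.GetComponentInParent<WorldItemDescriptor>() == null) return;

        InputDevice device = InputDevices.GetDeviceAtXRNode(hand);
        if (device.TryGetHapticCapabilities(out HapticCapabilities caps) && caps.supportsImpulse)
            device.SendHapticImpulse(0, amplitude, duration);
    }
}
```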
Are there any differences in how you implemented the Vision Accessibility Update for Meta Quest devices vs. other platforms?
PG: When we started this project, we knew there would be differences in how this type of accessibility was implemented on different devices. For example, since we needed TTS to generate audio dynamically on demand, we found that not all headsets ship with a built-in TTS synthesizer. For Meta Quest 2 specifically, we explored many options and determined that the TTS feature of Oculus Voice SDK was best suited to our needs on the platform. From there, Meta helped guide us through the steps to integrate the plugin with our existing code.
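In the Unity version of Voice SDK, runtime speech is typically driven through a `TTSSpeaker` component. The sketch below assumes that component plus the earlier hypothetical `WorldItemDescriptor`; treat the exact namespace and scene setup as version-dependent rather than definitive:

```csharp
using Meta.WitAi.TTS.Utilities;
using UnityEngine;

// Sketch of speaking a World Item description through Voice SDK's TTS.
// Assumes a TTSSpeaker (and its backing TTS service) is already
// configured in the scene, per the SDK's setup.
public class DescriptionSpeaker : MonoBehaviour
{
    [SerializeField] private TTSSpeaker speaker;

    public void Describe(WorldItemDescriptor item)
    {
        if (item == null || speaker == null) return;
        speaker.Stop();                 // avoid overlapping descriptions
        speaker.Speak(item.SpokenText); // synthesized on demand, so it needs connectivity
    }
}
```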
JC: Though the Oculus Voice SDK was very effective and allowed us to get TTS working on Cosmonious High for Meta Quest 2 when connected to the internet, we hope that in the future, there will be other methods of providing on-demand TTS to players without an internet connection—whether that’s through tools built into game engines or through the inclusion of TTS synthesizers on hardware. We believe that the more developers that have access to these features, the more we will see them used in games, and together we can raise the level of inclusion in VR.
We’ve covered a lot of ground, but before we get too far, what would be the first three steps you would advise taking if a developer is interested in integrating similar accessibility features?
JC: First, talk to as many people as you can with different abilities and hear about their experiences using technology. This includes visual, auditory, physical, and cognitive disabilities. Also, remember that one person’s experience does not represent everyone’s experience with that disability—some people can also have more than one disability with varying degrees of each.
Second, try as many XR experiences as you can and think about the various disability types. Ask yourself if those experiences accommodate needs that players may have, and get familiar with what’s working and what can be improved.
Third, learn about accessibility in games, including outside of VR. We’re fortunate right now that there are a lot of updates and improvements happening in the games industry around accessibility, so it’s important to keep up with the latest designs and community feedback.
Given the vast spectrum of visual abilities among players, how was your team able to test the new features and ensure they were effective with their target audience?
JC: Throughout development, we checked in with game accessibility consultants and playtesters who self-identify as having low vision or legal blindness, as well as users of assistive technologies for vision accessibility. After each playtest, they would share their thoughts on the new features, what worked well, and what could be improved. Our team would prioritize the work, build it out, and run playtests again with the people we were building for. With the help of VROxygen.com, we were able to find testers who matched our target audience and develop these features alongside them.
PG: We also developed some internal tools that allowed us to simulate different types of vision impairments in the headset. We wanted to be sure we were valuing our playtesters’ and our developers’ time, so these features allowed our sighted developers an opportunity to catch and fix any obvious issues before the content was put in front of our playtesters. This greatly improved iteration time and ensured we were getting the most valuable and relevant feedback from our playtesting sessions.
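Owlchemy’s internal simulation tools aren’t public, but a crude acuity-loss check can be approximated in Unity’s built-in render pipeline by down- and up-sampling the camera image. A minimal sketch, with the component name and blur approach as assumptions:

```csharp
using UnityEngine;

// Hypothetical low-vision simulation: blur the camera image by down- and
// up-sampling, a cheap stand-in for reduced visual acuity. Attach to a
// camera in the built-in render pipeline; illustrative only.
[RequireComponent(typeof(Camera))]
public class AcuityLossSimulator : MonoBehaviour
{
    [Range(1, 8)] public int downsample = 4; // higher = blurrier

    private void OnRenderImage(RenderTexture src, RenderTexture dst)
    {
        int w = src.width / downsample, h = src.height / downsample;
        RenderTexture tmp = RenderTexture.GetTemporary(w, h);
        Graphics.Blit(src, tmp); // downscale; bilinear filtering softens detail
        Graphics.Blit(tmp, dst); // upscale back to the screen
        RenderTexture.ReleaseTemporary(tmp);
    }
}
```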
Are there any tips or best practices other developers should know about when integrating similar accessibility features into their apps?
JC:
- Start with Meta Quest Virtual Reality Check (VRC) Guidelines and the XR Association’s Developer's Guide, Chapter Three: Accessibility & Inclusive Design in Immersive Experiences.
- Try providing your team with tools that let you make quick accessibility checks and find glaring issues rapidly. These could include checking art with color-blindness filters, checking the readability of objects at different levels of blur, or experiencing the game with no sight of the environment (a quick color-blindness check is sketched after this list).
- Always playtest with a wide variety of people. Seek out players with different identities, and make sure people with disabilities are playtesting too.
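For instance, a quick color-blindness check could apply a commonly cited protanopia simulation matrix to a texture so artists can eyeball whether key objects still read. The helper below is hypothetical, and the matrix values are a rough published approximation rather than a clinical model:

```csharp
using UnityEngine;

// Hypothetical quick-check tool: approximate how colors read under
// protanopia using a commonly cited simulation matrix.
public static class ColorBlindnessCheck
{
    public static Color SimulateProtanopia(Color c)
    {
        return new Color(
            0.567f * c.r + 0.433f * c.g,
            0.558f * c.r + 0.442f * c.g,
            0.242f * c.g + 0.758f * c.b,
            c.a);
    }

    // Apply in place to a readable (import-enabled) texture so artists
    // can eyeball the result in the editor.
    public static void ApplyProtanopia(Texture2D tex)
    {
        Color[] pixels = tex.GetPixels();
        for (int i = 0; i < pixels.Length; i++)
            pixels[i] = SimulateProtanopia(pixels[i]);
        tex.SetPixels(pixels);
        tex.Apply();
    }
}
```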
Cosmonious High’s other accessibility features include one-handed mode, dynamic height adjustments, localization improvements, and more.
PG: We also recognize that VR is still a new medium with its own unique strengths and weaknesses. It’s important that we don’t merely copy the designs of our predecessors. Rather, we should learn why the designs exist in the forms they do, adapt them to suit the medium, and ensure that our work serves the needs of the people for whom it is designed.
Speaking of predecessors, when considering your other games, Vacation Simulator and Job Simulator, what was it about Cosmonious High that made it the right game to receive this update first?
PG: We decided to add this update to Cosmonious High because it was our latest game and because we knew it would be the most challenging environment in which to develop this feature. The exploratory nature of the game, combined with the density of objects, puzzles, and ways to interact in every scene, meant that attempting this would give us a complete idea of what we need to consider in future projects. We hope to use the learnings from this project to inform our designs going forward as we continue to expand accessibility in our games.
Have you received any feedback from players about the recent update? Is there any feedback that stands out specifically?
JC: In April 2023, we received some data that excited us all: players had used Vision Assistance Mode to get object and location descriptions over 1.53 million times in the previous month on Quest 2!
It’s been significant, and more than anything it signals that players want more of this type of accessibility in VR. We’ve also received questions asking if we’ll add vision accessibility to our previous games, Vacation Simulator and Job Simulator. We hope other studios hear this and dedicate resources to adding more accessibility to their games as well.
Not only do players want accessible designs and options; they’re genuinely needed to make VR for everyone.
Looking forward, are there any other accessibility initiatives your team has its eyes on?
JC: Working on accessibility updates informs how we will develop our next titles. We’re working on a new game and are excited to continue bringing accessibility into that game! It’s part of our culture to think about designing with accessibility, so stay connected with us to find out about the next initiatives in accessibility at Owlchemy Labs.