The Oculus line of products has grown steadily over the years, and with that growth came the need for a consistent developer experience for display color reproduction. The decision to source display panels from multiple vendors for Oculus Go, each with its own similar but visibly different color space, solidified this requirement: without color management, the same content would look vastly different from headset to headset. Once this color space management pipeline was established, it proved useful in subsequent products, including Oculus Quest, Oculus Rift, and Oculus Link.
To that end, we have compiled a developer guide explaining the Oculus color management pipeline, including:
A brief overview of color spaces
A look at the color spaces of each of the Oculus HMDs shipped to date
The history of color management at Oculus and why it was introduced
The scope and various aspects of what we mean by “color management”
How the color management pipeline evolved over time for the Rift and Quest lines of HMDs
How to correctly use the color management pipeline and corresponding SDK APIs
How color management functions under the hood in the VR compositor
Actionable items and recommendations for VR developers
Who is this guide for?
Anyone actively working on, or even remotely interested in, developing a VR app for the Oculus Quest, Oculus Rift, or Oculus Link should understand the color management pipeline. It is especially relevant if you are a graphics/rendering engineer, artist, technical artist, or CG supervisor, or if you generally care about visual fidelity in VR.
Why should I care?
In a regular non-VR real-time app, the image rendered by the app is sent, with relatively minimal modification, through the display cable to a third-party display. In contrast, the images rendered by VR apps are not sent directly to the display. Instead, they first go to a VR compositor that applies a slew of adjustments to the images before they are finally sent to the display.
In Oculus HMDs, the VR runtime software knows the display hardware’s specifications, so it can make calibrated corrections that would be hard to apply on generic display hardware. To keep a given VR app looking consistent across different hardware, the runtime defaults to a set of correction behaviors that mostly align with your expectations, but in some cases you need to work with the system to guide it down the right path. If you don’t understand what the runtime is doing, this can cause confusion and frustration, ultimately costing precious time. A quick read through this guide can help save countless hours down the line.
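To make “work with the system” concrete, here is a minimal sketch of what that guidance looks like in code, using the LibOVR ovr_SetClientColorDesc call on PC. The corresponding SDK APIs, including the Quest/mobile equivalents, are covered in detail later in this guide; the specific enum value below is only an illustrative assumption about how your content was authored.

```cpp
#include <OVR_CAPI.h> // LibOVR (PC SDK) header

// Tell the VR compositor which color space our submitted frames were
// authored in, so the runtime can remap them to the connected HMD's panel.
bool SetAppColorSpace(ovrSession session)
{
    ovrHmdColorDesc colorDesc = {};

    // Illustrative assumption: this app's art was mastered against the
    // Rift CV1 gamut. Use the enum that matches how your content was
    // actually authored.
    colorDesc.ColorSpace = ovrColorSpace_Rift_CV1;

    // With this hint set, the compositor converts our frames from the
    // declared color space to the actual panel's color space for us.
    return OVR_SUCCESS(ovr_SetClientColorDesc(session, &colorDesc));
}
```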
This isn’t hypothetical. Over the years we’ve seen many VR apps, including our own internal apps like Oculus Home and some of the most popular VR apps in existence, mishandle the color space pipeline. As we helped developers address these issues, it became clear that not enough information was available on this topic.
Here’s an example of a real-world situation. In the image below, the top color chart shows what colors might look like when the color space is set up incorrectly; compare it with the bottom chart, which was set up correctly. These two chart images were captured directly from our VR HMDs, so ignore the curvature, which is an unrelated aspect of VR rendering. The top chart clearly has peachy (orange + pink), overly saturated tones, while the bottom chart looks more neutral.
Here’s the original source for reference.
Wow, this is a long guide. Do you have a summary?
Yes. If you only have a few minutes, skip ahead to the actionable items and recommendations for VR developers at the end of this guide.
Please feel free to provide your feedback about this guide in the developer forums.