Color Management in Meta Quest Headsets
This guide reflects the current state of color management in Meta Quest headsets. Although it includes a brief introduction to color science, much of the information shared assumes some familiarity with the subject matter. As such, this guide does not provide a detailed deep dive into color science and color spaces, but it does expand on areas that are crucial to our Meta Quest-specific use cases.
The Meta Quest line of products has grown steadily over the years, and with that growth came the need for a consistent developer experience for display color reproduction. The decision to source display panels from multiple vendors for Go, each with a similar but visibly different native color space, solidified this requirement. Otherwise, the same content would look vastly different from headset to headset. Once established, this color space management pipeline proved useful in subsequent products, including Meta Quest, Rift, and Link.
The most important thing to understand about color management is that RGB tuples like (72,195,44) or (159,37,205) are not well-defined color values. This might seem like a pedantic comment, but consider that the color you see on a screen is formed from the combination of red, green, and blue primaries separated into subpixels, each of which has its own chromaticity and varies somewhat from device to device and pixel to pixel. Maximum (255,0,0) red on one screen may look substantially different from maximum red on another screen. A well-defined color, by contrast, is a physical property of light: a function of how its spectral composition stimulates the viewer’s retina.
The great benefit of light being a physical quantity is that we can measure the response of a panel’s color primaries and adjust how we drive them to achieve a more accurate representation of the intended color. To do that we need to specify what those intended colors are. This specification is called a color space. For nearly 25 years, the standard color space used by the industry for computer monitors (and by extension, most devices with color screens) has been the International Telecommunication Union (ITU) Recommendation BT.709, typically shortened to Rec.709. Although there are subtle differences, you sometimes see sRGB used interchangeably with Rec.709, as they share the same color primaries. sRGB was designed to be a relatively easy manufacturing target for CRTs and was later adopted for LCDs, so that the color masks of computer monitors could be somewhat consistent across the board. Rec.709 and sRGB sacrifice maximum color saturation to get there, however, and so many color spaces have been devised to expand the range of colors that can be shown, particularly for photography and cinematography. For instance, Adobe RGB is identical to Rec.709 except for a much more saturated, greener green, so that the on-screen gamut better covers the range of colors a CMYK photographic printer can reproduce. The DCI-P3 color space increases saturation on green relative to Rec.709, but also dramatically deepens the redness of the red channel to help cinematographers work with those deep tones. One of the more important new color spaces is ITU’s Rec.2020, which specifically uses pure wavelengths that can be generated from lasers to define the red (630 nm), green (532 nm), and blue (467 nm) primaries.
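To make these gamut differences concrete, here is a small illustrative sketch (not part of any Meta runtime) that compares the size of each standard’s gamut triangle on the CIE xy chromaticity diagram, using the primary chromaticities published in the respective specifications:

```python
# Sketch: comparing color gamut sizes using the published CIE xy chromaticity
# coordinates of each standard's primaries. The gamut of a three-primary
# display is the triangle spanned by its R, G, B points on the xy diagram;
# the triangle's area is a rough proxy for how many colors it can cover.

# (x, y) chromaticities of the R, G, B primaries, from the respective specs.
PRIMARIES = {
    "Rec.709":  [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
    "DCI-P3":   [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],
    "Rec.2020": [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],
}

def gamut_area(tri):
    """Area of the triangle spanned by three (x, y) chromaticity points."""
    (x1, y1), (x2, y2), (x3, y3) = tri
    return abs((x2 - x1) * (y3 - y1) - (x3 - x1) * (y2 - y1)) / 2.0

rec709 = gamut_area(PRIMARIES["Rec.709"])
for name, tri in PRIMARIES.items():
    print(f"{name:9s} area = {gamut_area(tri):.4f} "
          f"({gamut_area(tri) / rec709:.2f}x Rec.709)")
```

Note that xy triangle area is only a crude comparison, since xy space is not perceptually uniform, but it does show why Rec.2020 is regarded as a dramatically wider gamut than Rec.709.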
In addition to primaries, a color space specification defines a whitepoint, which is the chromaticity that should result when all three primaries are driven to maximum. Rec.709 and Rec.2020 both use the D65 standard, which approximates the apparent color of a black-body radiator at about 6500 kelvin. This whitepoint is a major point of frustration, as we explain later.
Note: You’ll see a recurring theme with the Meta Quest display whitepoint specifications. Even though color standards such as Rec.2020 and Rec.709 specify a D65 whitepoint, every headset we have shipped as of 2021 has used displays factory calibrated to D75. This started with the Rift CV1, and for consistency, all subsequent display selections have also used D75.
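As a rough illustration of why D75 reads bluer than D65, the sketch below (hypothetical helper code, not from any Meta runtime) converts the published xy chromaticities of the two illuminants to XYZ tristimulus values and compares their Z components, which roughly track short-wavelength (blue) energy:

```python
# Sketch: quantifying the blue shift of a D75 panel relative to D65. We convert
# the published CIE xy chromaticities of each illuminant to XYZ tristimulus
# values (normalized to luminance Y = 1) and compare the Z components.

D65 = (0.3127, 0.3290)  # xy chromaticity of CIE standard illuminant D65
D75 = (0.2990, 0.3149)  # xy chromaticity of CIE standard illuminant D75

def xy_to_XYZ(xy, Y=1.0):
    """Convert an xy chromaticity to XYZ tristimulus values at luminance Y."""
    x, y = xy
    return (x * Y / y, Y, (1.0 - x - y) * Y / y)

Z65 = xy_to_XYZ(D65)[2]
Z75 = xy_to_XYZ(D75)[2]
# D75 carries noticeably more blue energy per unit of luminance than D65.
print(f"relative blue energy, D75 vs D65: {Z75 / Z65:.3f}")
```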
The first standalone device to get color space management was Oculus Go, which sourced display panels from two separate vendors, each with its own color space. Both panels have a native whitepoint close to D75, which has a bluer tone than D65. While one vendor’s primaries are nominally Rec.709, the other vendor’s green and red primaries shift toward yellow. Without correction, this shift, combined with the bluer whitepoint, gave skin tones a noticeable sickly pallor. To balance the color spaces and bring the panels back in line with existing standards, the default color space transformation on Go treats application content as Rec.709 with a D65 whitepoint.
Go Native Color Space
Meta Quest uses OLED panels, which, thanks to the very pure colors achievable with OLED emitters, have much more saturated color primaries than Rec.709, particularly in blues and greens. The color space falls somewhere between the Adobe RGB, DCI-P3, and NTSC 1953 color spaces. Compared to Rec.2020, the red primary of the Quest OLED falls a bit short, while the green and blue primaries come closer.
Meta Quest 1 Native Color Space Comparison
OLED chemistry isn’t very consistent, so the primaries and whitepoint can drift somewhat from panel to panel. Unlike Oculus Go headsets, where we use manufacturer-specified color space data, the color response of every Meta Quest panel is measured individually at the factory using laboratory colorimeters so that we can better match color performance between the left and right panels.
Meta Quest 1 Native Color Space Variation
As with the Go panels, Meta Quest panels have a whitepoint closer to D75 than D65, which, without correction, results in a blue cast on content relative to what would be seen on a calibrated monitor.
Meta Quest 1 Native Whitepoint Comparison
Similar to Go, Meta Quest 2 uses LCD panels, but unlike the Go panels, which didn’t quite fit the Rec.709 color space, the Quest 2 displays closely follow the Rec.709 RGB primaries while still using a whitepoint that is very close to D75.
Similar to the Meta Quest 1 OLEDs, Rift CV1 uses OLED panels that can vary slightly between units. While there are minor differences between the Rift CV1 and Quest 1 panels, they are negligible. We originally never intended to perform color space conversions for the Rift CV1, and we only correct for visual artifacts such as luminance variation across the display at different brightness levels. So while on average we converge to similar results on Rift CV1, individual units can show differences that vary more than what we get on the Quest 1 output.
Similar to Go and Meta Quest 2, Rift S uses LCD panels that provide fairly accurate results. The panels used in all production units are accurate Rec.709 displays but again use the D75 whitepoint. For the PC VR runtime, Rift S was the first HMD where we employed color space conversion to make sure the content looked just as saturated as it did on the Rift CV1’s OLED panels. We discuss this further in the Color Correction Pipeline section.
Scope of Color Space Correction in VR
A color space standard normally defines a slew of aspects of the viewing experience. These include the display’s gamma curve (or transfer function), frame rate, luminance, resolution, and viewing conditions such as the brightness of the room. Some of these specifications do not apply to VR at all, while others, such as gamma curves, are handled separately from the color gamut as far as this article is concerned.
To that end, our color space management pipeline only handles the following aspects:
- Remapping R, G, B color primaries and whitepoint from the source VR app’s color space to the native display’s color space.
- Normalizing brightness to eliminate potential stereo-luminance disparity across the left & right eyes.
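The primary-and-whitepoint remapping in the first bullet boils down to a 3x3 matrix. As an illustrative sketch (the helper names are ours, not a Meta API), the code below derives an RGB-to-XYZ matrix from a color space’s primaries and whitepoint using the standard derivation; chaining a source space’s matrix with the inverse of the display space’s matrix yields the matrix that remaps app colors to the panel:

```python
# Sketch of the primary-remapping math behind a color space conversion. Given
# a color space's R, G, B primaries and whitepoint as CIE xy chromaticities,
# derive its RGB -> XYZ matrix; source_to_XYZ followed by the inverse of
# display_to_XYZ remaps app colors to the panel. Pure Python, no dependencies;
# the chromaticities below are the Rec.709 / D65 values from the spec.

def xy_to_XYZ(x, y, Y=1.0):
    return [x * Y / y, Y, (1.0 - x - y) * Y / y]

def mat_vec(m, v):
    return [sum(m[r][c] * v[c] for c in range(3)) for r in range(3)]

def inverse3(m):
    """Invert a 3x3 matrix via the adjugate (cofactor) formula."""
    a, b, c = m[0]; d, e, f = m[1]; g, h, i = m[2]
    det = a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)
    return [[(e * i - f * h) / det, (c * h - b * i) / det, (b * f - c * e) / det],
            [(f * g - d * i) / det, (a * i - c * g) / det, (c * d - a * f) / det],
            [(d * h - e * g) / det, (b * g - a * h) / det, (a * e - b * d) / det]]

def rgb_to_xyz_matrix(rx, ry, gx, gy, bx, by, wx, wy):
    """Scale the primary columns so that RGB = (1,1,1) lands on the whitepoint."""
    P = [[rx / ry, gx / gy, bx / by],
         [1.0, 1.0, 1.0],
         [(1 - rx - ry) / ry, (1 - gx - gy) / gy, (1 - bx - by) / by]]
    S = mat_vec(inverse3(P), xy_to_XYZ(wx, wy))  # per-primary scale factors
    return [[P[r][c] * S[c] for c in range(3)] for r in range(3)]

# Rec.709 primaries with the D65 whitepoint reproduce the well-known
# sRGB-to-XYZ matrix (first row approximately 0.4124, 0.3576, 0.1805).
M709 = rgb_to_xyz_matrix(0.640, 0.330, 0.300, 0.600, 0.150, 0.060,
                         0.3127, 0.3290)
for row in M709:
    print(["%.4f" % v for v in row])
```

In a real pipeline this matrix multiply must happen on linear (gamma-decoded) values, and the result may need clamping or gamut mapping when the source gamut exceeds the display’s.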
What about gamma correction?
Gamma correction is considered outside the scope of this article because it’s not handled explicitly by the color space pipeline, but directly by the GPU when reading and writing to texture buffers or render targets. Still, let’s briefly touch on it to get it out of the way.
Although sRGB is actually defined as a full color space, real-time graphics developers know it better as a gamma curve, even though the standard covers more than that, as we briefly mentioned above. Rec.709 and sRGB share the same color primaries, and we want to avoid confusion when discussing color primaries as opposed to gamma curves. To that end, when talking explicitly about color primaries and not gamma curves, we say “Rec.709.” When referring to the specialized gamma curve defined by the standard, we say “sRGB.” In reality, Rec.709 also has its own specialized gamma curve, but as it’s not used for computer imagery, few VR developers should care about it.
The factory-calibration specs of our HMD displays are straightforward. All consumer HMDs that Meta has shipped as of 2021 use either an sRGB or a gamma 2.2 curve. Although sRGB and gamma 2.2 are extremely similar, they’re not exactly the same. In most cases the difference is negligible, but in some very specific cases it becomes evident. On the PC VR runtime, we make sure that content which might use sRGB gamma is accurately corrected to gamma 2.2. However, that means we run explicit shader math to do the conversion. On the Meta Quest runtime, the hardware doesn’t have the processor cycles to spare for this seemingly minute difference, so it treats the display gamma as sRGB, leading to some subtle differences in the darkest luminance ranges.
We expect all rendered content sent from the VR app to the VR compositor to be encoded using either 8-bit sRGB or a floating-point format such as R11G11B10F or R16G16B16A16F. When the texture format is set correctly, the GPU performs the conversions when reading from and writing to these buffers, making sure all shader math and texture samples are treated linearly. Keep in mind that GPUs only know how to natively handle the sRGB gamma curve and floating-point formats, whose precision is inherently distributed in a manner similar to gamma curves. Trying to roll your own gamma compression, such as a gamma 2.4 curve similar to the Rec.709 standard, can be painful in various ways and is outside the scope of this guide.
For more information on the intricacies of the sRGB gamma curve and on handling gamma, including the Oculus Rift pipeline, see The sRGB Learning Curve.
What about display-brightness correction?