As many of you might have already seen, we introduced Application SpaceWarp in our Connect 2021 session. Application SpaceWarp (AppSW) is a developer-driven optimization technology that can unlock extra computation power for suitable content. In our initial testing, Application SpaceWarp gave applications up to 70 percent additional compute, potentially with few to no perceptible artifacts.
With the extra CPU/GPU performance that AppSW provides, we believe you can take your VR experiences to the next level. Now, with the V34 OS and SDK release, the feature is ready for full production use. We can’t wait to see you test it in your applications and share your hands-on experience with us.
Here is an overview of the technology and an introduction to the full developer package.
What is Application SpaceWarp?
Application SpaceWarp allows an application to render at half the actual display refresh rate, for example 36 FPS instead of 72 FPS. The application must render a motion vector buffer and a depth buffer in addition to the standard eye buffer, which our systems then use to synthesize new frames and still output 72 FPS to the display.
Just how much help does it provide?
When an application can render at half the rate of the display, it may seem that it would now have twice the computation budget it had before. However, AppSW naturally has its own overhead; both the motion vector generation and the frame synthesis take some time, with motion vector generation adding to the application’s total GPU cost and frame synthesis adding to our compositor’s total GPU cost.
Thankfully, the overhead usually takes away only a small portion of the benefit. In our initial testing with a first-party application, we saw up to 70% more compute budget to use.
How does it work?
To synthesize a new frame from an old frame, we need to know where every pixel will be at a future point in time on the HMD display. One way to do this is through frame extrapolation and frame reprojection, with the help of motion vector data and depth data. The motion vector describes each pixel’s velocity, so we can use it to predict where the pixel will be in the near future. The depth buffer tells how far the pixel is from the rendering camera, which is used for depth-based reprojection to reduce HMD latency.
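To make the extrapolation idea concrete, here is a minimal, purely illustrative sketch in C. The types, function name, and the choice of normalized screen-space units are assumptions made for the example, not runtime code:

```c
/* Illustrative only: predict where a pixel will be at the synthesized frame's
 * display time, given where it is now and how far it moved over the last
 * rendered frame (its motion vector). */
typedef struct { float x, y; } Vec2;

/* t is the fraction of a rendered-frame interval to extrapolate forward;
 * for a synthesized frame halfway between two rendered frames, t = 0.5. */
static Vec2 extrapolate_pixel(Vec2 current, Vec2 motionPerFrame, float t)
{
    Vec2 predicted = { current.x + motionPerFrame.x * t,
                       current.y + motionPerFrame.y * t };
    return predicted;
}
```

Depth-based reprojection then corrects the result for the latest head pose, which is where the depth buffer comes in.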
There are many ways to generate motion vectors, but they mostly fall into two broad categories:
Performing motion estimation by analyzing previous (history) frames
Having the application render them analytically
AppSW uses the second approach. The generated data is much more reliable (essentially the ground truth) and can be much higher resolution than with the first approach. This technique is also widely adopted by the real-time rendering industry. If you’re familiar with standard graphics features like motion blur or temporal anti-aliasing, those techniques generate motion vectors (also called velocity buffers) in a very similar way to what we use. This makes AppSW a natural fit with game engines and 3D rendering applications.
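As a rough illustration of the analytic approach, the sketch below projects the same object-space position with this frame’s and last frame’s model-view-projection matrices and takes the difference after the perspective divide; dynamic objects also use their previous model matrix, which is why the application can produce exact values. The structs and helpers are invented for the example and are not an engine or SDK API:

```c
/* Minimal sketch of analytically generated motion vectors (the same idea used
 * by motion blur and TAA). Row-major 4x4 math helpers are defined inline so
 * the example is self-contained. */
typedef struct { float x, y, z, w; } Vec4;
typedef struct { float m[4][4]; } Mat4;

static Vec4 mat4_mul_vec4(const Mat4* a, Vec4 v)
{
    Vec4 r;
    r.x = a->m[0][0]*v.x + a->m[0][1]*v.y + a->m[0][2]*v.z + a->m[0][3]*v.w;
    r.y = a->m[1][0]*v.x + a->m[1][1]*v.y + a->m[1][2]*v.z + a->m[1][3]*v.w;
    r.z = a->m[2][0]*v.x + a->m[2][1]*v.y + a->m[2][2]*v.z + a->m[2][3]*v.w;
    r.w = a->m[3][0]*v.x + a->m[3][1]*v.y + a->m[3][2]*v.z + a->m[3][3]*v.w;
    return r;
}

/* Motion vector in normalized device coordinates for one vertex/pixel:
 * the difference between where it projects this frame and where it projected
 * last frame. currMVP/prevMVP include each object's own previous transform. */
static void motion_vector_ndc(Vec4 objectPos,
                              const Mat4* currMVP, const Mat4* prevMVP,
                              float outMotion[2])
{
    Vec4 currClip = mat4_mul_vec4(currMVP, objectPos);
    Vec4 prevClip = mat4_mul_vec4(prevMVP, objectPos);
    outMotion[0] = currClip.x / currClip.w - prevClip.x / prevClip.w;
    outMotion[1] = currClip.y / currClip.w - prevClip.y / prevClip.w;
}
```

In practice this math runs in the application’s shaders and writes into the dedicated motion vector buffer, which is why the data is essentially exact rather than estimated.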
For more details on how exactly motion vectors are generated in code, please reference our developer guide (Unreal, Unity, Native). The following image shows what the motion vector and depth data look like in action. Please note that the motion vector and depth data required by AppSW can be much lower resolution than the color eye buffer, which is one of the reasons why the motion vector generation overhead is very low.
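For native developers, the runtime also reports a recommended motion vector buffer size through the extension’s system properties. The following is a minimal sketch assuming the XR_FB_space_warp extension has been enabled at instance creation; the struct and field names follow the published extension, but please verify them against the OpenXR spec, and error handling is omitted:

```c
/* Sketch: query the recommended (typically low) motion-vector image size
 * before creating the motion-vector and depth swapchains. */
#include <openxr/openxr.h>

void query_motion_vector_size(XrInstance instance, XrSystemId systemId,
                              uint32_t* outWidth, uint32_t* outHeight)
{
    XrSystemSpaceWarpPropertiesFB spaceWarpProps = {
        .type = XR_TYPE_SYSTEM_SPACE_WARP_PROPERTIES_FB };
    XrSystemProperties systemProps = {
        .type = XR_TYPE_SYSTEM_PROPERTIES, .next = &spaceWarpProps };

    xrGetSystemProperties(instance, systemId, &systemProps);

    *outWidth  = spaceWarpProps.recommendedMotionVectorImageRectWidth;
    *outHeight = spaceWarpProps.recommendedMotionVectorImageRectHeight;
}
```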
When an XR application is rendering at a lower frame rate, latency is one of the biggest challenges for a comfortable experience, since the frame pipeline takes longer and pose prediction becomes less accurate. To reduce latency when using AppSW, we delivered a full pipeline optimization with the following technologies:
Phase Sync: Coordinates the application frame and makes it start at the right time according to the application's workload. HMD and controller sensor reading is performed as late as possible to reduce both HMD and controller latency. Please see our Phase Sync blog post for more details.
Late Latching: Further delays the sensor reading time to the end of the CPU render frame, which saves one frame of latency. Please see our Late Latching blog post for more details.
Positional TimeWarp (PTW): Asynchronous TimeWarp, which is enabled in every Quest application, corrects HMD rotational error right before display. PTW uses depth buffer data to further correct HMD translational latency, and it is automatically enabled for AppSW applications. From a pure HMD latency point of view, we can even say that AppSW apps with PTW have better latency than full-FPS applications without it.
From a timeline perspective, the following diagram shows where each technology sits in the frame pipeline.
Overall, Application SpaceWarp is not just about frame synthesis. It is a set of technologies that work together to deliver the best half-refresh-rate XR experience.
Limitations and Best Practices
Application SpaceWarp is a very powerful tool. But we want to make it clear that AppSW is not a silver bullet and may not be suitable for every type of application. Ultimately, it is the developer's responsibility to decide how and when to use this tool and to thoroughly test the application to confirm there are no graphics glitches or regressions in the application.
There is a lot we could talk about on this specific topic, like transparency rendering, motion vector accuracy, controller latency, or image distortions. The fundamental approach is to understand the basic theory behind the technology, which can not only help you identify potentially problematic uses of AppSW, but also help you figure out workarounds and solutions. We encourage you to check out our Developer Guide and its Best Practices section.
Full Developer Package
With the V34 OS and SDK release, the full AppSW developer package is available. Here’s what’s included:
Unity: Unity’s AppSW support is hosted on our GitHub. We provide a reference implementation showing how you can integrate the technology into your application. Check here for details.
Unreal Engine: We shared a one-checkbox solution on our UE4.27-V34 branch. Check here for how to use it.
Native Application: Our AppSW extension is published in the official OpenXR spec. To understand it better at the source code level, we included a source code sample, XrSpaceWarp, in our SDK package; a brief frame-submission sketch also follows this list. Check here for our Native developer guide.
Developer Education: We highly recommend reading our comprehensive AppSW Developer Guide (Unreal, Unity, Native), which covers almost every aspect of the technology, from technical explanation to integration to best practices.
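For native developers, here is a rough sketch of how per-view motion vector and depth data are handed to the runtime through the XR_FB_space_warp extension (this is the sketch referenced in the Native Application item above). It assumes the extension was enabled at instance creation and that the motion vector and depth swapchains already exist; the surrounding names are placeholders, and the struct fields should be verified against the OpenXR spec and the XrSpaceWarp sample:

```c
/* Sketch: attach XR_FB_space_warp data to one projection view before the
 * frame is submitted with xrEndFrame. Placeholder code, not SDK source. */
#include <openxr/openxr.h>

void chain_space_warp_info(XrCompositionLayerProjectionView* projView,
                           XrCompositionLayerSpaceWarpInfoFB* spaceWarpInfo, /* must stay alive until xrEndFrame */
                           const XrSwapchainSubImage* motionVectorSubImage,
                           const XrSwapchainSubImage* depthSubImage,
                           float nearZ, float farZ)
{
    spaceWarpInfo->type = XR_TYPE_COMPOSITION_LAYER_SPACE_WARP_INFO_FB;
    spaceWarpInfo->next = NULL;
    spaceWarpInfo->layerFlags = 0;
    spaceWarpInfo->motionVectorSubImage = *motionVectorSubImage;
    /* Identity unless the application moved its reference space this frame. */
    spaceWarpInfo->appSpaceDeltaPose = (XrPosef){ { 0.0f, 0.0f, 0.0f, 1.0f },
                                                  { 0.0f, 0.0f, 0.0f } };
    spaceWarpInfo->depthSubImage = *depthSubImage;
    spaceWarpInfo->minDepth = 0.0f;    /* value range stored in the depth image */
    spaceWarpInfo->maxDepth = 1.0f;
    spaceWarpInfo->nearZ = nearZ;      /* projection planes used to interpret depth */
    spaceWarpInfo->farZ  = farZ;

    /* The runtime picks this up from the projection view's next chain. */
    projView->next = spaceWarpInfo;
}
```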
Conclusions
When utilized correctly, Application SpaceWarp is a powerful feature that can give applications significant additional compute. There may be corner cases or parts of your application that produce artifacts other than those we mention in our documentation. You can choose to mitigate them by adjusting content or finding your own workarounds. The key is to understand the technology and your application’s graphics pipeline. Releasing AppSW is just the beginning of the journey. We’re anticipating questions, feedback, and bug reports from you, all of which are appreciated. We are very much looking forward to collaborating with you.
FAQ
What is the difference between AppSW and Rift ASW?
If you have paid close attention to the rendering features Oculus has supported over the years, you may be familiar with PC ASW, which was previously released for Oculus Rift apps. Even though there are a few similarities, we want to call out that PC ASW is a very different technology from the Application SpaceWarp we’re introducing today.
PC ASW uses motion vectors generated by analyzing historic frames, whereas AppSW uses application-generated motion vectors. Since the application has all the information about object movement, its motion vectors don’t rely on any estimation or guesswork. As a result, AppSW’s motion vectors are of significantly higher quality, much closer to the “ground truth,” which leads to better overall visual quality.
At the feature-type level, PC ASW is a system feature that is triggered automatically on low-spec machines. Application SpaceWarp, on the other hand, is a completely developer-driven feature: developers manage the feature’s activation at the development/design stage. More than that, developers have complete control over every single motion vector and depth value, which leaves a lot of room for developer innovation, e.g. in optimizations and in handling challenging cases.
Ultimately, AppSW gives you, the developer, both significantly more power and significantly more responsibility. Even when using PC ASW, you were still targeting an experience that most end users would run at full frame rate; ASW was just something that kicked in when needed, so the extra compute power it gave you would not be used very often.
With AppSW, on the other hand, you now have the power to run your app at half rate all the time. This gives your app the possibility of a significantly higher compute budget, but it also means that any AppSW artifacts become much more serious, since they will apply to all users of your app, not just a small percentage.
Will I always get 70% extra compute?
Not necessarily always, but hopefully close. AppSW’s efficiency will be affected by many variables, like scene complexity, graphics complexity, targeted frame rate, display resolution, and more. The exact magnitude of the performance win will vary with your actual content. Our “70%” figure was derived from testing some internal UE4 titles on Quest 2. We think these titles are representative of general content, but you might get different numbers for your content, more or less than 70%.
Which refresh rates does AppSW support?
AppSW supports all available refresh rates on Quest 1 and Quest 2 headsets (72/90/120), and the corresponding half-rate AppSW application render rates are 36/45/60. It’s the developer’s choice to decide on a target refresh rate.
When targeting a higher refresh rate, the overall latency will be better and AppSW will have fewer artifacts. However, since AppSW’s compositor overhead is fixed, the AppSW overhead will be higher as a percentage of the frame budget.
As an example, the 70% extra compute figure is the difference in compute between a standard full-rate 72 FPS app and a 36/72 AppSW app.
The performance win will be smaller if you choose higher refresh rates; i.e., 45/90 AppSW will give you less of a percentage win than 36/72, and 60/120 even less.
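For intuition, here is a purely illustrative calculation, not measured data. The per-frame overhead below is a made-up placeholder chosen so the 72 Hz case lands near the ~70% figure quoted above; the only point is that a roughly fixed overhead eats a larger share of the smaller frame budgets at higher refresh rates:

```c
/* Illustrative only: relative compute gain of half-rate rendering when the
 * per-frame AppSW overhead is roughly constant. overheadMs is hypothetical. */
#include <stdio.h>

int main(void)
{
    const double overheadMs = 4.0;                       /* placeholder, not a measurement */
    const double refreshRates[] = { 72.0, 90.0, 120.0 };
    for (int i = 0; i < 3; ++i) {
        double fullBudgetMs  = 1000.0 / refreshRates[i];            /* budget without AppSW */
        double appswBudgetMs = 2.0 * fullBudgetMs - overheadMs;     /* budget with AppSW    */
        printf("%3.0f Hz: %5.1f ms -> %5.1f ms per app frame (~%.0f%% extra)\n",
               refreshRates[i], fullBudgetMs, appswBudgetMs,
               100.0 * (appswBudgetMs / fullBudgetMs - 1.0));
    }
    return 0;
}
```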
Is late-latching mandatory for AppSW?
No, but we encourage developers to enable late-latching to improve latency, especially controller latency.
UE4 late-latching has been available since UE4.23 on the Oculus GitHub, and Unity late-latching is in the experimental feature stage. To avoid complicating the integration process in your Unity apps, we recommend first making sure AppSW works correctly in your application, and only afterwards enabling late-latching on top of it. Please let us know if you see any bugs with either feature.
Is OpenXR mandatory for AppSW?
Yes, we only support AppSW under OpenXR, since we are fully committed to supporting OpenXR APIs moving forward. In Unity apps for example, you can still use the Oculus XR Plugin, but have it use an OpenXR backend.
Is Vulkan required for AppSW?
Our UE4/Unity AppSW integration is only supported for apps using Vulkan + Multiview.
Technically, if you are a native developer using GLES, AppSW will still work, but supporting late-latching will require Vulkan’s memory model, so we generally recommend using AppSW under Vulkan.
Is AppSW an “all-or-nothing” feature?
No. You can turn AppSW on or off at any time, with per-frame granularity. For example, in the simple Unity test app that ships with our branch on GitHub, hitting the “B” button toggles AppSW on/off per frame.
As a result, don’t feel pressured to have the feature either always on or always off. There can certainly be some scenes in your apps where you have it on, and other scenes where you have it off.
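For native OpenXR apps, the equivalent of that toggle is simply whether the space warp info is chained onto the frame’s projection views. Here is a hedged sketch using the placeholder names from the earlier native example:

```c
/* Sketch: per-frame AppSW toggle in a native OpenXR app (placeholder names).
 * spaceWarpInfo[] are XrCompositionLayerSpaceWarpInfoFB structs filled as in
 * the earlier sketch; projViews[] are submitted with xrEndFrame. */
#include <openxr/openxr.h>
#include <stdbool.h>
#include <stdint.h>

void apply_appsw_toggle(bool appSpaceWarpEnabled,
                        XrCompositionLayerProjectionView* projViews,
                        const XrCompositionLayerSpaceWarpInfoFB* spaceWarpInfo,
                        uint32_t viewCount)
{
    for (uint32_t eye = 0; eye < viewCount; ++eye) {
        /* Chain the space warp info only for frames where AppSW should be on. */
        projViews[eye].next = appSpaceWarpEnabled ? (const void*)&spaceWarpInfo[eye] : NULL;
    }
    /* ...then submit the projection layer via xrEndFrame as usual. */
}
```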