AppSW Enhancements with DirectProjection and Compositor Layer SpaceWarp

Guodong Rong, Jian Zhang, Steve Lansel
We recently announced Application SpaceWarp (AppSW) at Connect 2021, a developer-driven optimization that can give suitable apps up to 70 percent additional compute. One of the key components of AppSW is the generation of motion vectors and depth by the application. This allows an app to run at half frame rate (e.g. 36 FPS) while the compositor still works at full frame rate (e.g. 72 FPS) via frame extrapolation and depth-based reprojection with minimal artifacts. AppSW launched with our v34 release, and the response from developers has been fantastic. We are ecstatic to see a significant number of projects integrating AppSW.
One issue developers have encountered is that applications can only generate the motion vectors and depth for projection layers (3D apps). For non-projection (a.k.a. compositor) layers, such as quad layers and cylinder layers, we cannot apply AppSW. As a result, if an app is running at half frame rate with AppSW or occasionally dropping frames, stutters or double images may appear when looking at those layers while translating your head. This is because the app only updates the panel's location relative to your head at the lower rate, which causes a stale frame pose to be used in multiple consecutive compositor frames. For those who are curious as to why we use the head pose submitted by the app, please refer to the appendix for more details.

DirectProjection

Fortunately, we can solve this problem. Because the compositor has full geometry information of the compositor layers (e.g. width/height for a quad layer, and radius/FOV/aspect ratio for a cylinder layer), it does not need to apply the normal reprojection techniques such as TimeWarp or SpaceWarp, and can directly project these layers onto the final image using the most recent head pose. We call this DirectProjection. For a world-locked layer, since its pose is fixed in the app space, the compositor can safely reuse the pose submitted in the last frame, combined with the most recent head pose, to directly project it to get the final image. Fig. 1 illustrates how DirectProjection works.
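The idea can be sketched with a few lines of matrix math. The snippet below is an illustrative toy, not the actual compositor code: a world-locked quad's corners are transformed into view space with the most recent head position each compositor frame, so the panel stays anchored in the world even when the app has not submitted a new frame (for brevity, only head translation is modeled, and all names such as `project_quad_corners` are hypothetical).

```python
# Toy sketch of DirectProjection for a world-locked quad layer.
# All names here are illustrative; this is not the runtime's actual API.
import numpy as np

def view_matrix(head_pos):
    """View matrix for a head translated to head_pos (rotation omitted for brevity)."""
    m = np.eye(4)
    m[:3, 3] = -np.asarray(head_pos, dtype=float)
    return m

def project_quad_corners(quad_center, half_w, half_h, head_pos):
    """Transform the four quad corners from app space into view space
    using the *most recent* head pose, instead of reusing a stale frame pose."""
    cx, cy, cz = quad_center
    corners = np.array([
        [cx - half_w, cy - half_h, cz, 1.0],
        [cx + half_w, cy - half_h, cz, 1.0],
        [cx + half_w, cy + half_h, cz, 1.0],
        [cx - half_w, cy + half_h, cz, 1.0],
    ])
    v = view_matrix(head_pos)
    return (v @ corners.T).T  # view-space corners, one row per corner

# The head translates between app frames; the quad is fixed in app space,
# so its view-space position still updates every compositor frame.
old = project_quad_corners((0.0, 0.0, -2.0), 0.5, 0.3, head_pos=(0.0, 0.0, 0.0))
new = project_quad_corners((0.0, 0.0, -2.0), 0.5, 0.3, head_pos=(0.1, 0.0, 0.0))
```

Because the quad's geometry (center, width, height) is fully known to the compositor, this per-frame re-projection needs no new data from the app.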

Compositor Layer SpaceWarp

However, if the layer is moving with respect to the app space, and the old pose is reused, we will still see the stutter even with DirectProjection. To resolve this issue, we can utilize the same idea as SpaceWarp -- extrapolation. For the compositor layer, we do not need a per-pixel motion vector. Instead, we can simply extrapolate its pose using the last two submitted poses. This will give us super smooth images for both static and moving compositor layers, even if the app is running at half frame rate. We call this Compositor Layer SpaceWarp. Fig. 2 illustrates how Compositor Layer SpaceWarp works together with DirectProjection.
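For the constant-velocity case, the extrapolation reduces to simple pose arithmetic: repeat the translation delta, and re-apply the rotation delta between the last two submitted orientations. The sketch below illustrates this under those assumptions (function names are hypothetical; the real compositor code is not shown here):

```python
# Hedged sketch: extrapolating a compositor layer's pose from the last two
# app-submitted poses, assuming constant linear and angular velocity.
# Quaternions are (w, x, y, z); all names are illustrative.
import numpy as np

def quat_mul(a, b):
    """Hamilton product of two quaternions."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def quat_conj(q):
    """Conjugate (inverse for unit quaternions)."""
    return np.array([q[0], -q[1], -q[2], -q[3]])

def extrapolate_pose(pos0, rot0, pos1, rot1):
    """Predict the layer pose one frame ahead of (pos1, rot1)."""
    pos2 = 2.0 * np.asarray(pos1) - np.asarray(pos0)   # p1 + (p1 - p0)
    delta = quat_mul(rot1, quat_conj(rot0))            # rotation from pose 0 to pose 1
    rot2 = quat_mul(delta, rot1)                       # apply the same delta again
    return pos2, rot2 / np.linalg.norm(rot2)
```

For example, a layer that moved 10 cm and rotated 90 degrees between the last two submitted frames would be predicted to move another 10 cm and rotate another 90 degrees, filling in the compositor frame the app skipped.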

Conclusion

Both DirectProjection (v34) and Compositor Layer SpaceWarp (v35) are available to Quest developers today. Note that neither DirectProjection nor Compositor Layer SpaceWarp needs any additional information from the app; they are independent of Application SpaceWarp. These techniques are necessary for compositor layers to avoid a juddery appearance, particularly when apps are running at a low FPS.
DirectProjection and Compositor Layer SpaceWarp are valuable complements to Application SpaceWarp: they keep compositor layers moving smoothly under both HMD motion and world-locked layer motion. Both are automatically enabled when Application SpaceWarp is used.

Appendix

Before DirectProjection was introduced, the compositor used the head pose submitted by the app, and used the most recent head pose only to correct the rotation (via TimeWarp). The reason behind this is that we do not want to break the dependencies between the projection layer and the compositor layers. For example, if an app wants to insert a quad layer into objects in a projection layer, to get the correct inter-layer occlusion without depth testing, it may “punch a hole” in the projection layer (i.e. render the quad layer into the projection layer but only set the alpha value to 0, without affecting the RGB channels), and then render the projection layer after the quad layer. In this case, the translations of the projection layer and the quad layer must match each other to avoid wrong occlusion/disocclusion. Since the projection layer uses the head pose submitted by the app, we do the same for the compositor layers to guarantee the match. With AppSW, we can now use DirectProjection while still having them match each other.
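The alpha-blend math behind hole-punching can be illustrated with a toy 1-D example (grayscale pixels only; real compositing happens on the GPU with full images, and all values here are made up). The quad layer is composited first, then the projection layer is blended over it; wherever the projection layer's alpha was punched to 0, the quad shows through:

```python
# Toy 1-D illustration of the "punch a hole" trick. Illustrative only:
# four grayscale pixels, standard non-premultiplied "over" blending.
import numpy as np

def over(dst, src_rgb, src_a):
    """Blend src over dst with source alpha."""
    return src_rgb * src_a + dst * (1.0 - src_a)

background = np.zeros(4)
quad_rgb   = np.array([0.8, 0.8, 0.8, 0.8])  # quad layer covers all four pixels
proj_rgb   = np.array([0.2, 0.2, 0.2, 0.2])  # projection layer (3D scene) color
proj_a     = np.array([1.0, 1.0, 0.0, 0.0])  # alpha punched to 0 where the quad should show

out = over(background, quad_rgb, 1.0)  # quad layer composited first
out = over(out, proj_rgb, proj_a)      # projection layer blended on top
# Pixels with punched alpha keep the quad's color; the rest show the 3D scene.
```

The hole only lines up with the quad if both layers are projected with matching translations, which is exactly the dependency described above.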