Public VR Critique #4: Omega Agent

In 2015, John Carmack published a series of public critiques of the first batch of Gear VR titles to provide guidance for future VR development. We are republishing these six critiques on the Oculus Developer Blog. Note that the critiques were requested by the developer and that current versions of these titles could have been updated to address items noted in these posts.
Original Publish Date: July 29, 2015
This is my favorite Gear VR game, but you need a strong stomach to enjoy it. The general rule is that for maximum comfort in VR you want to avoid accelerating the viewpoint – either stay stationary, or move at a constant velocity. Flying around in a jetpack, you are ALWAYS accelerating. This has… consequences. I have a pretty high VR tolerance, but I usually stop playing this not because I get bored, but because I’m starting to feel uncomfortable.

The two big things the game does well:

  • I feel immersed in a virtual world, which is the whole point of VR. First person experiences with navigation do this more powerfully than stationary and third person experiences, and being able to see across the entire island gives a great sense of freedom. There is a sense of mystery when you see strange things in the world, and you know they do something, but you don’t know what yet. Discovering hidden passages that lead to interesting places is very rewarding. I feel like I want to “go back there,” which is something only a couple other games have done for me.
  • The game is full of gameplay. There are a number of titles that are really one trick ponies, and I often find myself wanting more variations on the basic gameplay. Omega Agent has a simple set of fundamental actions – get things, go places, shoot things, and race time, but it mixes them up into a lot of different missions with a good balance of multiple levels of achievement and gated tiers. The missions usually feel tensely time pressured, but you have the option of free exploration mode, which still has the collection elements in it to add some spice.

A few criticisms:

The briefing room’s wrap-around wall screen and desk monitor are very good VR in concept; you can present a lot of information that is much more intuitively accessible by just turning your head than by swiping or paging through some explicit UI. However, the technical details of the presentation hurt the legibility. During gameplay, the clean, nicely anti-aliased graphics don’t draw attention to themselves, which contributes to “seeing past the medium” and letting you get into the experience. Reading the text in the briefing room took a conscious effort, and I was keenly aware of the rendering issues.
To render the curved screen, text is first rendered to an off screen monitor buffer, which is then used as a texture on the curved mesh in the world when rendering to the two stereo eye buffers, which are in turn used as textures in the final time warp / distortion rendering to the actual screen. Each extra resampling pass guarantees some loss of crispness, and it also gives quality a few more chances to slip.
It is hard to tell with the extra level of resampling, but it looks like the off screen texture is being point sampled, based on seeing tiny head movements cause multiple pixel jumps in the text. Changing this to linear filtering will make it a little blurrier, but it will be smoothly continuous, and easier to read.
It might be better to get rid of the intermediate texture buffer altogether and just have three angled flat panels for the display, rendered directly to the eye buffers, but if you stick with an intermediate buffer, this sort of static positioning can be optimized for best resolution. 12 pixels per degree of visual arc will get you close for the standard 1024 pixel eye buffers, but it is useful to do a mip map test when you really care about sampling resolution. Build a texture of your chosen resolution, but add mip map levels to it and explicitly put red in mip level 0, green in mip level 1, blue in mip level 2, and continue with other colors all the way down if you don’t explicitly set a max mip level. Enable trilinear filtering, and run the game to look at it. If it shows up as solid red, you should increase the texture resolution. Ideally you will see solid red in the middle of the screen when looking right at it, shading to a little bit of green towards the edges of the screen. This means that the texture is slightly oversampled in the middle, going to slightly undersampled at the edges, which is usually the compromise you want. Err on the side of more red, rather than more green – a little blurrier is better than a little more aliased.
With the buffer optimally sized, you can then make sure that the font and images you are rendering to it are mapped 1:1 from texels to pixels. In that case, it is ok to use nearest sampling to avoid the chance of messing up the half texel offset for texcoords and getting middle-filtered values.
Even with optimal sampling and resolution, the text should still be bigger for concentration-free reading. Designers used to building games for crisp 2D monitors need to recalibrate for the relatively low resolution and blurry world of VR. If you are an old-timer, composite CRT TV is a good reference point. Fewer words, bigger text.
Some decent quality voice acting would add a lot to the experience, obviating the need to read text altogether for some people, leaving it as just extra flavor for the game, rather than a task for the user.
One other trivial thing in the briefing room – there is a poster on the wall that is partially occluded by the desk lamp. You immediately want to lean a little to the side to get a better look at the poster, which is when you are reminded that Gear VR doesn’t have position tracking. Avoid making the user want to do something that you aren’t going to be able to let them do. The lighting is also a little odd – when the lights dim, the poster goes completely black. The bright monitor implies a lot of global illumination, so I would stop the fade short of zero for a little ambient illumination.
I think the standard control mode should be swivel chair, or at least the options should be presented more prominently. I had some trouble finding how to set the mode when I played. Stick yaw turning is so uncomfortable (on top of an intrinsically uncomfortable game), and it also hurts the immersion a lot. I worry that people who try to play that way will have a poor experience.
I know the “static dome” idea came from someone at Oculus as a suggestion for adding comfort with stick yaw, but I do not endorse that technique at all – I think it does quite a bit of damage to the experience.
Almost everyone is going to see how high they can fly, so it is worth a little effort to fix the horizon. I’m not sure how much the current visual effect is due to the dome or just far Z clipping, but it doesn’t look right. Fiddle around a bit to get sky and water to accurately meet at the horizon.
Sound is pretty good, but variable wind noise would be an effective speedometer above a certain threshold velocity, and terrain specific landing sounds would be nice.
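As a sketch, the speed-to-wind mapping could be a simple clamped ramp (the threshold and full-volume speeds below are made-up tuning numbers, not values from the game):

```c
#include <assert.h>

/* Wind loudness in [0, 1]: silent at or below `threshold`, ramping
   linearly to full volume at `full`. Both values are tuning knobs. */
static float wind_gain(float speed, float threshold, float full) {
    if (speed <= threshold)
        return 0.0f;
    float g = (speed - threshold) / (full - threshold);
    return g > 1.0f ? 1.0f : g;
}
```

Feeding the result into the wind loop's volume gives a continuous audio speedometer without any HUD element.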
The one thing that didn’t really work for me with the flat shaded aesthetic was the opaque polygonal smoke clouds after enemy kills – I was puzzled about the “floating rocks”.
Looking past 90 degrees up or down does a flip-turn. Consider looking at both the forward and up vectors together to disambiguate.
The side indicators on the jetpack cage are too far at the edge of the field of view. There is some kind of hysteresis effect that sometimes has one side or the other pushed nearly off screen.
There is a little patch of overlapped polygon depth fighting in the side passage off from one of the road tunnels under the city.
When a silo-pipe LOD object off to the right of the waterfall pool disappeared, it left a visible geometry hole. If you were right over it looking down when it went away you saw solid ground, but when far off to the side it had an open hole. Most of the LOD objects just pop in and out, but that one fades in and pops out, presumably due to also being a portal of some kind. The LOD popping doesn’t bother me terribly, but fading everything in and out would look better if it wasn’t a large performance hit. For a game like this that is not going to be texture or shader bound, there are likely optimizations to be had in the vertex attribute packing if you need to claw some performance back.