Build Faster, Earn More: Agentic Tooling for VR

The same qualities that make VR uniquely immersive also make it uniquely demanding to develop for: you constantly juggle code, documentation, and hardware while learning immersive concepts that differ from flat-screen development. We know many developers are now leveraging AI to accelerate their workflows, and we want to bring those accelerants to building for VR, from prototyping to feature integration to performance profiling and more.
In our kickoff post, we laid out the two commitments to help developers Build Faster and Earn More. This is the first entry in our three-part "Build Faster" series, designed to help you move from idea to immersive experience at the speed of thought.
TL;DR
Build VR apps faster with new agentic tools that have direct access to Meta's VR documentation, VR-specific knowledge, devices, and platform tools.
  • Instant VR Expertise: Use hzdb and drop-in agentic skills to connect your AI directly to the Quest knowledge base.
  • Agentic Workflow Use Cases: Streamlined rapid prototyping, integrating interactions, and performance optimization.
  • Platform Support: Works with Cursor, Claude, Copilot, open-source models, and more across Unity, Android, and WebXR. Getting started guides below.

The Foundation: CLI, MCP and Agentic Skills

Whether your company is fully AI-native or just starting to explore the space, AI-assisted development can greatly accelerate your existing workflows. When you combine agentic workflows with our suite of AI tools, your agents can understand VR specifics like hands and passthrough, troubleshoot frame drops on your Quest 3, and quickly create immersive prototypes. We built Horizon Debug Bridge (hzdb) to close this loop.

Connecting agents to VR development via hzdb, Meta's CLI and MCP

Hzdb gives both you and your LLM access to our knowledge base, your headset, Quest developer tools, performance trace analysis, the Meta 3D asset library and more. These tools fit seamlessly into your preferred workflows and are accessible with your favorite AI coding assistants, including Cursor, Claude, Gemini, and VS Code. When you use hzdb through your AI agent, the MCP client requests the list of available tools from the server and forwards your query, along with tool metadata, to the LLM. The LLM determines which tools to call, the client invokes them through the server, and the results are sent back to the LLM to generate a response.
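The request/response loop described above can be sketched in a few lines of Python. Everything here is illustrative: the tool names (`search_docs`, `capture_trace`) and the keyword-based tool picker are stand-ins for hzdb's real tools and the LLM, not actual APIs.

```python
# Hypothetical sketch of the MCP tool-call loop described above.
# Tool names and dispatch logic are illustrative, not hzdb's real API.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Tool:
    name: str
    description: str
    run: Callable[[str], str]

# A toy "server" exposing tools the client can list and invoke.
SERVER_TOOLS = [
    Tool("search_docs", "Search the Quest knowledge base", lambda q: f"docs for: {q}"),
    Tool("capture_trace", "Capture a Perfetto trace", lambda q: "trace captured"),
]

def list_tools():
    # Step 1: the MCP client asks the server for its available tools.
    return [(t.name, t.description) for t in SERVER_TOOLS]

def pick_tool(query: str) -> str:
    # Step 2: stand-in for the LLM choosing a tool from the metadata.
    return "capture_trace" if "trace" in query else "search_docs"

def handle(query: str) -> str:
    list_tools()                                  # tool metadata goes to the LLM
    chosen = pick_tool(query)                     # the LLM decides which tool to call
    tool = next(t for t in SERVER_TOOLS if t.name == chosen)
    result = tool.run(query)                      # the client invokes it via the server
    return f"Answer based on {chosen}: {result}"  # result flows back to the LLM

print(handle("how do I enable hand tracking?"))
print(handle("capture a perf trace"))
```

A real MCP client speaks JSON-RPC to the server and forwards structured tool schemas to the model, but the shape of the loop (list, choose, invoke, respond) is the same.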

Give your agent VR Expertise with Quest-specific skills

We provide Quest-specific skills in our Agentic Tools Repo to give your agent VR expertise from day one. Skills are modular capabilities that extend an agent's functionality and are used automatically when relevant. Our skills cover everything from project setup to tool installation to performance optimization and more. We're continually adding new skills, so star the repository to keep up with the latest updates.

Agentic Acceleration Use Cases

It can be daunting to know where to start with agentic tools, even when you are familiar with the workflows. Here are some quick wins you can try today to accelerate your VR workflows with AI.

Go from idea to playable prototype in a fraction of the time

Validating a new idea or game mechanic usually means hours of boilerplate setup before you can even test whether the concept works. That's why prototyping is one of the best use cases for agentic workflows: quickly validating new ideas or additions to your existing game. The combination of our suite of AI tools and the Meta Asset Library (over 1.5 million royalty-free 3D assets with varying levels of detail) helps you go from idea to prototype in less than a day.
Here are two videos that show you step by step how to quickly build prototypes with AI tools:

Add VR features like hand interactions without the ramp-up time

Input is one of the key features that sets VR apart. Our Interaction SDK makes it easy for developers to support VR-specific interactions like controllers or hands. For those using agentic workflows, our AI tools help your agent better understand ISDK architecture and integration, allowing it to implement inputs properly. It can configure hand interactions, set up grab and ray interactors, and connect input mappings based on your scene's needs. What used to potentially take days of documentation reading and trial-and-error now takes a fraction of the time.
Whether you are exploring hands support for your controller-based VR game or want to quickly spin up high quality inputs in a new game, try out the following prompts:
  • Existing VR project using ISDK -> use the prompt below to implement hands in part of your game to test it out -> "Enable Hand Tracking support"
  • New VR project -> use the prompt below to add the comprehensive ISDK rig to your project -> "Add Interaction SDK support for hand and controller interactions", then prompt for which objects you'd like to make interactable (e.g., "make the frisbee grabbable", "make the UI raycastable", etc.)

Pinpoint performance bottlenecks and quickly analyze messy trace data

VR is a compute-intensive platform, and developers often push our devices to the max. Good performance optimization is essential to a great VR experience, but it can be time-consuming and difficult: identifying bottlenecks, parsing traces, and root-causing frame spikes all require specialized knowledge.
The Horizon Debug Bridge provides performance tracing and analysis tools via the "perf" command, identifying bottlenecks in seconds rather than hours. Once your project is connected, run "hzdb perf capture --duration 10000 -o my_perf_trace" in your CLI to capture a 10-second Perfetto trace from your headset. You can then analyze it by prompting Claude: "I captured a Perfetto trace with session ID `my_perf_trace` while playing my game on Quest. Are any frames missing the 72fps target? What are the top 3 things eating the most frame budget? Break it down by GPU vs CPU." hzdb reads the trace and gives you a full breakdown of what's happening on your headset, including frame timing, GPU costs, thread scheduling, and where the bottlenecks are. You can dive deeper into performance optimization by checking out the Optimize Unity Meta Quest VR game performance with AI guide.
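For intuition on the frame-budget math behind a question like "are any frames missing the 72fps target?", here is a minimal illustrative sketch (not hzdb's actual analysis or output format): at 72 fps each frame has roughly 13.9 ms of budget, and any frame that takes longer is a miss.

```python
# Illustrative frame-budget check, assuming per-frame durations in
# milliseconds. This is a sketch, not hzdb's real trace analysis.
TARGET_FPS = 72
FRAME_BUDGET_MS = 1000.0 / TARGET_FPS  # ~13.89 ms per frame at 72 fps

def frames_over_budget(frame_times_ms):
    """Return (index, duration_ms) for each frame exceeding the budget."""
    return [(i, t) for i, t in enumerate(frame_times_ms) if t > FRAME_BUDGET_MS]

# Example: mostly on-budget frames with two spikes.
trace = [12.1, 13.0, 18.4, 12.9, 25.7, 13.2]
misses = frames_over_budget(trace)
print(f"{len(misses)} of {len(trace)} frames missed the {TARGET_FPS} fps target")
for i, t in misses:
    print(f"  frame {i}: {t:.1f} ms (budget {FRAME_BUDGET_MS:.2f} ms)")
```

The real Perfetto trace also carries GPU timings and thread scheduling, which is what lets the agent split the overruns into CPU- versus GPU-bound work.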

Get Started with Agentic Tools Today

This post covered the foundational aspects of our agentic tools and how to leverage them for some key use cases. To get started, check out our tutorials and other education guides below across our supported build paths:

Unity Developers

Android and Mobile Developers

WebXR Developers

Next up in the Build Faster, Earn More series: a Unity-specific deep dive covering AI Gateway, Meta XR Extensions, and how to go from empty project to running on headset with agentic workflows. Check back soon and follow along all quarter on Facebook and X.
To get all of the latest updates delivered straight to your inbox, subscribe to our monthly VR developer newsletter.