Accelerate VR Development with AI & Immersive Web SDK

Last year at Meta Connect, we launched the Immersive Web SDK (IWSDK), our open-source framework for building VR on the web. It handles the hard parts of VR development (physics, hand tracking, grab interactions, movement, and spatial UI) so you can focus on your idea and on being creative.
Today, we’re introducing a fully integrated agentic workflow for IWSDK to further accelerate your VR development. The AI writes code, tests interactions, and fixes bugs alongside you. Developers, game designers, artists, and everyone in between can open an IWSDK project in an AI coding tool and go from idea to working VR experience without writing a line of code. No headset needed.
To test the efficacy of IWSDK with agentic workflows, we revisited one of our showcase WebXR apps, Project Flowerbed. This immersive VR gardening experience delighted users and inspired developers when it launched, and its original implementation required tens of thousands of lines of custom code.
Using these agentic workflows, we leveraged Project Flowerbed's existing creative assets and rebuilt the app's codebase from the ground up in only 15 hours. This isn’t fixing a typo or generating boilerplate. It’s a full, interactive VR experience for web, rebuilt by AI using IWSDK.
If you love making VR experiences, want to start prototyping for VR, or are looking to understand how to incorporate AI into your workflows, consider this your invitation to start.

Why Build VR on the Web?

WebXR (the open web standard for running VR experiences in a web browser) lets you ship to desktop and all VR devices at once, and users can access your experience instantly through a link. No app store, no downloads. On Meta Quest alone, an average of one million users every month engage with WebXR experiences.
The web is also uniquely suited for AI-assisted development. There's no lengthy compile step, so the AI can write code, reload, and see results instantly, iterating through multiple cycles without your input. That makes it a great place to start, especially if you're new to VR development.
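To see how little ceremony the web needs, here is a minimal sketch of VR session entry using the standard WebXR API. The `XRLike` interface is just a testable stand-in for the browser's `navigator.xr` object; in a real page you would pass `navigator.xr` and call this from a user gesture such as a button click.

```typescript
// Minimal WebXR entry point sketch. WebXR is a standard browser API;
// XRLike mirrors the two navigator.xr methods this sketch needs.
interface XRLike {
  isSessionSupported(mode: string): Promise<boolean>;
  requestSession(mode: string): Promise<unknown>;
}

async function enterVR(xr: XRLike | undefined): Promise<unknown | null> {
  if (!xr) return null; // this browser has no WebXR at all
  if (!(await xr.isSessionSupported("immersive-vr"))) {
    return null; // e.g. desktop with no device or emulator attached
  }
  return xr.requestSession("immersive-vr"); // prompts the user and starts VR
}
```

Because the session request is just an async call, an AI agent (or the IWER emulator) can drive this same path headlessly during development.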

The Agentic Loop

In practice, agentic workflows mean the AI does more than generate code; it also tests and validates it. This closed-loop system is essential for high-quality, reliable results. IWSDK’s AI integration closes this loop entirely, offering developers maximum productivity.
IWSDK works with your favorite AI coding assistants, including Claude Code, Cursor, GitHub Copilot, and Codex. With these tools, developers can build high-quality VR experiences with IWSDK using the full agentic loop:
  • Knowledgeable code generation: Agents generate high-quality code that follows framework patterns and best practices because they can access and traverse the complete IWSDK documentation and codebase.
  • Scene understanding and testing: Agents take screenshots of the VR scene and understand the position, placement, and name of every object, so they can make informed decisions. Beyond understanding the scene, agents can interact with the world: the AI moves around, picks up objects, presses buttons, and simulates hand gestures, testing whether things work in a dynamic VR environment.
  • Validation: When something is wrong, the agent can make changes on its own and retest, cycling through this loop until it finds a solution. The agentic loop also makes it easier for anyone on your team to fine-tune the experience with feedback like “that feels too fast” or “move those closer together.” The agent maps your words to the right objects, adjusts their settings, and visually confirms the change.
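The loop the agent runs can be reduced to a small skeleton. The `run` and `fix` callbacks below are hypothetical stand-ins for what the assistant actually does (screenshot and inspect the scene, then edit code and reload); only the retry shape itself is the point.

```typescript
// Sketch of the agentic validate-and-retest cycle described above.
interface Check {
  run(): boolean; // e.g. screenshot the scene and verify the expected state
  fix(): void;    // e.g. patch the code and reload, ready to retest
}

function agenticLoop(check: Check, maxAttempts = 5): number {
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    if (check.run()) return attempt; // validation passed on this attempt
    check.fix();                     // adjust and go around again
  }
  return -1; // out of attempts: hand the problem back to the human
}
```

The cap on attempts matters: when the agent can't converge, it stops and asks for guidance instead of thrashing.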

Getting Started: Try it for yourself

The best way to understand the value of the agentic loop and how you can apply it to your VR experience is to try it yourself. IWSDK makes it simple to dive in and start building:
  1. Create your project. Run one command (npm create @iwsdk) and pick which AI tool you use (Claude Code, Cursor, GitHub Copilot, or Codex). Everything is automatically configured with no extra setup.
  2. Describe what you want. Open the project in your AI tool and provide your idea as a prompt, such as: "Build me an RPG-style VR game with neon blocks and a laser sword."
  3. Watch it build. The AI writes the code, loads the scene, tests interactions, finds bugs, fixes them, and repeats, all without you touching anything. You can also enable the collaboration mode to work with AI together inside the same browser tab.
  4. Refine by prompting. Say: "The blocks feel too fast." The AI adjusts the settings, confirms the change with you, and saves it.
  5. Share with a link. Ask your AI coding tool to help you deploy to any hosting provider (GitHub Pages, Vercel, Netlify, etc.) and send the URL. Anyone can open it in VR.
What you need:
  • Node.js 20+
  • Chrome or Edge browser
  • An AI coding tool (Claude Code, Cursor, GitHub Copilot, or Codex)
  • Recommended: A Meta Quest headset (Quest 3, 3S, or Pro) for on-device testing, but not needed to get started
Run npm create @iwsdk in a terminal, open the project in your AI coding tool, and describe what you want to build. For detailed instructions, see our step-by-step guide!
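In full, the terminal side of the steps above looks like this. The folder name comes from your answers to the scaffolder's prompts, and `npm run dev` assumes the standard Vite script name; check the generated package.json if yours differs.

```shell
# Scaffold a new IWSDK project (the one command from the steps above).
npm create @iwsdk

# Enter the folder you named during the prompts (illustrative name here),
# install dependencies, and start the Vite dev server with hot reload.
cd my-vr-project
npm install
npm run dev
```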

That's all you need to get started. But if you want to know what's really happening under the hood, keep reading. 👇

Under the Hood

IWSDK has a purpose-built AI integration layer that gives coding assistants something they've never had before: spatial understanding of a live 3D scene.

The Framework

The SDK is built on Three.js with a high-performance Entity Component System (ECS) powered by ELICS, plus Vite-based tooling for instant hot-reload. Out of the box it provides Havok physics, one/two-hand and distance grab interactions, teleport and smooth locomotion, full articulated hand tracking, spatial UI panels (UIKitML), 3D audio, AR scene understanding, and browser-based XR emulation via IWER, so no headset is required during development.
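If you haven't used an ECS before, here is the pattern in miniature. This is a generic sketch, not ELICS's real API (see the IWSDK docs for that): entities are plain ids, components are plain data attached to them, and systems are functions run every frame over entities that carry the right components.

```typescript
// Generic ECS sketch: NOT the ELICS API, just the pattern it implements.
type Entity = number;

class World {
  private nextId: Entity = 0;
  private stores = new Map<string, Map<Entity, Record<string, number>>>();

  createEntity(): Entity {
    return this.nextId++;
  }

  add(e: Entity, component: string, data: Record<string, number>): void {
    if (!this.stores.has(component)) this.stores.set(component, new Map());
    this.stores.get(component)!.set(e, data);
  }

  get(e: Entity, component: string): Record<string, number> | undefined {
    return this.stores.get(component)?.get(e);
  }

  query(component: string): [Entity, Record<string, number>][] {
    const store = this.stores.get(component);
    return store ? Array.from(store) : [];
  }
}

// A "system" is just per-frame logic over matching entities.
function moveSystem(world: World, dt: number): void {
  for (const [e, vel] of world.query("velocity")) {
    const pos = world.get(e, "position");
    if (pos) pos.x += vel.x * dt;
  }
}
```

This data-oriented split is also what makes the AI tooling possible: because components are inspectable data rather than logic buried in objects, an agent can pause systems, patch values, and diff state.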

How the AI Closes the Loop

Your AI coding tool connects to an MCP (Model Context Protocol) server running inside the Vite dev server, which bridges to the browser over WebSocket. This gives the AI access to over 40 specialized tools across scene inspection, XR device emulation, ECS debugging, and semantic code search:
  • See the scene. Capture screenshots and query the full Three.js scene hierarchy to understand every object's position, components, and relationships.
  • Control XR input. Move the emulated headset, point controllers, press buttons, switch to hand tracking, and simulate pinch gestures, all programmatically through IWER's RemoteControlInterface.
  • Debug the ECS. Pause the simulation, step through frames, toggle individual systems on/off, patch component values live, and diff state snapshots to pinpoint exactly what changed.
  • Search real APIs. Query a local RAG server that runs semantic search over more than 3,000 indexed code chunks from the IWSDK codebase, so the AI always references real function signatures and patterns, not hallucinated ones.
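Under MCP, every one of these tools is invoked with the same JSON-RPC 2.0 message shape, a `tools/call` request. The tool name "xr_screenshot" and its arguments below are hypothetical placeholders, not IWSDK's actual tool names; only the envelope is the real protocol.

```typescript
// Shape of an MCP tool invocation (JSON-RPC 2.0 "tools/call" request).
interface ToolCallRequest {
  jsonrpc: "2.0";
  id: number;
  method: "tools/call";
  params: { name: string; arguments: Record<string, unknown> };
}

function makeToolCall(
  id: number,
  name: string,
  args: Record<string, unknown> = {},
): ToolCallRequest {
  return { jsonrpc: "2.0", id, method: "tools/call", params: { name, arguments: args } };
}
```

Because the envelope is uniform, adding a new capability to the dev server is just registering another named tool; the coding assistant discovers it and calls it the same way.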
Every project also ships with pre-built agent skills for common workflows (architecture planning, asset browsing, UI panel building, pointer interaction, model preview, and XR mode testing), a project knowledge file with best practices and performance constraints, and pre-configured permissions so the AI works autonomously.
For the complete tool reference and architecture documentation, see AI-Assisted Development: A Technical Deep Dive.
The Immersive Web SDK is open source under the MIT license. Get started at iwsdk.dev.
Stay up to date with the latest VR developer news: subscribe to our monthly newsletter, join the conversation in the Developer Forum, and follow us on Facebook and X.