Meet Immersive Web SDK: A New Era for Spatial Web Development
VR and the modern web share the same foundation: bringing people, ideas, and experiences together. Today, the browser is evolving from a 2D window of information to an immersive platform for spatial experiences.
But to unlock the full potential of immersive web experiences, developers need tools that make building them accessible and streamlined, just as engines like Unity and Unreal have set the standard for VR app development. That’s why today, we’re diving into the all-new Immersive Web SDK (IWSDK), a WebXR development toolkit that’s available now in early access.
Unveiled at Meta Connect, the IWSDK is our latest step toward making cross-device development more accessible with open source tools to power industry-wide innovation.
The IWSDK starter boilerplate app features hand tracking and spatial interactions, running directly in the browser.
Since Connect 2024, we've had an average of 1M+ users engage with WebXR experiences every month on Meta Quest, and the momentum is only growing. As a cross-platform medium that’s flexible and cost-effective, immersive web experiences lower barriers to entry and open the door to a broad audience.
Historically, immersive web experiences have been built with 3D frameworks focused on desktop and mobile use cases, so WebXR developers had to build most VR-specific features and interactions themselves. But now, whether you want to build games like the narrative-driven puzzle adventure In Tirol, create e-commerce experiences like our recent collaboration with PUMA, or package websites as Progressive Web Apps for purchase in the Meta Horizon Store, the IWSDK provides you with a complete foundation. You can build fast, deploy quickly, and unlock massive reach without reinventing the wheel.
Below, we break down the key pillars of the new IWSDK. Keep reading to learn more and be sure to check out the documentation or our dedicated SDK guide to get started.
Modern Architecture to Scale Your Projects
At its core, the IWSDK was designed to support building with speed, scalability, and growth. It’s built on top of Three.js and paired with a high-performance ECS implementation to give you a strong foundation for building immersive web apps that can grow in complexity and size without sacrificing performance. This approach makes it possible to publish reusable systems and components that accelerate development.
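To make the ECS idea concrete, here is a minimal sketch of the pattern: entities are just ids, components are plain data attached to entities, and systems are functions that run over every entity carrying the components they need. The class and method names are illustrative, not the IWSDK's actual API.

```javascript
// Minimal ECS sketch (illustrative only, not the IWSDK API).
class World {
  constructor() {
    this.nextId = 0;
    this.components = new Map(); // component name -> Map(entityId -> data)
    this.systems = [];
  }
  createEntity() {
    return this.nextId++;
  }
  addComponent(entity, name, data) {
    if (!this.components.has(name)) this.components.set(name, new Map());
    this.components.get(name).set(entity, data);
  }
  query(...names) {
    // Return entity ids that have every requested component.
    const [first, ...rest] = names.map((n) => this.components.get(n) ?? new Map());
    return [...first.keys()].filter((id) => rest.every((m) => m.has(id)));
  }
  addSystem(fn) {
    this.systems.push(fn);
  }
  update(dt) {
    for (const system of this.systems) system(this, dt);
  }
}

// A movement system: advance every entity that has Position and Velocity.
const world = new World();
world.addSystem((w, dt) => {
  for (const id of w.query('Position', 'Velocity')) {
    const p = w.components.get('Position').get(id);
    const v = w.components.get('Velocity').get(id);
    p.x += v.x * dt;
  }
});

const cube = world.createEntity();
world.addComponent(cube, 'Position', { x: 0 });
world.addComponent(cube, 'Velocity', { x: 2 });
world.update(0.5); // one simulated frame of 0.5s
console.log(world.components.get('Position').get(cube).x); // 1
```

Because systems only depend on component shapes, they can be published and reused across projects, which is what makes the architecture scale.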
We’ve invested in improving the baseline Three.js WebXR experience to make creating immersive web experiences easier and address common pain points. Some highlights include:
Customizable input management: We developed a custom XR input management system that separates input visualization, input spaces, and gamepad utilities. This makes controller input more reliable and reduces draw calls by up to 70 percent.
Persistent input spaces: Objects like wrist UI or grabbed items stay attached even when devices disconnect or change, so you can focus on features instead of edge cases.
Smarter gamepad utilities: Our stateful gamepad wrapper automatically handles button mapping across different devices to make code more portable and predictable.
Pointer-events for interaction: Both ray-based and proximity-based interactions can now emit W3C-compliant pointer events. You can use familiar patterns like addEventListener('click', ...) to create more natural, web-like interactions.
We’ve packaged these improvements as @iwsdk/xr-input, so you can plug them into existing Three.js WebXR projects without starting from scratch when you first adopt the IWSDK.
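The pointer-events model above means an interactable 3D object can be handled exactly like a DOM element. A minimal sketch of the pattern, using the standard EventTarget interface (the class name is illustrative, not part of the IWSDK API):

```javascript
// Sketch of the pointer-events pattern: an interactable behaves like a DOM
// EventTarget, so ray- or proximity-driven input uses the same listener
// pattern as the 2D web. Illustrative only, not the IWSDK API.
class Interactable extends EventTarget {}

const button = new Interactable();
let clicks = 0;
button.addEventListener('click', () => {
  clicks += 1;
});

// An input system would dispatch this when a controller ray "clicks" the object.
button.dispatchEvent(new Event('click'));
console.log(clicks); // 1
```

The payoff is that web developers can reuse the event-handling habits they already have instead of learning a bespoke XR interaction API.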
A Developer-First Workflow to Build Smarter
Build with Speed
Because immersive web development has historically been fragmented, acceleration and cohesion were key priorities when we built the IWSDK. Now, instead of spending hours configuring tools and dependencies, you can launch a new, customized boilerplate app in under a minute: run the npm create @iwsdk@latest command, answer a few multiple-choice or yes/no questions, and your project is ready to go.
We’ve even simplified the entire workflow so there are only two commands you need to know from the start:
npm run dev runs as the local development server for the entire duration of development, watching files and triggering live reloads on changes. The workflow injects emulation tools for desktop testing, generates components from Meta Spatial Editor if needed, strips unused assets, and compiles UI definition files for the SDK.
npm run build gets your project ready for production by generating components, compiling UI files into JSON, and enabling you to optimize your GLTF assets with configurable compression settings.
Build with Flexibility
If you’re looking for easier ways to create 3D scenes, our Meta Spatial Editor (Windows | Mac) was built to work with the IWSDK. The Spatial Editor enables teams to visually arrange objects, place 3D models, and add simple business logic using components without writing code, while developers use the IWSDK to create custom components in code for complex behaviors, game logic, dynamic content, and performance-critical work.
With the Spatial Editor integration, developers can now work in parallel with technical artists when building WebXR apps. Thanks to hot reload, Editor changes are reflected on the local server as soon as you save the Editor project, speeding up iteration.
The Meta Spatial Editor enables you to easily create scenes and manage components for your entities.
Build with Performant UI
Until recently, if you wanted to build spatial interfaces, you had to choose between embedding 2D HTML and sacrificing performance, or building 3D UI panels from scratch with rigid systems. Now, with our new spatial authoring language UIKitML, you can use familiar HTML-like syntax and CSS-like styling to author UI, preview it in real time with our new VS Code extension, and compile the files into performant 3D geometry.
Web developers can use familiar syntax to accelerate building UI panels and preview them in real time.
With IWSDK’s DOM-like JavaScript APIs, you can also integrate your UI panels seamlessly using familiar methods like querySelector and addEventListener, then position them and enable support for ray-based interactions using our pointer events system.
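A small sketch of what that DOM-like pattern looks like: a panel exposes a node tree you can query and wire listeners onto, just as you would on a web page. The node structure, selector support, and names here are illustrative, not the IWSDK's actual panel API.

```javascript
// Sketch of DOM-like panel APIs: querySelector over a node tree plus
// standard event listeners. Illustrative only, not the IWSDK API.
class PanelNode extends EventTarget {
  constructor(id, children = []) {
    super();
    this.id = id;
    this.children = children;
  }
  querySelector(selector) {
    // This sketch supports only '#id' selectors.
    const id = selector.slice(1);
    if (this.id === id) return this;
    for (const child of this.children) {
      const found = child.querySelector(selector);
      if (found) return found;
    }
    return null;
  }
}

// A compiled panel might expose a tree like this.
const panel = new PanelNode('root', [
  new PanelNode('title'),
  new PanelNode('start-button'),
]);

const startButton = panel.querySelector('#start-button');
startButton.addEventListener('click', () => console.log('start!'));
startButton.dispatchEvent(new Event('click')); // a ray "click" would trigger this
```

The point is continuity: the same query-and-listen workflow from 2D web development carries over to spatial UI.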
The result? A user-friendly workflow where creating immersive web experiences feels more connected, familiar, lighter, and collaborative.
Production-Ready Systems to Get Ahead
To help you hit the ground running without sacrificing quality, we’ve included prebuilt systems that support key features and address some of immersive web developers’ biggest challenges.
Locomotion
Ironically, one of the biggest hurdles preventing developers from moving quickly when starting XR development is… locomotion. That’s why we designed a BVH-based locomotion system to address the complexity of 3D math, performance management, and user comfort. The package includes:
Environment support: Static and kinematic environments for different scene types.
Movement modes: Slide locomotion with vignetting, teleport with configurable range, and turn with smooth/snap options.
Performance optimization: Worker-based simulation with main thread synchronization and smoothing.
User comfort: Configurable vignetting intensity and motion sickness reduction features.
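The comfort feature above can be sketched with a few lines of math: vignetting narrows the field of view as slide-locomotion speed rises, scaled by a user-configurable intensity. The formula and parameter names are an assumption for illustration, not the IWSDK's implementation.

```javascript
// Sketch of configurable comfort vignetting (illustrative, not IWSDK code):
// returns 0 when standing still, approaching `intensity` at full speed.
function vignetteStrength(speed, maxSpeed, intensity = 1.0) {
  const t = Math.min(Math.max(speed / maxSpeed, 0), 1); // clamp to [0, 1]
  return t * intensity;
}

console.log(vignetteStrength(0, 3));        // 0 — no vignette at rest
console.log(vignetteStrength(3, 3));        // 1 — full vignette at max speed
console.log(vignetteStrength(1.5, 3, 0.5)); // 0.25 — halved for comfort-tolerant users
```

Keeping the vignette a pure function of speed and a single intensity knob is what makes motion-sickness reduction configurable per user.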
Grab Interactions
IWSDK simplifies grab interactions so you don’t have to handle complex collision logic, inconsistent input across devices, and tricky state management. You can use and combine six different grabbable components to support complex interaction behaviors, including one- or two-handed grabs and a distance grab that feels intuitive.
The IWSDK supports six different grabbable components for complex interactions.
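As a taste of the work a distance grab hides, here is a sketch of candidate selection: among grabbable entities, pick the one closest to the controller ray within a maximum reach. The function names, scoring heuristic, and threshold are assumptions for illustration, not the IWSDK's logic.

```javascript
// Sketch of distance-grab candidate selection (illustrative, not IWSDK code).
// `dir` is assumed to be a normalized 3-vector.
function distanceToRay(origin, dir, point) {
  const toPoint = point.map((p, i) => p - origin[i]);
  const t = toPoint.reduce((sum, v, i) => sum + v * dir[i], 0); // projection onto ray
  const closest = origin.map((o, i) => o + dir[i] * Math.max(t, 0));
  return Math.hypot(...point.map((p, i) => p - closest[i]));
}

function pickDistanceGrabTarget(rayOrigin, rayDir, grabbables, maxRayDistance = 0.2) {
  let best = null;
  let bestDist = maxRayDistance;
  for (const g of grabbables) {
    const d = distanceToRay(rayOrigin, rayDir, g.position);
    if (d <= bestDist) {
      best = g;
      bestDist = d;
    }
  }
  return best;
}

const target = pickDistanceGrabTarget(
  [0, 0, 0],
  [0, 0, -1], // controller ray pointing forward
  [
    { name: 'mug', position: [0.05, 0, -2] }, // 5 cm off the ray — grabbable
    { name: 'lamp', position: [0.5, 0, -2] }, // 50 cm off — out of reach
  ]
);
console.log(target.name); // "mug"
```

In practice a real system layers state management and per-device input mapping on top of this math, which is exactly what the grabbable components package up.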
Spatial Audio
Wrestling with 3D positioning calculations, distance attenuation models, and audio lifecycle management is tedious. Our spatial audio system solves this: attach the audio component, configure a few preferences, and the system handles the heavy lifting automatically.
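As an example of the attenuation math such a system absorbs, here is the inverse-distance gain model defined by the Web Audio API's PannerNode; the function wrapper itself is just a sketch, not IWSDK code.

```javascript
// Inverse-distance attenuation, per the Web Audio API's PannerNode
// "inverse" distance model. The wrapper is illustrative, not IWSDK code.
function inverseDistanceGain(distance, refDistance = 1, rolloffFactor = 1) {
  const d = Math.max(distance, refDistance); // gain is flat inside refDistance
  return refDistance / (refDistance + rolloffFactor * (d - refDistance));
}

console.log(inverseDistanceGain(1)); // 1 — full volume at the reference distance
console.log(inverseDistanceGain(3)); // ~0.33 — quieter three meters away
```

Multiply this per-frame by listener-relative panning and you start to see why having the SDK own the lifecycle is a relief.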
Scene Understanding
The IWSDK enables you to tap into our scene understanding system and alleviate friction when building for mixed reality. It automatically detects planes and meshes like floors, walls, and tables, and represents them as entities in your scene. Spatial anchoring handles the complex math and ensures your virtual content persists across sessions in real-world locations.
The IWSDK also supports mixed reality web experiences that leverage Passthrough.
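To illustrate the kind of labeling scene understanding performs, here is a sketch that classifies a detected plane by its surface normal and height. The thresholds and names are assumptions for illustration, not the IWSDK's detection logic.

```javascript
// Sketch of plane classification by surface normal (illustrative only).
// `normal` is a world-space unit vector; y is up.
function classifyPlane(normal, heightMeters) {
  const up = Math.abs(normal[1]); // how closely the normal points up/down
  if (up > 0.9) {
    // Horizontal surface: distinguish floor from table by height.
    return heightMeters < 0.3 ? 'floor' : 'table';
  }
  if (up < 0.1) return 'wall'; // vertical surface
  return 'unknown';
}

console.log(classifyPlane([0, 1, 0], 0.0));  // "floor"
console.log(classifyPlane([0, 1, 0], 0.75)); // "table"
console.log(classifyPlane([1, 0, 0], 1.2));  // "wall"
```

A real system must also track these surfaces across frames and anchor them across sessions, which is the part the SDK takes off your plate.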
Get Started Today
With the IWSDK, every type of web developer now has the tools to start building the next frontier of immersive web experiences.
To get started visit the documentation or our dedicated SDK guide and follow the steps to access the core SDK package, standalone Three.js packages and development workflow plugins. You can also watch our 9-minute introduction video to learn more.
Like this content? Follow us on X and Facebook to be among the first to hear about the latest updates. For more details and insights, don’t forget to subscribe to our monthly newsletter and check out the Meta Horizon OS release notes. If you have feedback about any of our tools and features we covered today, be sure to let us know via the Feedback Tool in Meta Quest Developer Hub.