Unity, Meta Horizon OS, and the Future of VR at Unite 2025
Unite 2025 was packed as Unity developers filled the rooms, lobbies, and hallways talking about one thing: what they’re going to build next. For us, it was the perfect place to meet the people who already live in Unity every day and show how easily those skills translate into rich, performant, and immersive VR experiences on Meta Horizon OS.
Across four packed sessions and a busy booth on the show floor, we shared where the Meta Horizon ecosystem is today, where it’s heading, how studios are finding success, and how Unity developers can ship faster on Meta Quest with the latest tools, samples, and AI workflows.
If you couldn’t make it this year, keep reading below for a recap of what we announced and how to start building for VR with our latest tools.
An Event Built for Developers, Shaped by Your Experience
At this year’s event, our focus was helping Unity developers take their first steps, or next steps, on their VR journey in ways that feel intuitive, efficient, and achievable.
Developer Advocate Dilmer Valecillos chatting with an attendee at the Meta booth.
Everything we brought to Unite was designed around common questions we hear from the broader developer community, such as ‘How do my existing skills translate to VR?’, ‘How can I build with new capabilities faster?’, and ‘What kinds of experiences are doing well right now on Meta Horizon OS?’
Our answer was to deliver clear guidance, share our latest tools, and provide a closer look at where our platform is heading:
We showed how Meta Horizon OS and Meta Quest extend the capabilities Unity developers already love with immersion, presence, and mixed reality.
We shared audience insights and trends from the Meta Horizon Store so developers can build for emerging opportunities.
We introduced new tools and AI-powered workflows that remove friction and give teams more confidence while developing for VR.
We reinforced our commitment to your success with open source showcases, along with community programs through the Meta Horizon Start Program.
The Meta Horizon Start program offers exclusive support, resources, and membership opportunities.
The Meta XR All-in-One SDK and Unity bring all of this together, giving developers a robust and streamlined path toward building the next generation of experiences for a growing audience on the world’s leading VR platform.
Next, take a look at our speaker session highlights.
Unlocking the Full Potential of Meta Quest with Unity
“Unlock the full potential of Meta Quest” with Developer Advocates Jake Steinerman and Dilmer Valecillos offered a closer look at the possibilities and opportunities for developers on our platform. Jake opened with a line from Marcel Proust setting the tone for a week of fresh insights: “The real voyage of discovery consists not in seeking new landscapes, but in having new eyes.”
Jake Steinerman presenting at Unite 2025.
That is exactly what VR offers developers through embodiment, immersion, and rich social connections. In VR, users enter their experience with their whole body, their hands, their voice, and their interactable surroundings.
These capabilities unlock unique categories of experiences, and we called out a few genres that are thriving across our ecosystem today:
Virtual playgrounds that focus on social chaos and emergent play.
Entertainment experiences that feel cinematic and larger than life in-headset.
Immersive games where you’re the hero, not just controlling one.
Productivity and collaboration where multiple monitors, giant canvases, and new input models actually make work feel better.
Mixed reality experiences that turn your physical space into a reimagined interactive environment using passthrough and spatial anchors.
Mixed reality experiences provide users with new ways to interact and experience their physical space.
From there, Jake and Dilmer moved from inspiration to implementation and walked through a concrete workflow for shipping on Meta Quest with Unity, centered on a mixed reality sample called Blossom Buddy.
The Blossom Buddy demo shown at Unite 2025.
Meta Quest Developer Hub as your command center
They started with Meta Quest Developer Hub (MQDH: Windows | Mac), our essential desktop companion app that keeps your workflow organized while letting you manage devices, deploy builds, monitor performance, and download tools, SDKs, and samples.
MQDH keeps the repeatable and tedious parts of development fast and centralized so you can spend more time actually building.
From blank project to mixed reality prototype
For developers targeting Meta Horizon OS with Unity, the team highlighted their recommended project setup.
Next, they touched on the Project Setup Tool, which lives inside the Unity Editor and fixes, in just a few clicks, the kind of configuration issues that would usually cost you an afternoon of community forum searches.
With a project set up and ready, Jake and Dilmer layered in Building Blocks to show how quickly core VR features can be added to your experiences:
Passthrough to blend the player’s physical environment into the experience.
Hand tracking so players can interact without controllers.
A first look at AI Building Blocks, which let you bring your own LLM into your app with a few fields and configuration changes.
Building Blocks enable you to add core VR, multiplayer, and mixed reality features in a few clicks or via drag and drop.
In short, Building Blocks let you add complex capabilities in a few clicks so you can focus on design and what makes your experience unique.
XR Simulator and Immersive Debugger
Jake and Dilmer also showed how to keep iteration fast without living (completely) inside your headset with Meta XR Simulator, a lightweight OpenXR runtime that lets you simulate headset and controller input using keyboard, mouse, gamepads, or Quest controllers. This tool enables you to test user flows in prebuilt or custom rooms, run multiplayer and automated tests, and record and replay sessions for detailed analysis.
At Unite, we specifically spotlighted XR Simulator 2.0, which expands these capabilities and helps you scale your testing environment through a rebuilt interface, updated framework, and faster startup times.
Even with tools like XR Simulator, going in-headset ensures your experience is production-ready. Immersive Debugger for Unity lets you get hands-on, in-headset with full visibility into your app. From a panel inside VR you can:
View logs and custom debug values.
Inspect and tweak gameplay variables.
Customize what shows up through a Unity Editor framework.
Our team also pointed developers to North Star, our open source Unity showcase that demonstrates high visual fidelity and best practices for interactions, full body tracking, and spatial audio on Meta Quest. Keep this reference project handy so you can explore, dissect, and learn from the latest techniques and features.
North Star offers best practices and techniques for building performant, high-fidelity VR experiences.
All in all, this session showed why you don’t need to trade iteration speed for immersion. With MQDH, XR Simulator, Immersive Debugger, and Building Blocks, you can keep your Unity workflow tight while still building for full immersion.
Exploring the VR Landscape: Who is Playing and What is Winning
In “Exploring the VR landscape,” Jamie Keane, Director of Product Management at Meta, zoomed out from tools and focused on the ecosystem itself. Her session answered some of the most burning questions from our developer community:
Who is buying Meta Quest headsets right now?
What kinds of VR experiences are they coming back to?
Are developers making money?
Jamie broke down the VR audience into distinct segments and showed how usage patterns have evolved as our platform has expanded from early VR enthusiasts to a broader mix of users, including younger and more social audiences. We highlighted how more than 300 apps on the Meta Horizon Store have generated over $1 million in revenue, with 10 apps breaking the $50 million mark.
This builds on the audience insights we shared at GDC and on our developer blog earlier this year, where we talked through how spending and engagement patterns are shifting as new cohorts come online. Some highlights included:
Meta Horizon Store growth continues, but the mix of titles driving that growth is changing. Social, multiplayer, and community driven games are especially strong.
Hand tracking and mixed reality are no longer novel or experimental features. They are consistently sought out by users and serve as meaningful differentiators when woven into a game or app’s core loop.
New app categories are emerging across fitness, creativity, and personal improvement, giving Unity developers more ways to stand out beyond more traditional gaming experiences.
Attendees explored the latest hand tracking improvements at the Meta booth.
Jamie also spotlighted recent developer success stories, including Animal Company, a standout game from Wooster Games on the Meta Horizon Store. Today, the game boasts more than one million monthly active users, over one billion organic TikTok views, and a nine-times increase in paying players in just six months. To learn more about Animal Company’s success, check out our recent spotlight.
Wooster Games’ Animal Company offers a case study in building a wide audience with a minimal marketing budget.
Overall, if you design specifically for VR’s strengths and think about retention from day one, there is still plenty of room to innovate and build a growing business on Meta Horizon OS. For more information on Quest gamer segments and optimal design, check out our recent blog posts on user demographics and the 20-40 minute ‘Goldilocks’ session length.
Dreaming in VR: Day•Dreamer and Generative AI for Self Exploration
In “Day•Dreamer: How TRIPP x Meta are unlocking GenAI for subconscious exploration in VR,” Nanea Reeves from TRIPP joined Meta to showcase a very different kind of Unity project.
Day•Dreamer is a VR-first app paired with an agentic AI companion, helping people capture dreams, analyze themes and patterns over time, and visualize those ideas as immersive spaces that they can explore.
The session walked through the journey from an early idea to a full proof of concept, including:
How TRIPP validated the technical and experiential core of the idea.
How they brought generative AI, spatial audio, VR, and mobile together to turn abstract thoughts into engaging experiences.
How they balanced novelty with value to optimize for retention.
For Unity developers, Day•Dreamer served as a blueprint for how to build VR apps that use AI for more than NPCs or content generation. It also showed how you can still find niches when designing for creativity and self discovery in ways that only VR can bring to life.
Accelerating VR development with Meta Horizon Tools and AI
Our final session, “Accelerating VR Development with Meta Horizon Tools and AI,” featured software engineers Joe Paley and Neel Bedekar as they focused squarely on speed and how AI is removing friction to make development more streamlined and, frankly, more enjoyable.
Software Engineer Neel Bedekar shares the latest AI tooling breakthroughs for Meta Horizon OS.
AI in your apps and through your pipeline
First, Joe and Neel walked through patterns we’re seeing from developers who already use AI in their daily workflow. Inside their experiences, teams are using AI to power dynamic NPCs and companions, enable mixed reality scene understanding, and perform object detection that supports gameplay or real world utility scenarios.
In the build pipeline, AI is already accelerating debugging and scripting, assisting with environment and character creation, streamlining asset rigging and animation, and even helping produce marketing materials, trailers, and store assets.
They also spotlighted early adopters of our Passthrough Camera API (PCA), like Piano Vision and Pencil - Learn to Draw, who are using it to enhance gameplay through AI-enabled object detection and tracking. We also showed how developers have used computer vision and AI to analyze physical environments in real time and guide users through real-world tasks.
An example showing Passthrough Camera API being used to support object detection, in this case, plants.
To help you experiment and build smarter, more responsive experiences with this technology, we’re introducing four new Building Blocks:
Passthrough Camera Access gives you the ability to tap into a single front-facing camera to capture images or video, forming the foundation for mixed reality interactions.
Passthrough Camera Feed Visualizer lets you see the left and right camera textures side by side, making it easier to debug and tune binocular views.
Object Detection builds on passthrough by identifying common objects in the player’s environment in real time, enabling experiences that react to the world around the user.
LLM integration allows you to plug in the large language model of your choice so your app can interpret context, respond naturally to players, and unlock intelligent behaviors with minimal setup.
AI-ready documentation and the Horizon OS MCP
A recurring pain point we hear from developers is that AI tools are at odds with traditional documentation. Pages are spread across multiple formats, have a lot of visual noise, are difficult for AI to parse, and often push context windows to their limits.
At Unite, we announced updates that make Meta documentation friendlier for AI agents:
Markdown views on demand: add .md to any build path documentation URL to get a clean version suitable for LLMs.
llms.txt for each build path: a compact AI sitemap listing key docs with short summaries so agents can quickly identify what to fetch.
llms-full.txt: a single entry point that aggregates documentation for easier integration into RAG workflows.
Adding .md to a documentation URL provides a markdown version that’s more suitable for LLMs.
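Because the convention is just a URL suffix, it is easy to script. Here is a minimal sketch of a helper that rewrites a docs URL to its Markdown view before handing it to an LLM pipeline; the example URL is hypothetical and used for illustration only:

```python
def markdown_url(doc_url: str) -> str:
    """Rewrite a documentation URL to request its LLM-friendly Markdown view.

    Per the convention described above, appending .md to a build path
    documentation URL returns a clean Markdown version of the page.
    """
    return doc_url.rstrip("/") + ".md"

# Hypothetical documentation URL, for illustration only.
print(markdown_url("https://developers.meta.com/horizon/documentation/unity/unity-gs-overview"))
```

From there, the returned URL can be fetched and dropped straight into a RAG index alongside the llms.txt sitemap.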
On top of that, we introduced a new Horizon OS MCP (Model Context Protocol) server that brings this context into the LLMs and coding assistants you already use. With MCP, agents can:
Answer Horizon OS-related questions with up to date documentation.
Generate Unity and Android boilerplate code.
Jumpstart prototyping by using text prompts.
Get help with hand tracking integration and common VR patterns.
Analyze performance traces and suggest optimizations.
An example of AI-assisted code generation to add support for hand grab interactions
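As a rough sketch of what wiring this up might look like, MCP-compatible clients typically register servers through a JSON config like the one below. The server name and launch command here are placeholders, not the actual Horizon OS MCP package, so check the official documentation for the real install instructions:

```json
{
  "mcpServers": {
    "horizon-os": {
      "command": "npx",
      "args": ["-y", "horizon-os-mcp-server"]
    }
  }
}
```

Once registered, your coding assistant can call the server’s tools to pull current Horizon OS documentation and code context into its responses.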
By linking AI to our suite of development tools, you can streamline workflows, speed up optimization, and accelerate overall development. Our goal is for you to be able to ask for “a starter Unity scene that uses passthrough and hand tracking on Meta Quest” and get something usable, then iterate from there.
AI-assisted performance optimization
Performance is a common pain point for VR developers, regardless of which platform they’re building on. Additionally, from what we’ve heard, many teams live inside the Unity profiler and avoid more advanced tools like Perfetto because of the learning curve.
In this session, Joe and Neel demonstrated how an LLM-enabled workflow can sit on top of tools like Perfetto via the MCP server. First, the team captured a performance trace from a complex project like North Star, then fed the trace into the MCP workflow. Next, the AI broke down the main thread and GPU activity, highlighted anomalies, and suggested concrete fixes.
This is a quick overview, but for attendees, this demo illustrated how instead of manually sifting through dense traces, developers can start from a human readable summary and jump directly into the most impactful issues.
AI can now help developers analyze Perfetto traces and identify various issues.
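To make the shape of that workflow concrete, here is a minimal sketch of the kind of summary step an agent might perform on trace data. The slice names, durations, and the 13.8 ms budget (one frame at 72 Hz) are illustrative assumptions, not output from Meta’s actual MCP or Perfetto tooling:

```python
# Minimal sketch: flag trace slices that exceed a frame-time budget,
# similar to how an agent might surface hotspots from a captured trace.

FRAME_BUDGET_MS = 13.8  # one frame at 72 Hz (illustrative assumption)

def flag_hotspots(slices, budget_ms=FRAME_BUDGET_MS):
    """Return (name, duration_ms) pairs over budget, worst first."""
    hot = [s for s in slices if s[1] > budget_ms]
    return sorted(hot, key=lambda s: s[1], reverse=True)

# Invented slice data, for illustration only.
slices = [
    ("MainThread/PlayerLoop", 18.2),
    ("GPU/OpaquePass", 9.1),
    ("MainThread/Physics", 21.5),
    ("GPU/Passthrough", 4.0),
]
for name, dur in flag_hotspots(slices):
    print(f"{name}: {dur} ms exceeds the {FRAME_BUDGET_MS} ms budget")
```

A real agent would pull these slices from the trace itself and pair each hotspot with a suggested fix, but the core value is the same: a ranked, human-readable starting point instead of a raw trace.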
Meta Asset Library and AI generated assets
Shipping a prototype requires functional code, but it also requires assets that are performant and look good enough for playtesting. That’s why we’re paving a path towards faster prototyping with the Meta Asset Library, available in the v83 Meta SDK for Unity, which gives you:
A royalty free, searchable library of hundreds of thousands of assets.
Multiple levels of detail suitable for experiences on Quest.
A foundation you can complement with AI-generated assets for non-hero elements in your experience.
By combining the Meta Asset Library with AI Building Blocks and the Horizon OS MCP, you can enable a development loop where you can move from idea to playable prototype in days instead of months.
Getting Hands-On at the Meta Booth
Beyond the stage, the Meta booth at Unite was constantly busy as developers lined up to:
Try out the latest hand tracking improvements in a sample app and see how natural interactions feel when you design for hands first.
Experiment with Building Blocks in guided demos that showed how to drag and drop core features like passthrough, hand tracking, and AI into a Unity project.
Explore Blossom Buddy, bringing together mixed reality, Passthrough Camera Access, and AI into a single sample app.
Play highlights from leading Meta Horizon Store games.
Developers had the chance to get hands-on with new tools at the Meta booth.
For many attendees, this was their first time seeing how far Quest has come in terms of comfort, tracking quality, mixed reality, and visual fidelity. It also helped connect the dots between what they heard in sessions and how those ideas feel in real VR experiences.
Get Started with VR in Unity
Unite 2025 was a reminder of why Unity and Meta Horizon OS fit together so well. With so many developers already having the skills to build compelling 3D experiences in Unity, VR simply adds new possibilities and opportunities through embodiment, presence, and mixed reality. And with Meta’s tools and AI workflows, you can remove as much friction as possible to focus on the part only you can do: execute your unique vision.
If you’re new to Meta Horizon OS or VR and want to take the next step, the Meta XR All-in-One SDK and Meta Quest Developer Hub are great places to start.
Like this recap? To stay up to date with all of the features and tools we covered above, check out our release notes, subscribe to our monthly newsletter, and follow us on X and Facebook. If you have feedback, we’d love to hear from you. Let us know via the Feedback Tool in MQDH.