Highlights from Day 1 at GDC 2026: Hands, Agents, Performance & More

We're back at GDC!
Day 1 of GDC 2026 set a high bar for the week and offered a clear view into where development is headed next on Meta Horizon OS.
Across sessions on hands-first interaction design, agentic AI workflows inside Unity, performance fundamentals, and analytics that improve retention, a consistent theme emerged: Meta is investing in the tools, systems, and best practices that help developers build faster, ship smoother, and learn from real player behavior with more confidence.
Whether you're attending GDC or tuning in from home, dive in below for some of the most important takeaways from day one.

Hands First: Designing for the Controller-Free Future

Thanks in part to major upgrades late last year, hand tracking (Unity | Unreal) is increasingly a key design choice that can shape onboarding, accessibility, and how present a player feels in your app.
Colin Robinson (Product Manager, Input & Interactions at Meta) and Samuel Metters (Lead Developer at Double Jack) framed the design decision in practical terms: choose hands-first when it meaningfully supports your experience and your audience.
When should you design for hands? Colin Robinson and Samuel Metters shared a practical framework for choosing your input method.

A framework for choosing hands, controllers, or both

Robinson shared a simple way to cut through the hype. Hands-first is a strong fit when the interaction model benefits from approachability and natural movement. Controllers remain ideal when your design depends on precision, complex inputs, or interactions that happen outside the headset camera's tracking range.
Hands-first tends to work especially well for:
  • Casual gamers or non-gamers who may find controllers unfamiliar (leisure lovers or social explorers)
  • Experiences that prioritize social presence, comfort, and approachability
  • Mechanics built around surface interactions, physics manipulation, or gesture-driven input
Controllers continue to shine for:
  • High precision and reliability, including competitive gameplay (skill seekers)
  • Sliding locomotion or more complex shooter-style inputs
  • Interactions that frequently occur outside the camera's tracking range
The Interaction SDK UI Set offers Figma and Unity templates for building high-quality, familiar hands-first UI interactions.
A clear recommendation from the session was to keep input flexible. Many teams are finding success by supporting both options from the ground up and letting players choose their preferred method.

The Maestro case study: designing to the strengths of hand tracking

Metters brought the framework to life with Maestro, Double Jack's VR orchestra conducting game. Hand tracking suits the experience because it reinforces the fantasy of the simulation: conductors don't hold motion controllers, and Maestro leans into that authenticity.
The team also designed in a way that respects real constraints. Conducting naturally keeps hands in front of the body, which helps tracking reliability. For the moments where tracking is challenged, the solution was not to force perfect fidelity. Instead, the team iterated the interaction until it remained readable and consistent.
Maestro's Fermata mechanic uses a static end pose for more reliable detection.
When designing hand tracking gestures, remember that tracking relies on headset cameras, so gestures work best when hands remain visible and in front of the user. While tracking has improved dramatically, detecting finger poses during rapid motion remains challenging. For mechanics requiring finger tracking, gestures where the hand slows or stops yield more consistent results.
Maestro's Fermata mechanic in the videos above demonstrates this: the gesture ends with the player's fist held static in front of their head, giving cameras time to detect the pose accurately.
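The static end-pose idea generalizes: a gesture recognizer can require that the same pose be detected for several consecutive tracking frames before firing, which filters out transient misdetections during fast motion. A minimal sketch of the pattern (the class and pose names here are illustrative, not an Interaction SDK API):

```python
class StablePoseDetector:
    """Fire a gesture only after the same pose is reported for
    `required_frames` consecutive tracking frames."""

    def __init__(self, target_pose: str, required_frames: int = 8):
        self.target_pose = target_pose
        self.required_frames = required_frames
        self.streak = 0

    def update(self, detected_pose: str) -> bool:
        # Reset the streak whenever tracking reports a different pose,
        # so brief misdetections during fast motion never trigger.
        if detected_pose == self.target_pose:
            self.streak += 1
        else:
            self.streak = 0
        return self.streak >= self.required_frames

# Simulated per-frame pose labels from the tracker.
detector = StablePoseDetector("fist", required_frames=3)
frames = ["open", "fist", "fist", "open", "fist", "fist", "fist"]
results = [detector.update(f) for f in frames]  # fires only on the last frame
```

Tuning `required_frames` trades responsiveness against robustness, mirroring the Fermata design choice of ending on a held pose.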

Three hands best practices worth grasping

  • Differentiate some, standardize more. Use Meta's Interaction SDK (Unity | Unreal) to avoid rebuilding foundational mechanics like grabbing, then invest creativity in what defines your experience.
  • Commit to embodiment. Games like Gorilla Tag, I Am Cat, and Scary Baboon work because the body is central to the loop.
  • Let affordances teach the player. When objects communicate how they should be used, players learn through action and you can reduce onboarding friction.

AI Joins the Team: Agentic Workflows for Development

If hands-first gameplay is about removing friction for players, our "Accelerating Quest Development with Agentic Workflows" session was about removing friction for developers.
Joseph Carroll (Software Engineering Manager), Dilmer Valecillos (Developer Advocate), Zac Bowling (Software Engineering), and Mike Geig (Principal Advocate at Unity) demonstrated how AI agents are starting to move beyond suggestions and into real project actions inside the Unity editor.
Dilmer Valecillos walks through creating a working game with iterative prompts.

Unity AI Gateway and MCP, plus Quest-focused extensions

At the center of the agentic workflow is Unity's AI Gateway, integrated with the Model Context Protocol (MCP). Together, they enable AI agents to understand project context and perform direct actions in a Unity project, including creating and modifying GameObjects.
We also highlighted some Unity MCP Extensions that provide Horizon OS-specific tools, including:
  • meta_add_camerarig to scaffold a VR camera rig
  • meta_add_interactionrig to set up hand and controller interaction systems
  • meta_add_grabbable to make objects grabbable with correct physics
  • meta_update_android_manifest to configure Quest permissions without manual XML edits
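Under the hood, MCP tools are invoked as JSON-RPC 2.0 `tools/call` requests, per the Model Context Protocol specification. A hedged sketch of what a request for the `meta_add_grabbable` tool might look like on the wire (the argument name is illustrative; the actual tool schema is defined by the extension):

```python
import json

# MCP tool invocations are JSON-RPC 2.0 "tools/call" requests.
# The tool name comes from the extensions listed above; the
# "gameObjectName" argument is a hypothetical example.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "meta_add_grabbable",
        "arguments": {"gameObjectName": "Die"},
    },
}

payload = json.dumps(request)
```

The assistant sends payloads like this to the MCP server, which performs the corresponding action inside the Unity project.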

From prompt to playable

Perhaps the most compelling part of the session was how smooth the workflow looked. Valecillos offered Claude Code a single prompt: "Add an interaction rig, a grabbable scaled to 10%, activate the XR Simulator, and play the Unity scene."
The agent executed the request by invoking the right tools, updating the scene, and entering play mode. The workflow may look novel today, but we believe it's how developers will increasingly accelerate their work: when routine setup gets automated, teams can spend more time on the gameplay decisions that matter.
The session also showcased iterative prompting to build a full hand tracking mechanic (pinch to spawn, charge by pulling back, release to launch). When a shader rendered incorrectly in one eye, a follow-up prompt diagnosed a single-pass instanced stereo issue and produced a targeted fix.
With just a few iterative prompts, the team transformed a basic VR prototype into a fully playable game. Check back soon for the full session recording to see how it all came together.

Immersive Debugger: debugging in the context where VR issues appear

A standout moment was the Immersive Debugger with AI integration, enabling developers to troubleshoot while wearing the headset. The demo used voice-driven assistance to inspect runtime state and adjust behavior in real time, including fixing a D12 die that was not rolling correctly.
For VR, this matters because many issues are easiest to diagnose when you can see and feel them in-context. Bringing debugging into the headset at certain points can shorten iteration cycles and improve decision quality.

AI-ready documentation: preparing for AI-native workflows

Wrapping things up with a recap of some announcements we made last year, the team also shared how we're making documentation easier for AI assistants to consume:
  • Docs available in Markdown
  • llms.txt sitemaps by section
  • llms-full.txt bundles for entire doc sets
  • One-click MCP server installation via Meta Quest Developer Hub for Claude, VS Code, and other assistants

Performance Fundamentals: Shortening Your Path to Release

David Borel and Neel Bedekar from the Meta Developer Platform team focused on a VR truth that many developers recognize quickly: performance is comfort, and it directly impacts both presence and session length. Their session covered the targets that matter and a workflow you can operationalize.

Key targets and practical thresholds

Together, they emphasized:
  • A minimum of 72 FPS on Meta Quest 2, scaling up to 120Hz on Meta Quest 3
  • A hitch rate below 3% (four or more consecutive missed frames can be especially noticeable)
  • A frame budget of 13.9ms at 72 FPS
They also called out real-world thresholds and situations in which building with headroom early can help you manage stability longer term: roughly 70% CPU, 80% GPU, and around 5GB memory on Quest 3 (about 3.5GB on Quest 2).
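The frame budget follows directly from the refresh rate: the per-frame budget in milliseconds is 1000 divided by the target FPS. A quick sanity check across the refresh rates mentioned above:

```python
def frame_budget_ms(fps: float) -> float:
    """Per-frame time budget in milliseconds for a target frame rate."""
    return 1000.0 / fps

# 72 FPS -> 13.9 ms, 90 FPS -> 11.1 ms, 120 FPS -> 8.3 ms
budgets = {fps: round(frame_budget_ms(fps), 1) for fps in (72, 90, 120)}
```

Every piece of per-frame work (simulation, rendering, input) has to fit inside that budget, which is why the budget shrinks so sharply at higher refresh rates.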

MQDH and the AI-assisted performance loop

Meta Quest Developer Hub (MQDH) (Windows | Mac) continues to consolidate profiling essentials, including OVR Metrics, Perfetto traces, Logcat, and ADB. But the notable addition: MQDH now ships with a Perfetto MCP server, connecting AI assistants directly to performance traces.
Why Perfetto? The team called it out as a go-to profiling tool, and for good reason. It gives you quick answers on whether your app is CPU or GPU bound, a full view of every thread running on the system, and the ability to zero in on individual frames when tracking down bottlenecks. It also brings CPU and GPU events together in one timeline, making it much easier to find where you can optimize.
Perfetto gives you a unified view of CPU and GPU performance in one timeline.
The intended workflow is simple and effective: Capture a trace, share it with an assistant, ask a targeted question, and get suggestions grounded in real telemetry. For example: "Why is frame time spiking during combat?"

Shader Binary Cache: The Retention Win You're Missing

First-run shader stutters are retention killers. Every time a new shader variant compiles on-device, players experience jarring hitches that undermine the entire experience.
Shader Binary Cache (SBC) solves this by moving compilation to the cloud. When your QA team or dogfooders play, their compiled shaders get cached server-side. New players download pre-compiled binaries instead of waiting through local compilation.
The impact can be dramatic: the team shared that Asgard's Wrath 2 startup dropped from 7 minutes to 20 seconds. Opting in takes two steps: add a manifest tag and upload debug symbols. We already enable this automatically for top apps, but any developer can opt in today.

Three graphics tools to revisit

The team signaled a few tools that can help studios improve performance and ultimately drive longer sessions with more comfort:
  • Fixed Foveated Rendering (FFR): especially valuable on Quest 2, often delivering meaningful GPU savings by reducing peripheral resolution.
  • Application SpaceWarp (AppSW): render at half refresh rate while synthesizing intermediate frames, creating large performance headroom when it fits the content.
  • Dynamic Resolution: optimize your minimum acceptable floor, then let the system scale up quality when headroom exists. Higher resolution can correlate with engagement improvements.
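To make the Dynamic Resolution idea concrete, here is a toy controller that steps resolution scale down when GPU utilization exceeds a target and back up when there is headroom. The thresholds and step size are illustrative only; on Meta Horizon OS the system-provided Dynamic Resolution handles this for you, and you mainly choose the floor:

```python
def next_resolution_scale(current_scale: float, gpu_util: float,
                          floor: float = 0.7, ceiling: float = 1.0,
                          target_util: float = 0.80, step: float = 0.05) -> float:
    """Toy dynamic-resolution controller (illustrative, not the OS algorithm).

    Steps the render scale down when GPU utilization is over target,
    up when there is clear headroom, and clamps to [floor, ceiling].
    """
    if gpu_util > target_util:
        current_scale -= step          # over budget: trade resolution for headroom
    elif gpu_util < target_util - 0.1:
        current_scale += step          # clear headroom: spend it on quality
    return max(floor, min(ceiling, current_scale))

# Over budget at full scale: scale down one step.
down = next_resolution_scale(1.0, gpu_util=0.90)
# Already at the floor: stays clamped there.
clamped = next_resolution_scale(0.7, gpu_util=0.95)
# Plenty of headroom: scale back up.
up = next_resolution_scale(0.9, gpu_util=0.60)
```

The key design point from the session survives in the sketch: pick the lowest floor that is still acceptable, and let headroom convert into quality automatically.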

A five-step workflow for your team

  • Iterate with Link to speed up testing cycles
  • Profile with MQDH so optimization is measurement-driven
  • Use SBC to reduce first-run shader stutters
  • Enable Dynamic Resolution to turn headroom into quality
  • Ship with phased releases (1% to 100%) to catch regressions early

The Data-Driven Advantage: How Analytics Transformed Contractors VR

In one of the most actionable sessions, Chong Ahn (Head of Games Growth and Monetization, Meta) shared a clear message: standard KPIs help you notice problems, but game-specific analytics help you fix them.

Why generic metrics are not enough

Daily Active Users (DAU) and retention can tell you when behavior changes, but they rarely tell you exactly why. Ahn encouraged teams to define:
  • How you segment players
  • What "healthy" behavior looks like for each segment
  • Whether actual behavior matches intended design

A retention investigation in Contractors

Ahn walked through how Caveman Game Studio used telemetry to improve the new player experience in Contractors.
Extraction is core to Contractors’ gameplay loop. Data-driven changes boosted the game’s first-match extraction rate from 15% to over 50%—a major retention win.
Players who extracted more often on Day 0 showed meaningfully higher Day 1 retention, which was intuitive. The real opportunity emerged earlier in the funnel. Only about half of new players played a second match, and by match five the funnel dropped to about 20%. First-match extraction rate was only 15%, even though it was intended to be beginner-friendly.
The team asked a diagnostic question that led directly to fixes: who is killing new players?
Two sources emerged:
  • 40% of deaths came from experienced players.
  • 17% came from bots intended to be forgiving.
Two targeted changes followed:
  • Reduced bot accuracy for the first five matches, reducing bot-caused deaths by 80%.
  • Locked new players into the starter map early, since only 35% were selecting it.
The results:
  • First-match extraction increased from 15% to over 50%.
  • Fifth-match extraction climbed to around 75%.
The lesson here is that custom telemetry turned a broad retention concern into a precise, testable set of product changes.
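Funnel metrics like first-match and fifth-match extraction rates fall out of simple aggregation over match events. A minimal sketch of computing per-match extraction rates from a flat event log (the event fields are made up for illustration, not Contractors' actual schema):

```python
from collections import defaultdict

def extraction_rate_by_match(events):
    """events: dicts with player_id, match_number, extracted (bool).
    Returns {match_number: fraction of players who extracted}."""
    attempts = defaultdict(int)
    successes = defaultdict(int)
    for e in events:
        attempts[e["match_number"]] += 1
        if e["extracted"]:
            successes[e["match_number"]] += 1
    return {m: successes[m] / attempts[m] for m in attempts}

# Four players' first matches; one extraction -> 25% first-match rate.
events = [
    {"player_id": "a", "match_number": 1, "extracted": False},
    {"player_id": "b", "match_number": 1, "extracted": True},
    {"player_id": "c", "match_number": 1, "extracted": False},
    {"player_id": "d", "match_number": 1, "extracted": False},
]
rates = extraction_rate_by_match(events)
```

Segmenting the same aggregation by cause of death is what turned "retention is low" into "bots are killing beginners."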

A practical analytics pipeline

To help you get the most from analytics, Ahn broke the pipeline into three stages:
  • Generation: instrument meaningful events (match start, match end, FTUE engagement, progress, transactions, rewards, IAP) and include metadata that supports diagnosis.
  • Warehousing: route telemetry into a warehouse (BigQuery, Snowflake, Redshift), partition by date and platform, define schemas, and plan for versioning.
  • Transformation and insights: model raw events into decision-ready tables (daily summaries, cohorts) and connect BI tools for exploration.
Of course, each studio will have different available resources to leverage for a pipeline. Ahn emphasized that teams can start small and mature their analytics over time as their player base grows.
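The generation stage boils down to emitting structured events with enough metadata to support diagnosis later. A hedged sketch of what a match-end event might look like (the schema is illustrative, not a Meta API):

```python
import json
import time

def match_end_event(player_id, match_number, map_name, extracted,
                    killed_by=None, schema_version=1):
    """Build a hypothetical match_end telemetry event.

    Metadata like `killed_by` is what enables diagnostic questions
    such as "who is killing new players?" later in the pipeline.
    """
    return {
        "event": "match_end",
        "schema_version": schema_version,  # plan for versioning up front
        "ts": int(time.time()),
        "player_id": player_id,
        "match_number": match_number,
        "map": map_name,
        "extracted": extracted,
        "killed_by": killed_by,  # e.g. "bot", "veteran_player", or None
    }

evt = match_end_event("p123", 1, "starter_map", False, killed_by="bot")
payload = json.dumps(evt)  # ship this to your warehouse ingestion path
```

Including the diagnostic fields at generation time is what makes the warehousing and transformation stages worth building.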

Building with less friction, from prototype to retention

Day 1 reinforced one of our core developer-centered goals: remove friction across the stack so teams can iterate faster, ship smoother, and learn more from the players who show up.
Hands-first design makes interaction feel immediate. Agentic workflows reduce setup time and accelerate iteration. Performance tooling makes profiling a repeatable habit. Analytics helps teams turn retention questions into specific fixes and measurable outcomes.
Up next in Day 2: tools to speed up your builds, AI integrations for your workflow, a look at how Meta Horizon Store discovery actually works, and the LiveOps playbook behind Gorilla Tag's success. Stay tuned on X and Facebook for the latest updates, and check back soon for full session recordings!