The wait is over: Build for display glasses starting today
Developers have been experimenting and building hands-free experiences on our AI glasses using camera, audio, and voice. Now, we’re excited to offer a way to present information visually, directly in the moment.
We're rolling out access to the display on Meta Ray-Ban Display glasses through two build paths, both in developer preview: one for mobile apps and, new today, one for Web Apps. You can create display experiences using familiar tools, whether you're extending an existing iOS or Android mobile app or building something entirely new.
You can get a head start building and testing display experiences on AI glasses already worn by people around the world. You don't need a dev kit and an imagined future to build against: these glasses already exist.
Availability is rolling out over the coming weeks, so start building early to help define the future of wearables development. If you run into any issues, let us know by sharing your feedback and reporting bugs through the Developer Center.
Video: apps delivering hands-free experiences on our AI glasses using camera, audio, and voice.
A new input model for AI glasses: gesture controls via Meta Neural Band
Display is only part of the story. How users interact matters just as much.
Meta Ray-Ban Display glasses support gesture controls powered by surface electromyography (EMG) via Meta Neural Band, enabling effortless input through subtle finger and hand movements.
For developers, this opens up a unique interaction model that doesn't rely on touchscreens, voice, or capacitive touch. You can design experiences that respond to simple gestures, enabling discreet, immediate control in real-world contexts without speaking or reaching for the glasses.
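The toolkit's actual gesture event names and payloads aren't specified here, so the sketch below is a hypothetical, toolkit-agnostic pattern: a small dispatch table that routes Neural Band-style gesture events to app actions. The `gesture` and `direction` field names are assumptions for illustration.

```javascript
// Hypothetical sketch: "swipe" and "pinch" are assumed gesture names, not
// documented toolkit identifiers. The pattern shown (a dispatch table that
// routes gesture events to handlers) is toolkit-agnostic.

function createGestureRouter(actions) {
  // actions: map of gesture name -> handler function
  return function route(event) {
    const handler = actions[event.gesture];
    if (!handler) return { handled: false, gesture: event.gesture };
    handler(event);
    return { handled: true, gesture: event.gesture };
  };
}

// Example wiring for a simple glanceable list UI.
const state = { index: 0, items: ["Scores", "Transit", "Timer"] };
const route = createGestureRouter({
  // Swipe left advances the highlight, swipe right moves it back (wrapping).
  swipe: (e) => {
    const step = e.direction === "left" ? 1 : -1;
    state.index = (state.index + step + state.items.length) % state.items.length;
  },
  pinch: () => {
    // Open the highlighted item (left as an exercise).
  },
});

route({ gesture: "swipe", direction: "left" }); // state.index is now 1
```

Keeping the router a pure dispatch table means the same app logic works no matter which input source (gesture, voice, or touch) eventually fires the event.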
What you can build
With access to the display, developers can start exploring a wide range of use cases:
Information overlays
Real-time data displays, like scores or status updates
Micro-apps, utilities, and experimental interactions
Streaming media
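As one concrete illustration of a real-time data display from the list above, here is a minimal, device-agnostic sketch that formats a live score line for a glanceable screen. The 24-character line budget and the score layout are assumptions for illustration, not a published device spec.

```javascript
// Hypothetical sketch: format a live score update for a small glanceable
// display. The 24-character line budget is an illustrative assumption.

const LINE_BUDGET = 24;

function fitLine(text, budget = LINE_BUDGET) {
  // Truncate with an ellipsis so updates never overflow a display line.
  return text.length <= budget ? text : text.slice(0, budget - 1) + "…";
}

function formatScore({ home, away, homePts, awayPts, clock }) {
  return fitLine(`${home} ${homePts}–${awayPts} ${away} · ${clock}`);
}

console.log(
  formatScore({ home: "BOS", away: "NYK", homePts: 101, awayPts: 99, clock: "Q4 2:31" })
); // prints "BOS 101–99 NYK · Q4 2:31"
```

Budgeting characters up front keeps a streaming data source from ever pushing layout decisions onto the display at render time.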
Two ways to build for display glasses
Whether you're an iOS, Android, or web developer, you can build for Meta Ray-Ban Display glasses without learning a new language or a new framework. Your existing stack works for both build paths.
The Meta Wearables Device Access Toolkit
We’ve added display capabilities to the Meta Wearables Device Access Toolkit, the native mobile SDK for iOS and Android that lets you extend your existing app onto the glasses display. Using the toolchains you already know (Swift for iOS, Kotlin for Android), you can now add display UI components including text, images, lists, buttons, and video playback. Combined with the existing toolkit capabilities (camera, audio, and voice), the Device Access Toolkit gives mobile developers deep hardware integration that no other AI glasses SDK can match.
Web Apps
A new path for developers who want to build standalone experiences from scratch using standard HTML, CSS, and JavaScript. No proprietary framework to learn. No new language. You can build games, transit tools, cooking guides, grocery lists, instrument practice aids, and more, with access to motion and orientation data, phone GPS, input from the Meta Neural Band, and local storage. Because it's web-native, iteration is fast: build and preview in your browser, then deploy to your glasses via a URL. This path is ideal for rapid prototyping, lightweight utilities, and entirely new categories of on-device experiences.
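The motion, orientation, and local-storage access described above maps to standard browser APIs. The sketch below pairs a pure heading-to-cardinal helper with guarded `deviceorientation` and `localStorage` wiring; whether the glasses runtime exposes every field (such as `alpha`) is an assumption to confirm against the official documentation.

```javascript
// Sketch using web-standard APIs (DeviceOrientationEvent, localStorage).
// Which fields the glasses runtime supports is an assumption to verify.

// Pure helper: convert a 0-360 degree heading to a cardinal direction.
function toCardinal(headingDeg) {
  const dirs = ["N", "NE", "E", "SE", "S", "SW", "W", "NW"];
  const idx = Math.round((((headingDeg % 360) + 360) % 360) / 45) % 8;
  return dirs[idx];
}

// Browser-only wiring, guarded so the helper stays testable outside a browser.
if (typeof window !== "undefined") {
  window.addEventListener("deviceorientation", (e) => {
    // e.alpha is rotation around the z-axis in degrees (0-360), or null.
    if (e.alpha != null) {
      const heading = 360 - e.alpha; // approximate compass heading
      document.title = toCardinal(heading);
      localStorage.setItem("lastHeading", String(heading)); // persist locally
    }
  });
}
```

Splitting pure logic from event wiring like this is what makes the fast browser-preview loop work: the same file runs in a desktop tab, a test runner, and on the glasses.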
Availability is rolling out starting today; see the linked documentation for full details:
Device Access Toolkit: Check out the GitHub repos for iOS and Android to integrate the SDK, and visit the documentation for more details.
Web Apps: Access the starter kit on GitHub to start building with your favorite AI coding tools including Claude Code, Cursor, GitHub Copilot, and more, and visit the documentation for more details.
Not sure which path is right for you? If you have an existing iOS or Android app and want to extend it onto the glasses display, start with the Device Access Toolkit. If you're building something new, or just want to move fast with web tools, Web Apps is your path.
Even in Developer Preview, you can share Web Apps via password-protected URLs and Device Access Toolkit builds via release channels with up to 100 testers. The developers who build, test, and learn in this early phase will have an outsized impact on what this category becomes.
We can't wait to see what you build. If you’re new to building for our AI glasses, start here: developer.meta.com/wearables.