Introducing the Meta Wearables Device Access Toolkit
Ray-Ban Meta glasses have set a new benchmark in wearable technology, achieving millions in sales worldwide and leading a cultural shift by seamlessly blending iconic fashion with cutting-edge innovation. With their open-ear speakers, hands-free camera capture and on-the-go AI assistant, AI glasses are becoming an integral part of everyday life and making advanced, intuitive devices appealing and accessible to all—introducing a new era where technology and personal style converge.
Given the success we’ve seen with this form factor, we want to deliver a platform where you, as a developer, can build experiences that extend the capabilities of AI glasses to people using your mobile applications.
Join Our Developer Preview
As part of our continued investment in our AI glasses line-up, we’re excited to announce the developer preview of our Meta Wearables Device Access Toolkit, which will be made available later this year. Find our FAQ here.
Our first version of the toolkit will open up access to a suite of on-device sensors, empowering you to start building features within your mobile apps that take advantage of the hands-free benefits of AI glasses. With our toolkit, you'll be able to tap into the wearer's natural perspective, the clarity of open-ear audio, and microphone access to:
Offer Distinctive Hands-Free Experiences: Design point-of-view (POV) experiences built around the camera that captures the wearer's viewpoint (see the sketch after this list).
Facilitate Seamless Interaction: Enable hands-free information retrieval and communication, making mobile experiences feel like a natural extension of the end user.
Broaden Your Mobile App Capabilities: Extend the functionality of your existing mobile applications into the physical world, unlocking new and exciting use cases made possible by the glasses form factor.
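To make the shape of such an integration concrete, here is a minimal Kotlin sketch of how a mobile app might consume the glasses' camera and microphone streams and respond over the open-ear speakers. The toolkit's actual packages, classes, and method names aren't public yet, so every identifier here (WearablesToolkit, GlassesSession, openCameraStream, and so on) is a hypothetical placeholder rather than the real API.

```kotlin
// Hypothetical sketch only: the toolkit's real package names, classes, and method
// signatures are not public yet, so every identifier below is an illustrative placeholder.

// Placeholder interfaces standing in for whatever surface the SDK actually exposes.
interface GlassesSession {
    fun openCameraStream(onFrame: (ByteArray) -> Unit)    // wearer's POV frames
    fun openMicStream(onAudioChunk: (ByteArray) -> Unit)  // microphone audio
    fun speak(text: String)                                // reply over the open-ear speakers
    fun close()
}

interface WearablesToolkit {
    fun connect(onReady: (GlassesSession) -> Unit)         // pair with the glasses from the phone app
}

// Example integration: capture the wearer's point of view, run the app's own
// analysis on each frame, and read the result back through open-ear audio.
fun startHandsFreeCapture(toolkit: WearablesToolkit) {
    toolkit.connect { session ->
        session.openCameraStream { frame ->
            val tip = describeFrame(frame)   // app-side logic, not part of the SDK
            session.speak(tip)
        }
        session.openMicStream { audio ->
            handleVoiceQuery(audio)          // e.g. hands-free information retrieval
        }
    }
}

// App-side stubs so the sketch stands alone.
fun describeFrame(frame: ByteArray): String = "Example description of the scene"
fun handleVoiceQuery(audio: ByteArray) { /* route audio to the app's own services */ }
```

The point of the sketch is the division of labor: the glasses provide capture and playback, while the analysis and business logic stay in your mobile app.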
Shaping the Future of AI Glasses
The developer preview is designed for exploration and early development so we can build the future of this toolkit based on your feedback. During the preview, you’ll be able to access the toolkit, build prototypes, test sensor-based experiences, and distribute to testers using our beta testing platform in the Wearables Developer Center. Publishing will be available to limited audiences in the preview phase so that we can responsibly test, learn, and refine our toolkit.
You’ll be able to access a suite of resources designed to support your development journey with AI glasses.
Early Access to the SDK: This toolkit is designed to streamline your development process, offering pre-built libraries and sample apps to get you started.
Early Access to Documentation: Dive deep into the technical specifications and functionalities of the Meta Wearables Device Access Toolkit. This early access will allow you to thoroughly understand the API architecture, available endpoints, data structures, and best practices.
Dedicated Testing Tools and Environments: We’ll provide dedicated testing tools and environments during the preview. These resources will allow you to test your applications in a controlled and isolated setting, identifying and resolving any potential issues before general availability. This should contribute to a smoother launch for your products and an optimal user experience.
We’re launching the preview ahead of opening up publishing to general availability in 2026, so you’ll have ample time to familiarize yourself with the platform and experiment with building unique integrations.
While accessing the Meta AI capabilities of our glasses, including voice commands, isn’t part of our initial developer preview, it’s a key area we’re exploring for future updates. This early release is all about building the right foundation, and that means working closely with our developer community to understand what matters most. Your feedback will directly influence how the platform evolves. Learn more about our upcoming developer preview and toolkit here.
We can’t wait to see what you build. Join us as we shape the next computing platform together.
Explore Developer Spotlights
We're grateful to our select early partners for the integral role they played in shaping our first toolkit designed for AI glasses. Disney Imagineering’s R&D team, for example, has been working on early prototypes that explore how our AI glasses can surface easy-to-access tips for guests in the parks.
As shown in the demo video above, our valued partners have been a constant source of inspiration. We’re delighted to spotlight some of their testimonials about their experiences with our toolkit below.
We’re just scratching the surface of what’s possible for golfers. The Meta Wearables Device Access Toolkit’s camera and voice capabilities allow 18Birdies to deliver real-time yardages, club recommendations, and seamless social capture, all while letting golfers stay in the moment and focus solely on their game. The potential to transform how golfers play, learn, and connect through this toolkit is almost limitless.
Meta's wearables aren't just about cool tech. For the blind and low vision community that we build with, they represent an incredibly valuable tool for empowerment and independence. This opportunity further extends the great partnership between Meta and Be My Eyes that was instrumental in bringing the first accessibility app to the AI glasses. Going forward, we see this toolkit potentially helping us deploy into more mutual relationships with enterprise customers, both on the customer service side and in our upcoming workplace offerings. They offer a chance to improve the lives of the nearly 1 million people who are blind or have low vision on the Be My Eyes platform, a community that grows by more than 16,000 people every month.
We have always believed that wearables would be the future of assistive technologies for the visually impaired. Ray-Ban Meta, with its excellent style, comfort, and intuitive user interface, was our most sought-after device. We think this toolkit will open the AI glasses’ possibilities to more developers. By giving developers easy access to the AI glasses’ camera sensors and audio input and output, we will see a new generation of applications, games, and productivity content that we haven’t even dreamed of yet!
Louis-Philippe Massé, VP of Innovation and Marketing at HumanWare
Streamlabs exists to help creators do more with less. By integrating the Meta Wearables Device Access Toolkit, we can provide hands-free streaming, dynamic overlays, and multistreaming options, enabling audiences to enjoy richer, more interactive experiences. It expands what’s possible for both creators and viewers. With just a phone and smart glasses, creators can achieve broadcast-quality production, from switching perspectives and layering effects to streaming across platforms in real time. It’s hardware and software converging to unlock truly immersive creator experiences.
Guided by our company's mission to explore emerging technologies that improve human life, we were compelled to try the Meta Wearables Device Access Toolkit after experiencing the current generation of devices, which are the best we've seen in a long time. The decision was solidified when we saw non-technical clients intuitively interact with the wearables and immediately begin brainstorming applications for their own work, making it clear we needed to build on this platform.
Nearly a decade ago, Microsoft launched Seeing AI, a visual assistant created with and for the blind community, to harness the power of artificial intelligence to create a more accessible world. Hands-free experiences have been a top customer request since launch. This is why we're excited to partner with Meta to use the Wearables Device Access Toolkit to pair their glasses with Seeing AI, opening up new possibilities for daily life.
We have a passion for crafting technologically engaging user experiences for our clients. The Meta Wearables Device Access Toolkit and its integration with AI glasses is an exciting new avenue that we can explore, both for project updates and for new activations. Our team is looking forward to the innovative products and software that the industry will create with this toolkit! We are most excited to continue exploring the SDK and its applications as more features and capabilities become available with future generations of wearables.
Spencer Evans, Director of Gaming Development at Pixel and Texel
Twitch is all about authentic, real-time content, and IRL streaming is one of our fastest-growing categories. Collaborating with Meta on the AI glasses product was a natural step for us, giving creators new ways to connect with their communities.
The creativity of our partners has inspired us every step of the way, and their innovation drives us forward. We’re eager to see what the broader developer community will build next. Fill out our interest form to be among the first to build with the Meta Wearables Device Access Toolkit.