About the Meta Wearables Device Access Toolkit

About our developer preview

The developer preview will go live later this year. To get notified, sign up for our interest form.

  • Access to the SDK: Streamline your development process with pre-built libraries and sample apps.
  • Early Access to Documentation: Understand the API architecture, available endpoints, data structures, and best practices. Kick-start your development with sample apps and tutorials.
  • Dedicated Testing Tools and Environments: Test applications in a controlled setting during the preview to readily identify and resolve issues.

Developers in markets where AI glasses are not sold will have limited access to some features, including those on the Wearables Developer Center.

Publishing of integrations will be limited to select partners during our developer preview while we focus on building, testing, and gathering your feedback. We expect to release the ability to publish to the broader community in 2026.

We’re exploring opportunities to host bootcamps, hackathons, and other community events around the SDK. Stay connected with us for future announcements and opportunities at developers.meta.com/wearables/notify.

Coming Soon

The Meta Wearables Device Access Toolkit is coming soon. This FAQ is provided in advance to help you prepare for the upcoming developer preview. Please check back for updates on how to access the toolkit and begin development.

The Wearables Device Access Toolkit enables developers to build mobile apps that connect to our AI glasses hardware sensors. You can create hands-free experiences, such as livestreaming, video capture, and AI-assisted applications.

We plan to support our entire portfolio of AI glasses:
  • Ray-Ban Meta
  • Oakley Meta HSTN
  • Oakley Meta Vanguard
  • Meta Ray-Ban Display

Developers will initially have camera access via the toolkit and will be able to access the microphone and speakers through standard iOS and Android Bluetooth profiles.

The Wearables Device Access Toolkit supports the Android and iOS mobile platforms, with the same OS version requirements as the Meta AI app (Android 10, and iOS 15.2 with Swift 6). Supported AI glasses must run compatible firmware, which will be detailed in our documentation when the preview goes live later this year.
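As a concrete illustration, a minimal Kotlin gate for the Android 10 baseline (API level 29) could look like the sketch below; the toolkit may enforce its own checks, so treat this only as a guard for your app's code paths.

```kotlin
import android.os.Build

// Minimal sketch: only enable toolkit-dependent features on Android 10+
// (API level 29, Build.VERSION_CODES.Q), matching the baseline noted above.
fun isSupportedAndroidVersion(): Boolean =
    Build.VERSION.SDK_INT >= Build.VERSION_CODES.Q
```

On iOS, the equivalent constraint is simply the deployment target (15.2) configured in your project.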

Sample apps with AI glasses integrations, as well as tutorial materials, will be provided to help developers get started quickly.

You can start development using the SDK with simulated devices via the Mock Device Kit, which allows testing without hardware. For full integration, a compatible device is recommended.

Full documentation will be available at developers.meta.com/wearables when the developer preview goes live later this year.

The SDK will include a Mock Device Kit that can be used for automated testing and for app testing without a physical device paired to your phone.

Core Features & Capabilities

Initially, you will have camera access via the toolkit, and you can access the microphone and speakers on our AI glasses via standard iOS and Android Bluetooth profiles. We'll continue to listen to feedback from the developer community as we improve the experience and evaluate additional capabilities for our longer-term roadmap.
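Because audio goes through the platform's standard Bluetooth profiles rather than the toolkit, existing mobile audio APIs already apply. The Kotlin sketch below routes communication audio to a paired Bluetooth device and records from its microphone; it assumes the glasses are already paired via the Meta AI app, that RECORD_AUDIO is granted, and it uses the classic SCO calls still available at the Android 10 baseline (deprecated on newer API levels in favor of setCommunicationDevice).

```kotlin
import android.content.Context
import android.media.AudioManager
import android.media.MediaRecorder

// Sketch: capture audio from a paired Bluetooth device's microphone using the
// standard Android Bluetooth SCO profile, then record it to a file.
@Suppress("DEPRECATION") // startBluetoothSco/isBluetoothScoOn are deprecated on API 31+
fun recordFromBluetoothMic(context: Context, outputPath: String): MediaRecorder {
    val audioManager = context.getSystemService(Context.AUDIO_SERVICE) as AudioManager
    audioManager.mode = AudioManager.MODE_IN_COMMUNICATION
    audioManager.startBluetoothSco()   // route call audio over the Bluetooth headset profile
    audioManager.isBluetoothScoOn = true

    return MediaRecorder().apply {
        setAudioSource(MediaRecorder.AudioSource.VOICE_COMMUNICATION)
        setOutputFormat(MediaRecorder.OutputFormat.MPEG_4)
        setAudioEncoder(MediaRecorder.AudioEncoder.AAC)
        setOutputFile(outputPath)
        prepare()
        start()
    }
}
```

When recording ends, stop and release the MediaRecorder and call stopBluetoothSco() to restore normal audio routing.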

Accessing Meta AI capabilities via “Hey Meta” invocations will not be a part of our toolkit this year, but we’ll continue to listen to feedback from the developer community and improve the experience. Meanwhile, you can explore AI model integration in the Llama developer center.

Access to display capability is currently out of scope for the preview release. We're tracking developer interest and feedback closely as we plan upcoming iterations. You will still be able to access the camera sensors of Meta Ray-Ban Display glasses via the Meta Wearables Device Access Toolkit, and their audio and microphone via Bluetooth.

Not during the preview release, but we'll continue to listen to feedback from the developer community to improve the experience.

You can access the device's microphones to create voice commands in your app, but you won't be able to create custom voice commands for Meta AI. While custom gesture controls like taps and swipes aren't offered, you can listen for standard events like pause, resume, and stop. We're evaluating additional capabilities for the future and will consider feedback as we determine our roadmaps.
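For in-app voice commands, platform speech APIs can sit on top of that Bluetooth audio route. The sketch below uses Android's SpeechRecognizer to map two illustrative phrases to app actions; the phrases and callback wiring are assumptions for your own app, not toolkit features, and RECORD_AUDIO must be granted.

```kotlin
import android.content.Context
import android.content.Intent
import android.os.Bundle
import android.speech.RecognitionListener
import android.speech.RecognizerIntent
import android.speech.SpeechRecognizer

// Sketch: listen for simple in-app voice commands with Android's SpeechRecognizer.
fun startCommandListener(context: Context, onCommand: (String) -> Unit): SpeechRecognizer {
    val recognizer = SpeechRecognizer.createSpeechRecognizer(context)
    recognizer.setRecognitionListener(object : RecognitionListener {
        override fun onResults(results: Bundle?) {
            val heard = results
                ?.getStringArrayList(SpeechRecognizer.RESULTS_RECOGNITION)
                ?.firstOrNull()
                ?.lowercase() ?: return
            when {
                "start recording" in heard -> onCommand("start")  // illustrative phrases
                "stop recording" in heard -> onCommand("stop")
            }
        }
        // Remaining callbacks are no-ops for brevity.
        override fun onReadyForSpeech(params: Bundle?) {}
        override fun onBeginningOfSpeech() {}
        override fun onRmsChanged(rmsdB: Float) {}
        override fun onBufferReceived(buffer: ByteArray?) {}
        override fun onEndOfSpeech() {}
        override fun onError(error: Int) {}
        override fun onPartialResults(partialResults: Bundle?) {}
        override fun onEvent(eventType: Int, params: Bundle?) {}
    })
    val intent = Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH).apply {
        putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL, RecognizerIntent.LANGUAGE_MODEL_FREE_FORM)
    }
    recognizer.startListening(intent)
    return recognizer
}
```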

We expect that using AI in your Meta Wearables Device Access Toolkit integration will improve the user experience. You can leverage the Llama API (see the Llama developer center), either through your own APIs or through those provided by third parties.
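As a rough illustration of the cloud route, the sketch below posts a prompt to a hosted Llama chat-completions endpoint. The URL, model name, and request shape are assumptions modeled on a generic chat-completions API; check the Llama developer center for the real values, use a JSON library for encoding, and keep network calls off the main thread.

```kotlin
import java.net.HttpURLConnection
import java.net.URL

// Sketch: send a prompt (for example, a transcript captured from the glasses)
// to a hosted Llama model. The endpoint and model name below are assumptions.
fun askLlama(apiKey: String, prompt: String): String {
    val url = URL("https://api.llama.com/v1/chat/completions") // assumed endpoint
    val escaped = prompt.replace("\"", "\\\"")                 // naive escaping; use a JSON library
    val body = """{"model": "llama-3.3-70b", "messages": [{"role": "user", "content": "$escaped"}]}"""

    val conn = url.openConnection() as HttpURLConnection
    conn.requestMethod = "POST"
    conn.doOutput = true
    conn.setRequestProperty("Authorization", "Bearer $apiKey")
    conn.setRequestProperty("Content-Type", "application/json")
    conn.outputStream.use { it.write(body.toByteArray()) }
    return conn.inputStream.bufferedReader().use { it.readText() } // raw JSON response
}
```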

Integration & APIs

You’ll have access to a core API for connecting your app to a user's device and managing interactions with the device, as well as APIs for specific capabilities like camera access. We'll also be providing an API for simulating a device to aid in testing.
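Since the toolkit is not yet published, none of its real type or function names are available here. Purely as a hypothetical sketch of how the described pieces (a core connection API, capability APIs such as camera access, and a simulated device for testing) might fit together in application code:

```kotlin
// Hypothetical shapes only: these interfaces are NOT toolkit APIs. They show how an
// app might isolate its own code behind a connection abstraction and a camera
// capability, so a simulated device can be swapped in for automated tests.
interface GlassesDevice {
    fun openCameraStream(onFrame: (ByteArray) -> Unit)
    fun close()
}

interface DeviceConnector {
    fun connect(onConnected: (GlassesDevice) -> Unit)
}

// Stand-in for the Mock Device Kit idea: replays a canned frame without hardware.
class MockDeviceConnector(private val cannedFrame: ByteArray) : DeviceConnector {
    override fun connect(onConnected: (GlassesDevice) -> Unit) {
        onConnected(object : GlassesDevice {
            override fun openCameraStream(onFrame: (ByteArray) -> Unit) = onFrame(cannedFrame)
            override fun close() {}
        })
    }
}
```

Structuring app code against small abstractions like these should make it straightforward to drop in the real connection and mock APIs once the preview documentation is live.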

Developers can process data locally or via cloud or edge platforms. However, the Meta AI app must still be used to pair your glasses.