Use Meta XR SDKs to access and handle user input
Updated: Jan 23, 2025
On this page, we provide an overview of the SDKs that enable you to access and handle user input in a Horizon OS app built in Unity.
Apps developed with the Meta XR SDKs can access and handle input from a user’s head, hands, face, and voice using Meta Quest headset and Touch controller technology.
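For example, here is a minimal sketch of polling Touch controller input through the Core SDK's `OVRInput` API. It assumes an OVRCameraRig (with its OVRManager) is already in the scene so that controller state is updated each frame.

```csharp
using UnityEngine;

// Minimal sketch: polls Touch controller input each frame via OVRInput.
// Assumes an OVRCameraRig/OVRManager is present in the scene.
public class ControllerInputExample : MonoBehaviour
{
    void Update()
    {
        // Analog value (0..1) of the right-hand index trigger.
        float trigger = OVRInput.Get(OVRInput.Axis1D.PrimaryIndexTrigger, OVRInput.Controller.RTouch);

        // True only on the frame the A button is pressed down.
        if (OVRInput.GetDown(OVRInput.Button.One, OVRInput.Controller.RTouch))
        {
            Debug.Log($"A button pressed; trigger = {trigger:F2}");
        }
    }
}
```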
Meta XR Interaction SDK includes prefabricated, customizable interaction components (for example, ray, poke, and grab interactions) that you can use when building interactive experiences for your users.
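As a rough sketch of how those components can be used from code, the script below reacts when an Interaction SDK interactable (here a `PokeInteractable` from the `Oculus.Interaction` namespace) enters the Select state. The event wiring shown is one common pattern, not the only one the SDK supports, and the type names should be verified against the SDK version you install.

```csharp
using Oculus.Interaction;
using UnityEngine;

// Sketch: logs a message whenever the referenced interactable is selected
// (e.g. a poke button prefab from the Interaction SDK).
public class SelectLogger : MonoBehaviour
{
    [SerializeField] private PokeInteractable _interactable;

    private void OnEnable()  => _interactable.WhenStateChanged += OnStateChanged;
    private void OnDisable() => _interactable.WhenStateChanged -= OnStateChanged;

    private void OnStateChanged(InteractableStateChangeArgs args)
    {
        if (args.NewState == InteractableState.Select)
        {
            Debug.Log("Interactable selected");
        }
    }
}
```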
Meta XR Movement SDK enables you to incorporate body and face tracking into your app.
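As an illustration, the hedged sketch below reads a single face-tracking expression weight through the `OVRFaceExpressions` component. The method name `TryGetFaceExpressionWeight` and the `JawDrop` expression are assumptions about the face-tracking API surface and may differ between SDK versions.

```csharp
using UnityEngine;

// Hedged sketch: reads one face-tracking expression weight per frame.
// Assumes face tracking is enabled for the project and an OVRFaceExpressions
// component is assigned; API names may vary by SDK version.
public class FaceTrackingExample : MonoBehaviour
{
    [SerializeField] private OVRFaceExpressions _faceExpressions;

    void Update()
    {
        if (_faceExpressions != null &&
            _faceExpressions.TryGetFaceExpressionWeight(
                OVRFaceExpressions.FaceExpression.JawDrop, out float jawDrop))
        {
            // jawDrop is a normalized weight you could drive an avatar blendshape with.
            Debug.Log($"Jaw drop weight: {jawDrop:F2}");
        }
    }
}
```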
Meta XR Voice SDK enables you to bring voice interactions to your app experiences.
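A common pattern, sketched below, is to activate an `AppVoiceExperience` (the Voice SDK's Wit.ai-backed component in the `Oculus.Voice` namespace) and listen for its full-transcription event. Treat the exact component and event names as assumptions to verify against your SDK version.

```csharp
using Oculus.Voice;
using UnityEngine;

// Sketch: starts voice capture on demand and logs the final transcription.
// Assumes an AppVoiceExperience configured with a Wit.ai app is assigned.
public class VoiceCommandExample : MonoBehaviour
{
    [SerializeField] private AppVoiceExperience _voiceExperience;

    private void OnEnable()  => _voiceExperience.VoiceEvents.OnFullTranscription.AddListener(OnTranscription);
    private void OnDisable() => _voiceExperience.VoiceEvents.OnFullTranscription.RemoveListener(OnTranscription);

    // Call this (e.g. from a button press) to begin capturing the user's speech.
    public void StartListening() => _voiceExperience.Activate();

    private void OnTranscription(string transcript)
    {
        Debug.Log($"User said: {transcript}");
    }
}
```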
Meta XR Haptics SDK enables you to incorporate haptic feedback into your app to create immersive and engaging experiences.
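For example, here is a hedged sketch of playing a designed haptic clip with a `HapticClipPlayer`. The class and enum names (from the `Oculus.Haptics` namespace) are assumptions to check against the Haptics SDK version you install.

```csharp
using Oculus.Haptics;
using UnityEngine;

// Hedged sketch: plays an imported .haptic clip on the right Touch controller.
// Assumes a HapticClip asset is assigned in the Inspector; names from the
// Oculus.Haptics namespace may differ slightly between SDK versions.
public class HapticFeedbackExample : MonoBehaviour
{
    [SerializeField] private HapticClip _clickClip;
    private HapticClipPlayer _player;

    private void Start()
    {
        _player = new HapticClipPlayer(_clickClip);
    }

    // Call this when the user confirms an action, for example.
    public void PlayClick() => _player.Play(Controller.Right);

    private void OnDestroy() => _player?.Dispose();
}
```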