Keyboard Tracking sample overview
Updated: May 7, 2026
This sample demonstrates event-driven keyboard detection and bounding volume visualization using Mixed Reality Utility Kit (MRUK) in Unity. Developers learn how to subscribe to trackable events, filter for specific anchor types, and toggle between full passthrough and surface-projected passthrough modes.
- Configure MRUK for keyboard tracking by enabling KeyboardTrackingEnabled in the MRUK component
- Use TrackableAdded and TrackableRemoved events for event-driven anchor detection
- Filter trackables by type using OVRAnchor.TrackableType.Keyboard
- Visualize tracked object bounding volumes with LineRenderer wireframes
- Implement surface-projected passthrough cutouts that show real objects through virtual environments
- Check device keyboard tracking support and scene permissions at runtime
- Meta Quest 3 or Meta Quest 3S
- A Unity development environment configured for Meta Quest
For complete setup instructions, see Getting started with Unity development. For SDK versions and build prerequisites, see the sample README.
Clone the Unity MR Utility Kit Sample repository and open the project in Unity. Navigate to Assets/MRUKSamples/KeyboardTracker/ and open KeyboardTracker.unity. Build and deploy to your Meta Quest device. For detailed build and dependency setup, see the sample README.
| File / Scene | What it demonstrates | Key concepts |
|---|---|---|
| KeyboardTracker.unity | Main scene with MRUK keyboard tracking configuration | Event wiring, scene hierarchy, OVRCameraRig setup |
| Scripts/KeyboardManager.cs | Core controller for trackable lifecycle and passthrough toggling | Event-driven detection, prefab instantiation, passthrough mode switching |
| Scripts/Bounded3DVisualizer.cs | Bounding box wireframe and passthrough cutout renderer | VolumeBounds nullable access, LineRenderer configuration, selective passthrough |
| Scripts/CameraFollower.cs | Smooth camera-following UI positioning utility | Lerp-based transform tracking, quaternion slerp for rotation |
| Scripts/Logger.cs | Runtime capability and permission checker with in-scene log display | TrackerConfiguration API, scene permission check, log aggregation |
| Prefabs/KeyboardPrefab.prefab | Instantiated visualization for each detected keyboard | Wireframe outline, passthrough cutout mesh, axes gizmo |
When you run the scene, you see a virtual skybox environment with a floating Canvas displaying instructions and keyboard tracking status. The Logger component checks whether the device supports keyboard tracking and whether scene permission has been granted.
When you place a physical keyboard in view, MRUK fires a TrackableAdded event and the sample instantiates a visualization prefab at the keyboard’s tracked position. In the default surface-projected passthrough mode, you see a box-shaped cutout in the virtual environment showing the real keyboard through a passthrough layer.
Press button A on the controller to toggle to full passthrough mode, which hides the skybox and shows the real world with a colored XYZ axes gizmo marking the keyboard’s position and orientation. When the keyboard leaves the tracked area, MRUK fires TrackableRemoved and the prefab is destroyed.
Event-driven trackable detection
The sample configures the MRUK component with KeyboardTrackingEnabled = true and wires the TrackableAdded and TrackableRemoved UnityEvents to methods in KeyboardManager. This event-driven pattern eliminates polling and ensures immediate response to anchor updates. KeyboardManager filters for keyboards:
if (trackable.TrackableType != OVRAnchor.TrackableType.Keyboard)
return;
See KeyboardManager.cs for the complete event handler implementation.
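The pattern above can be sketched as a pair of handlers wired to MRUK's TrackableAdded and TrackableRemoved UnityEvents. This is a minimal sketch, not the sample's exact code: the `MRUKTrackable` parameter type and the field names (`_prefab`, `_spawned`) are assumptions for illustration.

```csharp
// Sketch of the event-driven lifecycle, assuming MRUK's TrackableAdded /
// TrackableRemoved UnityEvents are wired to these methods in the Inspector.
using System.Collections.Generic;
using UnityEngine;

public class KeyboardManagerSketch : MonoBehaviour
{
    [SerializeField] private GameObject _prefab; // visualization prefab (assumed field)
    private readonly Dictionary<MRUKTrackable, GameObject> _spawned = new();

    // Wired to MRUK's TrackableAdded UnityEvent.
    public void OnTrackableAdded(MRUKTrackable trackable)
    {
        // Ignore anything that is not a keyboard anchor.
        if (trackable.TrackableType != OVRAnchor.TrackableType.Keyboard)
            return;

        // Instantiate the visualization parented to the trackable so it
        // follows subsequent pose updates automatically.
        _spawned[trackable] = Instantiate(_prefab, trackable.transform);
    }

    // Wired to MRUK's TrackableRemoved UnityEvent.
    public void OnTrackableRemoved(MRUKTrackable trackable)
    {
        if (_spawned.Remove(trackable, out var go))
            Destroy(go);
    }
}
```

Keeping a dictionary keyed by trackable makes removal O(1) and supports multiple keyboards if the runtime ever reports more than one.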
Bounding volume visualization
Notice how the sample reads the trackable’s VolumeBounds property (a nullable Bounds?) to size the wireframe and passthrough cutout. After confirming the bounds exist, the sample accesses the value:
var box = _trackable.VolumeBounds.Value;
BoxTransform.localScale = box.size;
The LineRenderer uses 10 points to draw 9 line segments forming a wireframe outline of the bounding box. See Bounded3DVisualizer.cs for the complete edge-tracing logic.
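A simplified sketch of the bounds access is shown below, under the assumption that the trackable type is `MRUKTrackable` and the field names mirror the sample's. For brevity the wireframe traces only the bottom face of the box (5 points); the sample's full edge-tracing strip uses 10 points.

```csharp
// Minimal sketch of sizing the visualization from the nullable VolumeBounds.
// BoxTransform and _lineRenderer are illustrative field names.
using UnityEngine;

public class BoundsVisualizerSketch : MonoBehaviour
{
    [SerializeField] private Transform BoxTransform;     // passthrough cutout transform
    [SerializeField] private LineRenderer _lineRenderer; // wireframe renderer

    public void Apply(MRUKTrackable trackable)
    {
        // VolumeBounds is a Bounds? -- it stays null until the runtime
        // has resolved a 3D volume for the anchor, so check before use.
        if (!trackable.VolumeBounds.HasValue)
            return;

        var box = trackable.VolumeBounds.Value;
        BoxTransform.localPosition = box.center;
        BoxTransform.localScale = box.size;

        // Trace the bottom rectangle of the box as one connected strip.
        var min = box.min;
        var max = box.max;
        _lineRenderer.useWorldSpace = false;
        _lineRenderer.positionCount = 5;
        _lineRenderer.SetPositions(new[]
        {
            new Vector3(min.x, min.y, min.z),
            new Vector3(max.x, min.y, min.z),
            new Vector3(max.x, min.y, max.z),
            new Vector3(min.x, min.y, max.z),
            new Vector3(min.x, min.y, min.z), // close the loop
        });
    }
}
```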
Surface-projected passthrough toggle
The sample demonstrates two passthrough modes controlled by a skybox GameObject. When the skybox is active, the user sees a virtual environment with a passthrough cutout at the keyboard’s position. When the skybox is inactive, full passthrough mode shows the real world. Button A toggles the skybox, and Bounded3DVisualizer syncs the passthrough cutout visibility per-frame. This pattern lets users choose between immersive virtual environments with selective passthrough and full see-through mode with spatial anchors. See KeyboardManager.cs for the toggle logic.
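The toggle itself reduces to flipping one GameObject's active state. A minimal sketch, assuming the skybox environment is rooted under a single `_skybox` GameObject:

```csharp
// Sketch of the Button A passthrough toggle. The _skybox field is an
// assumed name for the virtual environment's root GameObject.
using UnityEngine;

public class PassthroughToggleSketch : MonoBehaviour
{
    [SerializeField] private GameObject _skybox; // virtual environment root

    private void Update()
    {
        // OVRInput.Button.One maps to the A button on the right Touch controller.
        if (OVRInput.GetDown(OVRInput.Button.One))
        {
            // Active skybox: virtual environment with a passthrough cutout.
            // Inactive skybox: full passthrough of the real world.
            _skybox.SetActive(!_skybox.activeSelf);
        }
    }
}
```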
Runtime capability checks
Logger demonstrates how to verify keyboard tracking support and scene permissions at startup. The sample uses OVRAnchor.TrackerConfiguration.KeyboardTrackingSupported to check device capability and Permission.HasUserAuthorizedPermission(OVRPermissionsRequester.ScenePermission) to verify scene permission. This pattern helps diagnose why tracking might not work on a given device or session. See Logger.cs for the permission check implementation.
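The two checks named above can be combined into a small startup script. This is a hedged sketch using the APIs the section names; the log wording and the optional permission request are illustrative additions.

```csharp
// Sketch of startup capability and permission checks for keyboard tracking.
using UnityEngine;
using UnityEngine.Android;

public class CapabilityCheckSketch : MonoBehaviour
{
    private void Start()
    {
        // Does this device/OS build support keyboard tracking at all?
        bool supported = OVRAnchor.TrackerConfiguration.KeyboardTrackingSupported;

        // Has the user granted scene permission for this app?
        bool scenePermission =
            Permission.HasUserAuthorizedPermission(OVRPermissionsRequester.ScenePermission);

        Debug.Log($"Keyboard tracking supported: {supported}");
        Debug.Log($"Scene permission granted: {scenePermission}");

        if (supported && !scenePermission)
        {
            // Without scene permission, MRUK cannot surface keyboard trackables.
            Permission.RequestUserPermission(OVRPermissionsRequester.ScenePermission);
        }
    }
}
```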
- Modify the prefab to attach interactive UI elements to the tracked keyboard, such as a floating action menu that appears based on hand proximity to the keyboard.
- Change the trackable filter in KeyboardManager to detect other anchor types (desks, walls, screens) using different OVRAnchor.TrackableType enum values.
- Combine this sample’s event-driven detection pattern with the Anchor Prefab Spawner sample to build multi-object mixed reality environments with automatic passthrough cutouts for each physical object.