As of August 31, 2022, Mobile SDK and the VrApi library are no longer supported. Future updates will be delivered through OpenXR extensions and our OpenXR Mobile SDK, not through any new updates to Meta Mobile or PC APIs.
New apps must use OpenXR unless a waiver is granted.
New apps will not have access to Meta Native Mobile APIs, but existing apps can continue using them.
No assistance will be provided for creating new apps with Meta Native APIs. You will find recommendations for migrating existing apps to OpenXR in the developer guides.
Only critical security, privacy, or safety issues in Meta Native APIs will be addressed.
Testing of Meta Native Mobile APIs will be restricted to automated QA tests that ensure core features remain functional.
The Quest system shows an on-screen system keyboard when a text field in Home receives focus. For example, the keyboard overlays Home when a user selects the search option.
You can also display the system keyboard on top of your apps to enable user text entry with this familiar and consistent interface. The keyboard overlay provides the same set of capabilities that you see in the system keyboard displayed on Home including voice dictation, smartphone input, and multiple language layouts.
Keyboard Overlay allows immersive apps to request that a keyboard rendered by the VrShell overlay handle text input. To use this feature, Native app developers have three options and workflows:
Use reprojected Android UI.
Use a hidden Android UI and enable the Native app to update from the contents of the hidden field(s).
Write a custom wrapper for the Android input stack.
This guide covers the first option: reprojecting Android UI through Android SurfaceTexture, so that Views are rendered to a Surface that shares a buffer with an OpenGL texture.
Prerequisites
To use the Keyboard Overlay in Native apps, you must have a Meta Quest or Meta Quest 2 headset running software v27 or higher.
Android Setup
Manifest
The AndroidManifest.xml file must declare the following feature to enable the keyboard overlay.
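For example (the oculus.software.overlay_keyboard feature name shown here is the keyboard-overlay capability on Quest; confirm it against your SDK version):

<uses-feature
    android:name="oculus.software.overlay_keyboard"
    android:required="false" />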
The android:required="false" setting ensures that the app will run even if the feature is missing.
MainActivity
To reproject Android UI, you must create an OpenGL texture with the GL_TEXTURE_EXTERNAL_OES target and store the returned GLuint texture name. The snippet below is written in C, but you can create the texture from any OpenGL library that lets you specify the GL_TEXTURE_EXTERNAL_OES target.
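A minimal sketch of that step (the function name is illustrative; the calls are standard OpenGL ES):

#include <GLES2/gl2.h>
#include <GLES2/gl2ext.h> // defines GL_TEXTURE_EXTERNAL_OES

GLuint CreateAndroidUITexture(void) {
    GLuint textureName = 0;
    glGenTextures(1, &textureName);
    glBindTexture(GL_TEXTURE_EXTERNAL_OES, textureName);
    // External textures support only linear, non-mipmapped filtering
    // and clamp-to-edge wrapping.
    glTexParameteri(GL_TEXTURE_EXTERNAL_OES, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_EXTERNAL_OES, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_EXTERNAL_OES, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_EXTERNAL_OES, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    return textureName; // store this name for the SurfaceTexture step below
}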
Then, you must create an Android SurfaceTexture constructed with the texture name you stored previously. Draw the Android UI to the SurfaceTexture and sample the texture using the GL_OES_EGL_image_external extension.
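A sketch of the two pieces involved, with illustrative names (the original sample is not reproduced here): on the Java side, a SurfaceTexture wrapping the stored texture name plus the Surface the Views draw into; on the GL side, a fragment shader that declares the extension, samples a samplerExternalOES, and blends in a cursor at a hit-tested UV position (the uCursorUV uniform is a hypothetical name).

// Java side: wrap the GL texture name in a SurfaceTexture and a Surface.
private SurfaceTexture mUiSurfaceTexture;
private Surface mUiSurface;

void createUiSurface(int glTextureName, int widthPx, int heightPx) {
    mUiSurfaceTexture = new SurfaceTexture(glTextureName);
    mUiSurfaceTexture.setDefaultBufferSize(widthPx, heightPx);
    // Drawing the View hierarchy into this Surface fills the GL texture.
    mUiSurface = new Surface(mUiSurfaceTexture);
}

// Fragment shader: sample the external texture and draw a cursor.
#extension GL_OES_EGL_image_external : require
precision mediump float;

uniform samplerExternalOES uUiTexture;
uniform vec2 uCursorUV; // hit-tested cursor position in UV space
varying vec2 vTexCoord;

void main() {
    vec4 color = texture2D(uUiTexture, vTexCoord);
    // Blend a small circular cursor on top of the reprojected UI.
    if (distance(vTexCoord, uCursorUV) < 0.01) {
        color = mix(color, vec4(1.0), 0.75);
    }
    gl_FragColor = color;
}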
Note: This shader also includes support for a cursor calculated from hit-testing the geometry. You must simulate input to the Android UI by injecting MotionEvents based on your preferred input hit testing.
The OpenGL texture must be updated with the latest Surface buffer every update/frame using the SurfaceTexture.updateTexImage() method.
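For example, once per frame on the thread that owns the OpenGL context (assuming the mUiSurfaceTexture field from the sketch above):

// Latch the most recent Surface frame into the GL_TEXTURE_EXTERNAL_OES texture.
mUiSurfaceTexture.updateTexImage();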
You must simulate input by determining the U and V coordinates of the collision and injecting MotionEvents into the Window on top of the Android UI View hierarchy. Below is an example of hit-testing a ray pointing straight out from the controller pose provided by VrApi or OpenXR.
IntersectLineQuad converts the provided API types into standard 3-vectors: the origin of the ray line, the direction the line points from that origin, and the three 3-vector coordinates of the quad's upper-left triangle. It returns the hit through an out parameter in barycentric coordinates, which an additional ConvertBarycentricToUV function then converts to UV coordinates. The latter takes the shortcut of assuming the geometry the Android UI is drawn on is perpendicular to the Z-axis, so you will need to update it to function anywhere in render space.
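A sketch of helpers along those lines, using plain C vector types (the Vec3 type, the function bodies, and the vertex order of upper-left, upper-right, lower-left are assumptions; converting an ovrPosef or XrPosef controller pose into an origin and direction is omitted). Because it derives UV directly from the barycentric weights, this sketch does not need the Z-axis shortcut:

#include <math.h>
#include <stdbool.h>

typedef struct { float x, y, z; } Vec3;

static Vec3 Sub(Vec3 a, Vec3 b) { return (Vec3){a.x - b.x, a.y - b.y, a.z - b.z}; }
static float Dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
static Vec3 Cross(Vec3 a, Vec3 b) {
    return (Vec3){a.y * b.z - a.z * b.y, a.z * b.x - a.x * b.z, a.x * b.y - a.y * b.x};
}

// Intersects the line (origin, dir) with the planar quad whose upper-left
// triangle is v0 = upper-left, v1 = upper-right, v2 = lower-left
// (Moller-Trumbore, extended to the full quad). On a hit, baryOut holds
// the barycentric weights of (v0, v1, v2).
static bool IntersectLineQuad(Vec3 origin, Vec3 dir,
                              Vec3 v0, Vec3 v1, Vec3 v2,
                              Vec3* baryOut) {
    const Vec3 e1 = Sub(v1, v0);
    const Vec3 e2 = Sub(v2, v0);
    const Vec3 pvec = Cross(dir, e2);
    const float det = Dot(e1, pvec);
    if (fabsf(det) < 1e-6f) {
        return false; // line is (nearly) parallel to the quad
    }
    const Vec3 tvec = Sub(origin, v0);
    const Vec3 qvec = Cross(tvec, e1);
    const float u = Dot(tvec, pvec) / det; // weight of v1 (upper-right)
    const float v = Dot(dir, qvec) / det;  // weight of v2 (lower-left)
    // For a rectangular quad, points inside it have both weights in [0, 1].
    if (u < 0.0f || u > 1.0f || v < 0.0f || v > 1.0f) {
        return false;
    }
    *baryOut = (Vec3){1.0f - u - v, u, v};
    return true;
}

// With v0 at UV (0,0), v1 at (1,0), and v2 at (0,1), the UV of the hit
// point is simply the pair of (v1, v2) barycentric weights.
static void ConvertBarycentricToUV(Vec3 bary, float* uOut, float* vOut) {
    *uOut = bary.y;
    *vOut = bary.z;
}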
Simulate ACTION_HOVER_* MotionEvents for motion and use the preferred button input (for example, the trigger) to fire simulated ACTION_DOWN and ACTION_UP MotionEvents. The following code demonstrates how to submit simulated input to the Android UI.
private synchronized void submitMotionEvent(final float u, final float v, final int action) {
    if (!getTestDialogCreated()) {
        return;
    }
    // Map the unit UV hit coordinates into view pixels, scaling both axes
    // by the larger view dimension.
    final float width = mTestDialog.getWidth();
    final float height = mTestDialog.getHeight();
    final float majorDimension = width > height ? width : height;
    final float x = u * majorDimension;
    final float y = v * majorDimension;
    final long now = SystemClock.uptimeMillis();
    final MotionEvent event = MotionEvent.obtain(mLastDown, now, action, x, y, 0);
    event.setSource(InputDevice.SOURCE_CLASS_POINTER);
    // MotionEvents must be dispatched on the UI thread.
    runOnUiThread(
        new Runnable() {
            @Override
            public void run() {
                if (action == MotionEvent.ACTION_HOVER_MOVE) {
                    // Hover events travel through the generic-motion path,
                    // not the touch path.
                    mTestDialog.dispatchGenericMotionEvent(event);
                } else {
                    mTestDialog.dispatchTouchEvent(event);
                }
                event.recycle();
            }
        });
}

// Called when the preferred button (for example, the trigger) changes state.
public void onTrigger(final float u, final float v, final boolean down) {
    final int action = down ? MotionEvent.ACTION_DOWN : MotionEvent.ACTION_UP;
    if (down) {
        // Record the down time so later events in the gesture share it.
        mLastDown = SystemClock.uptimeMillis();
        submitMotionEvent(u, v, action);
    } else {
        submitMotionEvent(u, v, action);
        mLastDown = 0;
    }
}

// Called every frame the pointer ray intersects the UI quad.
public void onMove(final float u, final float v) {
    if (mLastDown != 0) {
        // A press is in progress: report a drag.
        submitMotionEvent(u, v, MotionEvent.ACTION_MOVE);
    } else {
        submitMotionEvent(u, v, MotionEvent.ACTION_HOVER_MOVE);
    }
}
Preview Keyboard Overlay
To test or preview the Keyboard Overlay feature, do the following:
Open the app in which you’ve implemented the Keyboard Overlay feature.
Point the cursor at the editable UI text element and confirm that the system Keyboard Overlay appears.