Keyboard Overlay Integration (Deprecated)

Updated: Sep 29, 2022
Mobile SDK Deprecation
As of August 31, 2022, Mobile SDK and the VrApi library are no longer supported. Future updates will be delivered through OpenXR extensions and our OpenXR Mobile SDK, not through any new updates to Meta Mobile or PC APIs.
Discontinuing Mobile SDK support means:
  • New apps must use OpenXR unless a waiver is granted.
  • New apps will not have access to Meta Native Mobile APIs, but existing apps can continue using them.
  • No assistance will be provided for creating new apps with Meta Native APIs. You will find recommendations for migrating existing apps to OpenXR in the developer guides.
  • Only critical security, privacy, or safety issues in Meta Native APIs will be addressed.
  • Any testing of Meta Native Mobile APIs will be restricted to automated QA tests that ensure core features remain functional.
For more information about this deprecation, see Meta All In on OpenXR: Deprecates Proprietary APIs.
The Quest system shows an on-screen system keyboard when a text field in Home receives focus. For example, the keyboard overlays Home when a user selects the search option.
You can also display the system keyboard on top of your apps to enable user text entry with this familiar and consistent interface. The keyboard overlay provides the same capabilities as the system keyboard in Home, including voice dictation, smartphone input, and multiple language layouts.
Keyboard Overlay allows immersive apps to request a keyboard, rendered by the VrShell overlay, for text input. To use this feature, Native app developers have three options and workflows:
  • Use reprojected Android UI through Android SurfaceTexture.
  • Use a hidden Android UI and enable the Native app to update from the contents of the hidden field(s).
  • Write a custom wrapper for the Android input stack.
This guide covers the first option: reprojecting Android UI through Android SurfaceTexture, so that Views are rendered to a surface that shares a buffer with an OpenGL texture.

Prerequisites

To use the Keyboard Overlay in Native apps, you must have a Meta Quest or Meta Quest 2 headset running software v27 or higher.

Android Setup

Manifest

The AndroidManifest.xml file must declare the following feature to enable the keyboard overlay.
  <!-- Enable overlay keyboard -->
  <uses-feature
      android:name="oculus.software.overlay_keyboard"
      android:required="false"
      />
The android:required="false" setting ensures that the app will run even if the feature is missing.
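Because the feature is declared optional, an app that may also run on devices without the overlay keyboard can check for it at runtime before relying on keyboard-dependent flows. The following is a minimal sketch using the standard PackageManager.hasSystemFeature API, assuming the check runs inside your Activity:
  // Hypothetical runtime check; the feature name matches the manifest entry above.
  final boolean hasOverlayKeyboard =
      getPackageManager().hasSystemFeature("oculus.software.overlay_keyboard");
  if (!hasOverlayKeyboard) {
    // Fall back to an in-app keyboard or disable text entry.
  }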

MainActivity

To reproject Android UI, you must create a GL_TEXTURE_EXTERNAL_OES texture and store the returned GLuint texture name. The following snippet is written in C, but you can set it up from any OpenGL library that lets you specify the GL_TEXTURE_EXTERNAL_OES target.
  GLuint* t0 = &display->Program.Textures[0];
  GL(glActiveTexture(GL_TEXTURE0));
  GL(glGenTextures(1, t0));
  ALOGV("Texture created, texName %d", *t0);
  GL(glBindTexture(GL_TEXTURE_EXTERNAL_OES, *t0));
  GL(glTexParameteri(GL_TEXTURE_EXTERNAL_OES, GL_TEXTURE_MIN_FILTER, GL_LINEAR));
  GL(glTexParameteri(GL_TEXTURE_EXTERNAL_OES, GL_TEXTURE_MAG_FILTER, GL_LINEAR));
  GL(glTexParameteri(GL_TEXTURE_EXTERNAL_OES, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE));
  GL(glTexParameteri(GL_TEXTURE_EXTERNAL_OES, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE));
  GL(glBindTexture(GL_TEXTURE_EXTERNAL_OES, 0));
Then, create an Android SurfaceTexture constructed with the texture name you stored previously. Draw the Android UI to the SurfaceTexture and sample the texture using the GL_OES_EGL_image_external extension. The following fragment shader demonstrates the sampling.
static const char VIRTUAL_DISPLAY_FRAGMENT_SHADER[] =
    "#extension GL_OES_EGL_image_external : require\n"
    "#extension GL_OES_EGL_image_external_essl3 : require\n"
    "uniform samplerExternalOES Texture0;\n"
    "uniform vec2 CursorUV;\n"
    "in mediump vec2 fragTexCoord;\n"
    "out lowp vec4 outColor;\n"

    "void main() {\n"
    " vec2 invertedYFragTexCoord = vec2(fragTexCoord.x, 1.0 - fragTexCoord.y);\n"
    " outColor = texture(Texture0, invertedYFragTexCoord);\n"

    " float cursorDistance = distance(CursorUV, invertedYFragTexCoord);\n"
    " float textureWidth = float(textureSize(Texture0, 0).x);\n"
    " if (cursorDistance * textureWidth < 2.0) {\n"
    "   outColor = vec4(1.0 - outColor.r, 1.0 - outColor.g, 1.0 - outColor.b, 1.0);\n"
    " }\n"
    "}\n";
Note: This shader also includes support for a cursor calculated from hit-testing the geometry. You must simulate input to the Android UI by injecting MotionEvents based on your preferred input hit testing.
The Surface buffer data must be copied to the OpenGL texture every update/frame using the SurfaceTexture.updateTexImage() method.
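The Java side of this workflow is not shown in the snippets above. Below is a minimal sketch, assuming mTestDialog is the measured and laid-out root View of the hidden UI; the field and method names here are illustrative, not part of the SDK:
  private SurfaceTexture mSurfaceTexture;
  private Surface mSurface;

  // Called (for example, over JNI) with the GL_TEXTURE_EXTERNAL_OES texture
  // name generated in native code.
  public void createUiSurface(final int texName, final int width, final int height) {
    mSurfaceTexture = new SurfaceTexture(texName);
    mSurfaceTexture.setDefaultBufferSize(width, height);
    mSurface = new Surface(mSurfaceTexture);
  }

  // Draws the hidden UI into the Surface that backs the SurfaceTexture.
  public void drawUiToSurface() {
    final Canvas canvas = mSurface.lockCanvas(null);
    try {
      canvas.drawColor(Color.WHITE);
      mTestDialog.draw(canvas);
    } finally {
      mSurface.unlockCanvasAndPost(canvas);
    }
  }

  // Must run once per frame on the thread that owns the GL context to copy
  // the latest Surface buffer into the external OpenGL texture.
  public void updateUiTexture() {
    mSurfaceTexture.updateTexImage();
  }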
Note: SurfaceTexture does not currently support Vulkan; however, you can implement equivalent behavior with VK_ANDROID_external_memory_android_hardware_buffer.

Handle Input

You must simulate input by determining the U and V coordinates of the collision and dispatching MotionEvents to the Window at the top of the Android UI View hierarchy. Below is an example of hit-testing a ray pointing forward from the controller pose provided by VrApi or OpenXR.
static void ovrVirtualDisplay_HandleInput(
    ovrVirtualDisplay* display,
    const ovrRigidBodyPosef* controllerPose) {
  Triangle triangle;
  memcpy(&triangle[0], &ovrGeometry_QuadVertexPositions[0], sizeof(float) * 3); // upper-left
  memcpy(&triangle[1], &ovrGeometry_QuadVertexPositions[1], sizeof(float) * 3); // lower-left
  memcpy(&triangle[2], &ovrGeometry_QuadVertexPositions[3], sizeof(float) * 3); // upper-right

  ovrVector2f* uv = &display->CursorUV;
  IntersectLineQuad(controllerPose, triangle, &display->InstanceTransform, uv);

  display->Java->Env->CallVoidMethod(
      display->Java->ActivityObject, display->OnMoveMethod, uv->x, uv->y);
  JAVA_CHECK_EXCEPTION(display->Java->Env, "failed to trigger onMove method");
}
The IntersectLineQuad call transforms the provided API types into standard 3-vectors: the origin of the ray, the direction the ray points from that origin, and the three vertices of the quad's upper-left triangle. It fills an out parameter with the intersection in barycentric coordinates, which are then converted to UV coordinates using an additional ConvertBarycentricToUV function. The latter takes the shortcut of assuming the geometry the Android UI is drawn on is perpendicular to the Z-axis; you will need to update it to work anywhere in render space.
// adapted from GLM_GTX_intersect IntersectLineTriangle
static bool IntersectLineQuad(
    const MathCommon::Vector3f& orig,
    const MathCommon::Vector3f& dir,
    const TriangleMathCommon& upperLeftTriangle,
    MathCommon::Vector3f& position) {
  // float Epsilon = std::numeric_limits<float>::min();

  const MathCommon::Vector3f& vert0 = upperLeftTriangle[0];
  const MathCommon::Vector3f& vert1 = upperLeftTriangle[1];
  const MathCommon::Vector3f& vert2 = upperLeftTriangle[2];

  MathCommon::Vector3f edge1 = vert1 - vert0;
  MathCommon::Vector3f edge2 = vert2 - vert0;

  MathCommon::Vector3f Perpendicular = dir.Cross(edge2);

  float det = edge1.Dot(Perpendicular);

  // if (det > -Epsilon && det < Epsilon)
  // return false;
  float inv_det = 1.0f / det;

  MathCommon::Vector3f Tengant = orig - vert0;

  position.y = Tengant.Dot(Perpendicular) * inv_det;
  // if (position.y < 0.0f || position.y > 1.0f)
  // return false;

  MathCommon::Vector3f Cotengant = Tengant.Cross(edge1);

  position.z = dir.Dot(Cotengant) * inv_det;
  // if (position.z < 0.0f || position.y + position.z > 1.0f)
  // return false;

  position.x = edge2.Dot(Cotengant) * inv_det;

  return true;
}

// assumes plane is perpendicular to Z
static ovrVector2f ConvertBarycentricToUV(const MathCommon::Vector3f& barycentricCoords3D) {
  MathCommon::Vector2f uv0{0.0f, 0.0f};
  MathCommon::Vector2f uv1{0.0f, 1.0f};
  MathCommon::Vector2f uv2{1.0f, 0.0f};

  MathCommon::Vector2f result{
      uv0.x * barycentricCoords3D.x + uv1.x * barycentricCoords3D.y + uv2.x * barycentricCoords3D.z,
      uv0.y * barycentricCoords3D.x + uv1.y * barycentricCoords3D.y +
          uv2.y * barycentricCoords3D.z};

  return ovrVector2f{result.x, result.y};
}
Simulate ACTION_HOVER_* MotionEvents for motion and use the preferred button input (for example, trigger) to fire simulated ACTION_DOWN and ACTION_UP MotionEvents. The following code demonstrates how to submit simulated input to the Android UI.
  private synchronized void submitMotionEvent(final float u, final float v, final int action) {
    if (!getTestDialogCreated()) {
      return;
    }

    final float width = mTestDialog.getWidth();
    final float height = mTestDialog.getHeight();
    final float majorDimension = width > height ? width : height;
    final float x = u * majorDimension;
    final float y = v * majorDimension;

    final long now = SystemClock.uptimeMillis();
    final MotionEvent event = MotionEvent.obtain(mLastDown, now, action, x, y, 0);
    event.setSource(InputDevice.SOURCE_CLASS_POINTER);

    runOnUiThread(
        new Runnable() {

          @Override
          public void run() {
            mTestDialog.dispatchTouchEvent(event);
            event.recycle();
          }
        });
  }

  public void onTrigger(final float u, final float v, final boolean down) {
    final int action = down ? MotionEvent.ACTION_DOWN : MotionEvent.ACTION_UP;
    if (down) {
      mLastDown = SystemClock.uptimeMillis();
      submitMotionEvent(u, v, action);
    } else {
      submitMotionEvent(u, v, action);
      mLastDown = 0;
    }
  }

  public void onMove(final float u, final float v) {
    if (mLastDown != 0) {
      submitMotionEvent(u, v, MotionEvent.ACTION_MOVE);
    } else {
      submitMotionEvent(u, v, MotionEvent.ACTION_HOVER_MOVE);
    }
  }

Preview Keyboard Overlay

To test or preview the Keyboard Overlay feature, do the following:
  1. Open the app in which you’ve implemented the Keyboard Overlay feature.
  2. Point the cursor at an editable UI text element; the system Keyboard Overlay appears.