
Transitioning to 3D development

Updated: Apr 2, 2026
This guide will help you transition from 2D development to 3D VR development with Spatial SDK. It provides an overview of concepts unique to VR and frequently used APIs.

Key terms for Android developers

If you’re new to VR development, here are the most important terms you’ll encounter:
  • Pose: A combination of a position (Vector3 for x, y, z coordinates) and a rotation (Quaternion for w, x, y, z). Poses describe where something is and which direction it faces in 3D space.
  • Quaternion: A mathematical representation of 3D rotation that avoids gimbal lock (a problem where you lose a degree of rotational freedom). You don’t need to understand the math — use helper methods like Quaternion.lookRotation() to create them.
  • Reference space: Defines the origin point for tracking. LOCAL_FLOOR places the origin at floor level beneath the user’s initial position, which is the most common choice.
  • Entity: A lightweight identifier that groups related components together in the ECS architecture. See ECS overview.
  • Component: A data container attached to an entity (for example, Transform, Mesh, Material). Components define what an entity is, not what it does.
  • System: Code that runs each frame and processes entities with specific components. Systems define behavior.
  • Panel: A 2D UI surface (built with Android Views or Jetpack Compose) that is positioned in 3D space.
  • Passthrough: A mode where the headset’s cameras show the real world with virtual content overlaid on top, creating a mixed reality experience.
  • IBL (Image-Based Lighting): A technique that uses an environment map image to light the scene realistically, simulating how light bounces off surfaces in the real world.
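To make the pose and quaternion terms concrete, here is a minimal, self-contained sketch. The Vector3, Quaternion, and Pose classes below are simplified stand-ins for the SDK's types, and yawRotation is a hypothetical helper for illustration (with the SDK you would typically use a helper like Quaternion.lookRotation() instead):

```kotlin
import kotlin.math.cos
import kotlin.math.sin

// Simplified stand-ins for the SDK's Vector3, Quaternion, and Pose types.
data class Vector3(val x: Float, val y: Float, val z: Float)
data class Quaternion(val w: Float, val x: Float, val y: Float, val z: Float)
data class Pose(val position: Vector3, val rotation: Quaternion)

// Hypothetical helper: a quaternion for a rotation of angleRad radians
// around the y (up) axis.
fun yawRotation(angleRad: Float): Quaternion {
    val half = angleRad / 2f
    return Quaternion(cos(half), 0f, sin(half), 0f)
}

fun main() {
    // A pose 2 m in front of the origin at head height,
    // rotated 180 degrees to face back toward the origin.
    val pose = Pose(
        position = Vector3(0f, 1.5f, 2f),
        rotation = yawRotation(Math.PI.toFloat()),
    )
    println(pose.position.z)  // 2.0
}
```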

The 3D environment

In VR, the immersive scene is the 3D space the player can see and interact with while wearing the headset. In Spatial SDK, you create and edit scenes using Spatial Editor. A scene can either completely hide the user’s real-world surroundings, or use passthrough mode to keep those surroundings visible while overlaying virtual content onto them.

Responding to the user’s actions

In 2D development, the user interacts with your app through a physical screen. The screen detects and handles user input, and the app responds accordingly. Panels in 3D space fulfill the same purpose in VR, but there is an additional point of interaction: the motion of the user. Because the person wearing the headset can freely look and move around the immersive scene, you need to know where they’re looking, where their head, body, and hands are in the world, and what action they’re performing. Spatial SDK has multiple APIs to provide that information. Tracking data is often provided as a pose: the combination of a position (x, y, z) and a quaternion (w, x, y, z) for rotation.
  • Controller API: Provides tracking data about both physical controllers and hands.
  • AvatarBody API and AvatarAttachment: Provides body tracking data.
    This code sample uses the Controller and AvatarBody APIs to detect when the user presses or releases the menu button on their left controller.
      val leftController =
          Query.where { has(AvatarBody.id) }
              .filter { isLocal() and by(AvatarBody.isPlayerControlledData).isEqualTo(true) }
              .eval()
              .first()
              .getComponent<AvatarBody>()
              .leftHand
              .tryGetComponent<Controller>()
    
      if (leftController != null && leftController.getPressed(ButtonBits.ButtonMenu)) {
          onMenuButtonPressed()
      }
    
    
      if (leftController != null && leftController.getReleased(ButtonBits.ButtonMenu)) {
          Log.i("MrukInputSystem", "Menu button released")
      }
    
      /** Returns true if the button was pressed this frame. */
      fun Controller.getPressed(button: Int): Boolean {
          return (buttonState and changedButtons and button) != 0
      }
    
      /** Returns true if the button was released this frame. */
      fun Controller.getReleased(button: Int): Boolean {
          return (buttonState and button) == 0 && (changedButtons and button) != 0
      }
    
  • Scene API: Provides controller and head pose data as well as multiple methods to affect the overall scene.
    This code sample uses the Scene API to adjust the environment and move the user to the world origin.
      class MyActivity : AppSystemActivity() {
          override fun onSceneReady() {
              super.onSceneReady()
    
              // Set the reference space for tracking
              scene.setReferenceSpace(ReferenceSpace.LOCAL_FLOOR)
    
            // Check whether system passthrough is currently enabled (MR mode)
            val isMrMode = scene.isSystemPassthroughEnabled()
    
            // Set the view origin position
            scene.setViewOrigin(0.0f, 0.0f, 2.0f, 0.0f)
    
              // Update environment lighting
              scene.updateIBLEnvironment("chromatic.env")
          }
      }
    
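To see why the getPressed and getReleased helpers in the Controller example fire on exactly one frame, here is the same bitmask logic with plain Ints. The BUTTON_MENU bit value here is illustrative, not the SDK’s actual ButtonBits constant:

```kotlin
// Plain-Int sketch of the edge-detection logic above. The bit value for
// the menu button (0x1) is illustrative, not the SDK's actual constant.
const val BUTTON_MENU = 0x1

// buttonState: bits currently held down.
// changedButtons: bits that flipped between last frame and this frame.
fun pressedThisFrame(buttonState: Int, changedButtons: Int, button: Int) =
    (buttonState and changedButtons and button) != 0

fun releasedThisFrame(buttonState: Int, changedButtons: Int, button: Int) =
    (buttonState and button) == 0 && (changedButtons and button) != 0

fun main() {
    // Frame 1: menu goes down -> bit set in both state and changed.
    println(pressedThisFrame(0x1, 0x1, BUTTON_MENU))   // true
    // Frame 2: still held -> changed bit cleared, so no new press.
    println(pressedThisFrame(0x1, 0x0, BUTTON_MENU))   // false
    // Frame 3: released -> bit cleared in state, set in changed.
    println(releasedThisFrame(0x0, 0x1, BUTTON_MENU))  // true
}
```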

Make your UIs respond to user movement

On a mobile device, your app’s UI is always within easy reach: to use an app, the user simply looks at the device’s screen. An immersive scene, however, is in 3D, and the user can move and look around it freely. Since your app’s UI appears as a flat panel fixed in the immersive scene, users might walk past it or lose sight of it. To address this, you can track the user’s head pose and reposition the UI to keep it within their field of view.
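One common approach is a follow behavior: each frame, compute a target point a fixed distance in front of the head and ease the panel toward it rather than snapping. The sketch below shows only the math, using an illustrative Vector3 type and smoothing factor instead of the SDK’s own classes:

```kotlin
// Illustrative stand-in for the SDK's Vector3 type.
data class Vector3(val x: Float, val y: Float, val z: Float) {
    operator fun plus(o: Vector3) = Vector3(x + o.x, y + o.y, z + o.z)
    operator fun minus(o: Vector3) = Vector3(x - o.x, y - o.y, z - o.z)
    operator fun times(s: Float) = Vector3(x * s, y * s, z * s)
}

// Target point: `distance` meters along the head's forward direction.
fun panelTarget(headPosition: Vector3, headForward: Vector3, distance: Float) =
    headPosition + headForward * distance

// Exponential smoothing so the panel eases toward the target each frame
// instead of jumping; `factor` (0..1) controls how quickly it catches up.
fun smoothTowards(current: Vector3, target: Vector3, factor: Float) =
    current + (target - current) * factor

fun main() {
    val head = Vector3(0f, 1.6f, 0f)
    val forward = Vector3(0f, 0f, -1f)  // looking down the -z axis
    val target = panelTarget(head, forward, 1.5f)
    println(target)  // Vector3(x=0.0, y=1.6, z=-1.5)
    // One smoothing step moves the panel halfway to the target.
    println(smoothTowards(Vector3(0f, 1.6f, 0f), target, 0.5f))
}
```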

Getting head position

This function retrieves the user’s head pose. It queries for the local avatar attachment entity whose type is “head” and returns that entity’s current position and rotation as a Pose.
// Function to get the pose of the user's head
fun getHeadPose(): Pose {
  val head =
      Query.where { has(AvatarAttachment.id) }
          .eval()
          .filter { it.isLocal() && it.getComponent<AvatarAttachment>().type == "head" }
          .first()
  return head.getComponent<Transform>().transform
}