This sample demonstrates how to load a 3D scene from Meta Spatial Editor, spawn interactive 3D objects at runtime, and enable physics-based interactions using simple controller input. It shows how to combine scene-based authoring with runtime spawning and Compose UI in a single mixed reality application.
Clone the Meta Spatial SDK Samples repository and open the Object3DSample directory in Android Studio. Build the project and deploy to your Meta Quest device using the standard Android development workflow. For build configuration details, see the sample’s README.
Explore the sample
The sample uses a hybrid authoring approach: the 3D environment and object templates live in Meta Spatial Editor, while spawning logic and UI live in Kotlin code.
| File / Scene | What it demonstrates | Key concepts |
| --- | --- | --- |
| Object3DSampleActivity.kt | Main activity that loads the scene and configures features | glXF scene loading, feature configuration |
When you run the app, you see a virtual collaboration room with a floating panel titled Object Library at chest height. The panel displays a 2x3 grid of thumbnails showing six objects: a robot, a drone, a plant, a desk lamp, an easy chair, and a sculpture. When you point your controller at a thumbnail and click, a copy of that object appears with an overshoot scale-up animation, then falls under gravity and bounces off the room’s surfaces. You can grab spawned objects with your controller (rotation constrained to the Y-axis), and the drone’s propellers spin continuously. Multiple copies of each object can be spawned.
Key concepts
Loading scenes from Meta Spatial Editor
The sample loads a composition exported from Meta Spatial Editor using the glXF format. The activity calls glXFManager.inflateGLXF() with the scene URI and a root entity, then extracts named entities from the loaded composition in the onLoaded callback:
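In outline, the call might look like the following sketch; the scene URI, node name, and callback shape here are illustrative placeholders, not identifiers copied from the sample:

```kotlin
// Sketch (assumed names): inflate the Spatial Editor composition and pull out
// a named entity once loading finishes. "scenes/Composition.glxf" and
// "Environment" are placeholders, not the sample's actual identifiers.
glXFManager.inflateGLXF(
    Uri.parse("apk:///scenes/Composition.glxf"),
    rootEntity = Entity.create(listOf(Transform())),
    onLoaded = { composition ->
      val environment = composition.getNodeByName("Environment").entity
    },
    keyName = "mainScene")
```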
The sample uses scene entities as templates rather than loading 3D files at runtime. When you click a thumbnail, setUpButton() reads the mesh URI and scale from the template entity using getComponent<Mesh>() and getComponent<Scale>(), then creates a new entity with additional components for physics and interaction. This separates asset authoring (in Meta Spatial Editor) from runtime behavior (in Kotlin).
The sample creates entities dynamically using Entity.create() with a list of components. Each spawned object gets a Mesh, Grabbable, Scale, Physics, and Transform component:
```kotlin
Physics(shape = "box", density = 0.1f,
        state = PhysicsState.DYNAMIC, dimensions = dimensions)
    .applyMaterial(PhysicsMaterial.WOOD)
```
Notice how the sample sets per-object collision dimensions and applies PhysicsMaterial.WOOD for realistic surface interactions. For more on 3D objects, see the concept reference.
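Putting the template read and the spawn together, a helper based on the fragments above might look like this; `spawnObject`, `spawnPose`, and the exact component fields are assumptions for illustration, not code lifted from the sample:

```kotlin
// Hypothetical helper: clone a Spatial Editor template entity into a new
// grabbable, physics-enabled object. Names and fields are illustrative.
fun spawnObject(template: Entity, spawnPose: Pose, dimensions: Vector3): Entity {
  val meshUri = template.getComponent<Mesh>().mesh   // asset authored in Spatial Editor
  val scale = template.getComponent<Scale>().scale
  return Entity.create(
      listOf(
          Mesh(meshUri),
          Grabbable(),
          Scale(scale),
          Physics(shape = "box", density = 0.1f,
                  state = PhysicsState.DYNAMIC, dimensions = dimensions)
              .applyMaterial(PhysicsMaterial.WOOD),
          Transform(spawnPose)))
}
```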
Compose UI panels in 3D space
The sample bridges scene-based panel placement with Compose content. The panel entity is positioned in the Meta Spatial Editor scene with a Panel component referencing @id/library_panel, while the Kotlin code registers the matching Compose content using ComposeViewPanelRegistration. The panel uses SpatialTheme from the Spatial SDK UI Set to match Horizon OS design guidelines.
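As a rough sketch of that wiring (the ComposeViewPanelRegistration constructor arguments and the ObjectLibraryGrid composable are assumptions, not the sample's actual code):

```kotlin
// Sketch: register Compose content for the panel entity authored in
// Meta Spatial Editor under @id/library_panel. The registration arguments
// and ObjectLibraryGrid are illustrative assumptions.
override fun registerPanels(): List<PanelRegistration> =
    listOf(
        ComposeViewPanelRegistration(R.id.library_panel) {
          SpatialTheme {
            ObjectLibraryGrid(onItemClick = { template -> spawnObject(template) })
          }
        })
```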
The drone’s propeller animation is authored in Meta Spatial Editor and exported as part of the glTF asset. At runtime, the sample adds an Animated component to trigger continuous playback:
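A minimal version of that trigger might look like the following; the Animated field names and enum values are assumptions based on the SDK's component model, not code from the sample:

```kotlin
// Sketch: start looping playback of the propeller animation track that was
// authored in Spatial Editor and exported with the glTF. Fields are assumed.
drone.setComponent(
    Animated(
        startTime = System.currentTimeMillis(),
        playbackState = PlaybackState.PLAYING,
        playbackType = PlaybackType.LOOP))
```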
Experiment with physics: Modify the physics properties in PanelLayout.kt to try different materials (such as metal or rubber), densities, or collision shapes. You can also edit physics properties directly in the Meta Spatial Editor project.
Add new objects: Author new 3D objects in Meta Spatial Editor, add them to the composition scene, and reference them in the panel grid.
Switch to ISDK interactions: Replace the simple controller input with Interaction SDK hand and controller interactions. See the Object3DSampleIsdk sample for a reference implementation.
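For the physics experiment above, a starting point could be a one-line change to the Physics setup; note that PhysicsMaterial.METAL is an assumed enum value alongside the WOOD value shown earlier:

```kotlin
// Illustrative tweak: denser objects with a harder surface material.
Physics(shape = "box", density = 0.5f,
        state = PhysicsState.DYNAMIC, dimensions = dimensions)
    .applyMaterial(PhysicsMaterial.METAL)  // METAL is an assumed enum value
```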