This sample demonstrates how to apply custom GPU shader effects to the passthrough camera feed by using the GPU texture provided by the Passthrough Camera API as input to a fragment shader. It implements a water surface effect that vertically mirrors the camera feed, distorts it with animated waves, and blends it with caustic textures and color tinting. This is useful for developers who want to apply real-time post-processing or visual effects to the camera feed rather than simply displaying or analyzing it.
Retrieve the GPU texture from the Passthrough Camera API for use in custom shaders
Implement a traditional Cg/HLSL vertex/fragment shader that processes the passthrough camera texture
Use normal maps and time-based UV distortion to create animated visual effects
Properly handle stereo rendering for Quest headsets in custom shaders
Blend multiple textures (camera feed, normal map, detail map) to create composite effects
Requirements
Quest 3 or Quest 3S headset
Unity development environment configured for Quest
Physical headset or Meta Horizon Link v2.1+ (XR Simulator is not supported)
For Unity versions, SDK dependencies, and complete build prerequisites, see the sample README. For development environment setup, see the Passthrough Camera API Getting Started Guide.
Get started
Clone or download the Unity-PassthroughCameraApiSamples repository from GitHub. Open the project in Unity, navigate to Assets/PassthroughCameraApiSamples/ShaderSample/, and open the ShaderSample.unity scene. Build and deploy to your Quest 3/3S device using the instructions in the README. When you run the sample, grant camera access permission when prompted; you should then see a water surface effect applied to the passthrough camera feed.
Explore the sample
The sample consists of a scene with a water plane, a custom shader, and a manager script that connects the passthrough camera texture to the shader.
| File / Scene | What it demonstrates | Key concepts |
| --- | --- | --- |
| ShaderSample.unity | Complete scene setup with camera rig, passthrough configuration, and water surface | Scene composition, passthrough building blocks, spatial placement |
| Scripts/ShaderSampleManager.cs | Runtime connection between camera feed and shader material; shared prefab containing the PassthroughCameraAccess component | MRUK integration, camera configuration (left camera, 1280x960) |
Runtime behavior
When you run the sample on Quest 3/3S, a permission dialog appears requesting headset camera access. Once you grant permission, the debug text changes from “No permission granted.” to “Permission granted.” The passthrough camera feed starts streaming, and you see the real environment around you with a flat water surface plane positioned 4 meters below you. Four cube borders form a pool shape around the plane. On the plane surface, the live passthrough camera feed renders as a vertical mirror reflection. The effect includes animated wave distortion, cyan color tinting, caustic texture overlay, and increased brightness — simulating looking down at a reflective water surface.
Key concepts
Retrieving the GPU texture
Notice how the sample obtains the GPU texture from the PassthroughCameraAccess component (provided by MRUK) and assigns it to the material in ShaderSampleManager.cs. The script waits for m_cameraAccess.IsPlaying to return true, then assigns the texture once; the GPU texture handle updates its contents each frame automatically, with no additional SetTexture calls required. See the Passthrough Camera API documentation for details on PassthroughCameraAccess.
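The assign-once pattern can be sketched as follows. This is an illustrative reconstruction, not the sample's actual source: the `GetTexture()` accessor and field names other than `m_cameraAccess`/`IsPlaying` are assumptions.

```csharp
using UnityEngine;

// Hypothetical sketch of the manager script's texture hookup.
public class ShaderSampleManagerSketch : MonoBehaviour
{
    [SerializeField] private PassthroughCameraAccess m_cameraAccess; // MRUK component
    [SerializeField] private Renderer m_waterRenderer;               // plane with the water material
    private bool m_textureAssigned;

    private void Update()
    {
        // Wait until the camera feed is streaming, then bind the GPU texture once.
        if (!m_textureAssigned && m_cameraAccess.IsPlaying)
        {
            // The GPU texture handle refreshes its contents every frame,
            // so a single assignment is sufficient.
            m_waterRenderer.material.mainTexture = m_cameraAccess.GetTexture(); // hypothetical accessor
            m_textureAssigned = true;
        }
    }
}
```

Binding the texture once (rather than every frame) works because the handle is stable; only its contents change as new camera frames arrive.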
UV distortion with normal maps
The fragment shader in ShaderSampleWater.shader samples a normal map texture with UV coordinates animated by _SinTime.y, then uses the normal map’s XY channels to offset the camera texture UVs — creating a scrolling wave distortion effect. This pattern decouples the distortion animation from the camera feed, allowing the waves to move independently of the reflected image.
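The distortion step might look like the following fragment-shader sketch. Property names such as `_MainTex` and `_NormalMap` are assumptions; `_WaveAmplitude`, `_NormalOffsetX`, and `_NormalOffsetY` correspond to the material parameters mentioned later in this page, and `_SinTime` is Unity's built-in time variable.

```hlsl
// Illustrative sketch, not the shader's actual source.
fixed4 frag (v2f i) : SV_Target
{
    // Scroll the normal-map UVs over time to animate the waves,
    // independently of the camera feed.
    float2 waveUV = i.uv + float2(_NormalOffsetX, _NormalOffsetY) * _SinTime.y;
    float3 normal = UnpackNormal(tex2D(_NormalMap, waveUV));

    // Use the normal's XY channels to push the camera-feed UVs around.
    float2 distortedUV = i.uv + normal.xy * _WaveAmplitude;
    return tex2D(_MainTex, distortedUV);
}
```

Because the scroll offset is applied to the normal-map lookup rather than the camera texture itself, the wave motion and the reflected image stay decoupled.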
Vertical mirroring for reflection
After distorting the UVs, the shader inverts the vertical coordinate (distortedUV.y = 1 - distortedUV.y) to flip the camera image upside down, simulating a reflection on a water surface. This simple transformation combined with the wave distortion creates a convincing reflection effect without requiring ray tracing or environment probes.
Stereo rendering support
The shader includes Unity’s stereo rendering macros (UNITY_SETUP_INSTANCE_ID, UNITY_INITIALIZE_VERTEX_OUTPUT_STEREO, UNITY_VERTEX_OUTPUT_STEREO, UNITY_SETUP_STEREO_EYE_INDEX_POST_VERTEX) to ensure the camera texture renders correctly for both left and right eye views on Quest headsets. This is essential for custom shaders processing passthrough camera content. See ShaderSampleWater.shader for the complete stereo implementation.
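A minimal skeleton showing where these macros belong, assuming a built-in render pipeline shader (`appdata`/`v2f` struct names and `_MainTex` are illustrative; see ShaderSampleWater.shader for the real implementation):

```hlsl
struct appdata
{
    float4 vertex : POSITION;
    float2 uv : TEXCOORD0;
    UNITY_VERTEX_INPUT_INSTANCE_ID
};

struct v2f
{
    float4 pos : SV_POSITION;
    float2 uv : TEXCOORD0;
    UNITY_VERTEX_OUTPUT_STEREO   // carries the per-eye index to the fragment stage
};

v2f vert (appdata v)
{
    v2f o;
    UNITY_SETUP_INSTANCE_ID(v);
    UNITY_INITIALIZE_VERTEX_OUTPUT_STEREO(o);
    o.pos = UnityObjectToClipPos(v.vertex);
    o.uv = v.uv;
    return o;
}

fixed4 frag (v2f i) : SV_Target
{
    // Select the correct eye index before any per-eye texture sampling.
    UNITY_SETUP_STEREO_EYE_INDEX_POST_VERTEX(i);
    return tex2D(_MainTex, i.uv);
}
```

Omitting these macros typically results in the effect rendering in only one eye, or both eyes sampling the same view, under single-pass instanced stereo rendering.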
Extend the sample
Modify the shader parameters (_WaveAmplitude, _NormalOffsetX, _NormalOffsetY, _ReflectIntensity) in the material to experiment with different water effects — try faster waves, stronger distortion, or different color tints
Replace the water normal map and detail textures with other textures to create different visual effects (frosted glass, heat distortion, kaleidoscope patterns)
Combine techniques from the CameraToWorld sample to align the shader effect with specific real-world surfaces using camera intrinsics and extrinsics
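Shader parameters can also be adjusted at runtime instead of in the material inspector. A hypothetical sketch, assuming the water plane's Renderer is accessible; the property names come from the parameter list above:

```csharp
using UnityEngine;

// Illustrative runtime tweak of the water material's parameters.
public class WaterTweaks : MonoBehaviour
{
    private void Start()
    {
        // Note: accessing .material instantiates a per-renderer copy,
        // so these changes do not affect the shared material asset.
        var mat = GetComponent<Renderer>().material;
        mat.SetFloat("_WaveAmplitude", 0.05f);
        mat.SetFloat("_ReflectIntensity", 1.5f);
    }
}
```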