Rendering and tracking are provided within Unity via the MetaCameraRig prefab, which uses the Meta Compositor for rendering. The prefab contains the cameras, script components, and other GameObjects that the Meta 2 headset needs to view and interact with content.

Your Unity scene must contain exactly one active instance of the MetaCameraRig prefab in order for rendering and tracking to work.

MetaCameraRig

To add the MetaCameraRig to your scene, follow these instructions.

By default, the Meta camera position is reset to (0, 0, 0) whenever you run the scene. To pre-position the camera at a location other than the origin, you must nest the MetaCameraRig under another GameObject. Follow these instructions.
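As an illustration, a minimal sketch of such a parent object is shown below. The script and field names are only examples; it uses standard Unity APIs only.

    using UnityEngine;

    // Example only: attach this to an empty parent GameObject (e.g. "CameraOffset")
    // and nest the MetaCameraRig under it in the Hierarchy. The rig still resets to
    // its local origin at runtime, but it inherits this parent's transform, so the
    // user effectively starts at the parent's position and orientation.
    public class CameraStartPose : MonoBehaviour
    {
        [SerializeField] private Vector3 startPosition = new Vector3(0f, 0f, -2f);
        [SerializeField] private Vector3 startEulerAngles = Vector3.zero;

        private void Awake()
        {
            // Apply the desired starting pose to the parent before the rig resets itself.
            transform.position = startPosition;
            transform.rotation = Quaternion.Euler(startEulerAngles);
        }
    }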

MetaCameraRig Components

The following is a summary of the primary components of the MetaCameraRig that a developer might wish to modify. Advanced developers are encouraged to explore the other components for a fuller understanding of how the MetaCameraRig functions.

Tracking

The root of the MetaCameraRig contains events for the SLAM system lifecycle. See here for more information about SLAM.

Slam Localizer
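A minimal sketch of a listener for these lifecycle events is shown below. It deliberately avoids naming specific SDK events, since their names vary by SDK version; wire the public methods to the SLAM UnityEvents exposed on the MetaCameraRig root via the Inspector.

    using UnityEngine;

    // Sketch only: public methods intended to be wired (via the Inspector) to the
    // SLAM lifecycle UnityEvents exposed on the root of the MetaCameraRig.
    public class SlamLifecycleLogger : MonoBehaviour
    {
        // Hook this to the event fired when SLAM mapping/initialization completes.
        public void OnSlamReady()
        {
            Debug.Log("SLAM initialized; it is now safe to anchor content.");
        }

        // Hook this to the event fired when SLAM loses tracking or resets.
        public void OnSlamLost()
        {
            Debug.LogWarning("SLAM tracking lost or reset; content may drift until tracking recovers.");
        }
    }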

Compositor

The root of the MetaCameraRig also provides controls for the Meta Compositor.

From this component, you can adjust the 2D Warp and 3D Warp systems, which use predictive techniques to reduce the perceived latency of rendering as you move your head through the scene.

You can also toggle depth occlusion. When depth occlusion is enabled, real-world objects detected by the Meta 2’s depth sensor occlude AR objects that are spatially behind (i.e. “blocked by”) them. Turning this feature off prevents virtual objects from being occluded by real-world objects.

Note: With depth occlusion disabled, AR objects that are spatially behind physical objects still appear “on top of” those objects. This produces mismatched depth cues for the user’s visual system, which can be distracting.

Meta Compositor
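If you prefer to change these settings from script rather than the Inspector, a hedged sketch is shown below. The MetaCompositor type name, its namespace, and the property names are assumptions; check the Meta Compositor component in your SDK version for the actual fields.

    using UnityEngine;
    using Meta; // namespace assumed; adjust to match your SDK version

    // Sketch only: toggles Meta Compositor settings at runtime. The property names
    // below (Enable2DWarp, Enable3DWarp, EnableDepthOcclusion) are hypothetical and
    // should be checked against the actual Meta Compositor component.
    public class CompositorSettingsToggle : MonoBehaviour
    {
        [SerializeField] private MetaCompositor compositor; // assign the MetaCameraRig root

        public void SetDepthOcclusion(bool enabled)
        {
            compositor.EnableDepthOcclusion = enabled; // hypothetical property
        }

        public void SetWarp(bool enable2DWarp, bool enable3DWarp)
        {
            compositor.Enable2DWarp = enable2DWarp; // hypothetical property
            compositor.Enable3DWarp = enable3DWarp; // hypothetical property
        }
    }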

Primary Cameras

The MetaCameraRig employs several cameras. The LeftCamera and RightCamera objects are used much like a standard Unity Camera in a typical Unity scene.

Note: The near and far clipping plane values on these cameras are not used by the Compositor for rendering. Instead, adjust the clipping plane distances within the Compositor using the sliders provided on the Meta Compositor component (see here for details).
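If you need to change the clipping distances from script, a hedged sketch follows. The property names are assumptions; the sliders on the Meta Compositor component remain the supported path.

    using UnityEngine;
    using Meta; // namespace assumed; adjust to match your SDK version

    // Sketch only: sets the Compositor's clipping planes instead of the Unity Camera
    // values, which the Compositor ignores. NearClippingPlane and FarClippingPlane
    // are hypothetical property names; verify them on the Meta Compositor component.
    public class CompositorClippingPlanes : MonoBehaviour
    {
        [SerializeField] private MetaCompositor compositor;
        [SerializeField] private float nearDistance = 0.1f;
        [SerializeField] private float farDistance = 20f;

        private void Start()
        {
            compositor.NearClippingPlane = nearDistance; // hypothetical property
            compositor.FarClippingPlane = farDistance;   // hypothetical property
        }
    }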

VirtualWebCam

The VirtualWebCam allows you to see (and record) a simulation of what the user is seeing through the Meta 2. It combines the video feed from the Meta 2’s forward-facing color camera with the virtual scene that is being rendered.

Meta Webcam Component

See Meta Webcam (Meta Mirroring) for more information on how to record video from the Meta 2, and see here for developer information on the Virtual Webcam, including how to disable it.

Warning: To disable the Virtual Webcam, uncheck the checkbox on the Meta Compositor component at the root of the MetaCameraRig. Do not disable the VirtualWebcam GameObject itself.
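Disabling it from script would look something like the hedged sketch below. The property name is an assumption; the key point is that the toggle lives on the Meta Compositor component, not on the VirtualWebcam GameObject.

    using UnityEngine;
    using Meta; // namespace assumed; adjust to match your SDK version

    // Sketch only: disables the Virtual Webcam the supported way, via the Meta
    // Compositor component on the MetaCameraRig root. "EnableWebcam" is a
    // hypothetical property name; check the component for the actual checkbox.
    public class DisableVirtualWebcam : MonoBehaviour
    {
        [SerializeField] private MetaCompositor compositor;

        private void Start()
        {
            compositor.EnableWebcam = false; // hypothetical property

            // Do NOT deactivate the GameObject itself, e.g.:
            // transform.Find("VirtualWebcam").gameObject.SetActive(false);
        }
    }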

Tracking

You can also control whether SLAM starts in limited (rotation-only) or full tracking mode. See here for details.

You can also control whether SLAM attempts to reinitialize based on data from a previous launch of the application. See here for details.

Surface Reconstruction

You can enable surface reconstruction by checking the Surface Reconstruction Active checkbox on the Environmental Configuration script component attached to the EnvironmentalInitialization GameObject, which is a child of the MetaCameraRig. See here for details.
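For completeness, a hedged sketch of enabling this from script is shown below. The EnvironmentalConfiguration type and its SurfaceReconstructionActive field are assumptions based on the Inspector labels above; the usual workflow is simply to tick the checkbox on the EnvironmentalInitialization child in the editor.

    using UnityEngine;
    using Meta; // namespace assumed; adjust to match your SDK version

    // Sketch only: attach to the MetaCameraRig root to enable surface
    // reconstruction at startup. Type and field names below are hypothetical.
    public class EnableSurfaceReconstruction : MonoBehaviour
    {
        private void Awake()
        {
            var config = GetComponentInChildren<EnvironmentalConfiguration>(); // hypothetical type
            if (config != null)
            {
                config.SurfaceReconstructionActive = true; // hypothetical field
            }
        }
    }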