Overview

Note: Internally, we use a right-handed coordinate system. In this documentation we use a left-handed coordinate system, since most Unity APIs report poses in a left-handed coordinate frame. We plan to clarify this in future releases.

The Meta 2 device has various sensors and algorithms that require the use of different coordinate frames. Below we explain the frames that are exposed through the SDK. Understanding them is not necessary for most developers, but it is relevant for those who wish to access raw sensor data or augment the Meta 2 with additional tracking or sensor devices.

Figures

The figures below are shown in a left-handed coordinate frame (as they would look in Unity).
The measurements provided from the device origin to the depth sensor are truncated; more precise measurements can be retrieved via the transform API.

[Figure: side view]

[Figure: front view]

Device Origin

The device origin is located roughly between the user’s eyes; see the figures above. This corresponds to the origin of the MetaCameraRig.

Eyes

The default inter-pupillary distance (IPD) is 63mm. The left eye frame is a translation from the device origin by <-31.5mm, 1mm, -5mm> and the right eye frame is a translation by <31.5mm, 1mm, -5mm>. By default, the eyes are rotated 15 degrees about the X axis (looking downwards).
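For illustration, these defaults can be written out with Unity's Vector3 and Quaternion types (a sketch only: the values restate the millimeter offsets above in Unity's meter units, and the variable names are hypothetical):

// Default eye offsets from the device origin, in Unity's left-handed frame (meters).
var leftEyeOffset  = new Vector3(-0.0315f, 0.001f, -0.005f); // <-31.5mm, 1mm, -5mm>
var rightEyeOffset = new Vector3( 0.0315f, 0.001f, -0.005f); //  <31.5mm, 1mm, -5mm>

// Both eyes are pitched 15 degrees downwards about the X axis by default.
var defaultEyeRotation = Quaternion.Euler(15f, 0f, 0f);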

The Eye Calibration Application updates the default eye positions to the positions that result from the calibration procedure. These are applied in the Compositor. Currently, the calibration application only accounts for your IPD; it does not adjust the eye positions along the Y or Z axes.

World

The world frame is a fixed frame with gravity pointing down along the Y axis (in the -Y direction). Its origin corresponds to your position when you hold your head still at the end of SLAM initialization.

Our tracking system estimates your position and rotation with respect to this fixed frame, and the MetaCameraRig’s position and orientation are tracked within it.
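Since the rig is tracked within the world frame, its pose can be read like any other Unity transform. A minimal sketch (looking the rig up by name is for illustration only; in practice you would hold a reference to it):

// Sketch: reading the MetaCameraRig's tracked pose in the world frame.
var rig = GameObject.Find("MetaCameraRig");
var worldPosition = rig.transform.position; // position in the world frame (meters)
var worldRotation = rig.transform.rotation; // orientation in the world frame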

Sensors

Sensors exposed through our APIs report their poses with respect to the world frame at the specific time the sensor data was captured. This helps to correct for the latency of the captured data. See sensor data for more details.

You can also look up transforms between the sensors and coordinate frames using the transform API (e.g. device origin to depth).

Transform API

Note: The interop function in Meta.Interop.MetaCoreInterop reports a right-handed matrix; the Meta.Plugin.SystemApi wrapper GetTransform() performs the necessary conversion to Unity’s (left-handed) convention for you.

The GetTransform() API allows you to retrieve poses between different coordinate frames in our system. When asking for poses from/to the WORLD coordinate frame, you will receive the latest tracking data from our tracking system.

Example

// Retrieve the transform from the device origin frame to the depth sensor frame.
var transform = Matrix4x4.identity;
var success = Plugin.SystemApi.GetTransform(Interop.MetaCoreInterop.MetaCoordinateFrame.DEPTH,         // to
                                            Interop.MetaCoreInterop.MetaCoordinateFrame.DEVICE_ORIGIN, // from
                                            ref transform);
// 'success' is false if the transform could not be retrieved.

For a full example, see MetaApplyTransform.cs. This script can be attached to a game object; it allows you to transform that object from one frame to another.
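As described above, a query involving the WORLD frame returns the latest tracking data. Below is a sketch of retrieving the device origin’s current pose in the world frame; it assumes the enum exposes a WORLD value corresponding to the world frame described earlier:

// Sketch: latest tracked pose of the device origin, expressed in the world frame.
var headPose = Matrix4x4.identity;
var ok = Plugin.SystemApi.GetTransform(Interop.MetaCoreInterop.MetaCoordinateFrame.WORLD,         // to
                                       Interop.MetaCoreInterop.MetaCoordinateFrame.DEVICE_ORIGIN, // from
                                       ref headPose);
if (ok)
{
    Vector3 position = headPose.GetColumn(3); // translation column: device origin position in the world
    Quaternion rotation = Quaternion.LookRotation(headPose.GetColumn(2), headPose.GetColumn(1)); // from basis vectors
}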