This page provides information about the Tracking tab in Chaos Arena.
Camera tracking is configured by selecting a protocol and setting additional protocol-specific parameters. There are also tracking offsets that specify the center point of the tracking system and the center point of the tracker itself.
The currently supported protocols include stYpe, EZtrack, Mo-Sys, Vicon, OptiTrack, Vive Mars, and LONET 2.
Apart from the 3D camera position and orientation, FOV tracking is also supported for stYpe and EZtrack. It can change the inner frustum size dynamically, according to the current focal length of the camera lens.
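For reference, the inner frustum's field of view follows from the lens focal length via the standard pinhole relation. A minimal sketch of that relation in Python (the sensor width here is a hypothetical example value, not something Arena prescribes):

```python
import math

def horizontal_fov_deg(focal_length_mm: float, sensor_width_mm: float) -> float:
    """Horizontal FOV of a pinhole camera for a given focal length."""
    return math.degrees(2.0 * math.atan(sensor_width_mm / (2.0 * focal_length_mm)))

# Example: a 24 mm lens on a hypothetical 36 mm-wide (full-frame) sensor.
print(horizontal_fov_deg(24.0, 36.0))  # ~73.7 degrees
```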
By default, the tracking origin in the scene is the pivot point of the projection geometry. The actual center point on the stage, as specified in the tracking system, may be different. The difference is defined by the Tracking Origin Offset parameters in the configuration. Orientation offsets relative to the projection geometry can also be specified if needed. The tracking origin node option can also be used to control the origin in the scene.
The tracking system should ideally provide the position of the camera's entrance pupil point (often referred to as the lens nodal point). The entrance pupil is a virtual point, usually somewhere inside the lens, that corresponds to the correct position of the virtual pinhole camera model. However, some tracking systems provide the camera sensor's position instead, or some other point relative to the actual tracker on the camera. This can be corrected using the Camera Offset parameters in the configuration. Orientation offsets of the tracker relative to the camera can also be specified if needed. Note that for stYpe and EZtrack this offset should not be needed.
Tracking Origin Offset – Position and rotation offset of the tracking system relative to the current tracking origin in the configuration (the projection geometry, or the tracking origin node if specified). This is the distance between the Camera Tracking System Origin (usually somewhere on the floor) and the Tracking Origin Node (if included in the .vrscene containing the geometry of the LED wall) or the pivot point of the wall geometry (if there is no Tracking Origin Node). If the Tracking Origin Node matches the actual tracking system origin perfectly, these offsets should be zero.
Camera Offset – Position and rotation offset of the camera's entrance pupil relative to the tracker's center point and orientation. This is the distance between the lens nodal point (usually inside the lens, in front of the sensor) and the pivot point of the tracker, which depends on the camera tracking solution (stYpe, Mo-Sys, Vicon, etc.).
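Conceptually, the two offsets compose with the raw tracked pose to produce the camera pose in the scene. A minimal sketch of that chain using 4×4 matrices, with hypothetical names and values and an assumed composition order (this is not Arena's actual implementation):

```python
import numpy as np

def transform(position_cm, rotation_3x3):
    """Build a 4x4 transform from a position (in cm) and a 3x3 rotation."""
    m = np.eye(4)
    m[:3, :3] = rotation_3x3
    m[:3, 3] = position_cm
    return m

# Hypothetical values; in Arena these come from the tracking data and the
# Tracking Origin Offset / Camera Offset configuration parameters.
tracking_origin_offset = transform([0.0, 200.0, 0.0], np.eye(3))  # system origin vs. scene origin
tracked_pose = transform([50.0, -300.0, 170.0], np.eye(3))        # tracker pose from the protocol
camera_offset = transform([0.0, 12.0, -6.0], np.eye(3))           # entrance pupil vs. tracker pivot

# Camera pose in the scene: origin offset, then tracked pose, then camera offset.
camera_in_scene = tracking_origin_offset @ tracked_pose @ camera_offset
print(camera_in_scene[:3, 3])  # entrance pupil position in scene space
```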
Auto-Apply – If enabled, changes to the offsets, FOV, or camera ID are applied immediately during rendering without having to press the Apply Configuration button.
The coordinate system used for all offsets is (+X right, +Y forward, +Z up). Distances are in centimeters. Rotations are described as (pan, tilt, roll) in degrees. A positive pan angle corresponds to a pan to the right, a positive tilt angle to an upward tilt, and a positive roll angle to a clockwise roll when viewed from behind.
The forward direction for the Camera Offset is the forward (look) direction of the camera.
The forward direction for the Tracking Origin Offset is the forward direction of the tracking system, which is not always pointing towards the screen. It depends on the tracking system and how it's calibrated.
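As an illustration of these conventions, the sketch below builds a rotation matrix from (pan, tilt, roll) in the +X right, +Y forward, +Z up frame. The axis mapping follows the sign conventions stated above; the pan → tilt → roll composition order is an assumption for this sketch, not documented Arena behavior:

```python
import numpy as np

def rotation_from_pan_tilt_roll(pan_deg, tilt_deg, roll_deg):
    """Rotation for (pan, tilt, roll) in a +X right, +Y forward, +Z up frame."""
    p, t, r = np.radians([pan_deg, tilt_deg, roll_deg])
    # A pan to the right is a negative right-handed rotation about up (+Z).
    rz = np.array([[np.cos(-p), -np.sin(-p), 0.0],
                   [np.sin(-p),  np.cos(-p), 0.0],
                   [0.0,         0.0,        1.0]])
    # An upward tilt is a positive right-handed rotation about right (+X).
    rx = np.array([[1.0, 0.0,        0.0],
                   [0.0, np.cos(t), -np.sin(t)],
                   [0.0, np.sin(t),  np.cos(t)]])
    # A clockwise-from-behind roll is a positive right-handed rotation about +Y.
    ry = np.array([[ np.cos(r), 0.0, np.sin(r)],
                   [ 0.0,       1.0, 0.0      ],
                   [-np.sin(r), 0.0, np.cos(r)]])
    return rz @ rx @ ry

# Panning 90 degrees to the right turns the look direction (+Y) into +X.
print(rotation_from_pan_tilt_roll(90, 0, 0) @ np.array([0.0, 1.0, 0.0]))
```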
The OptiTrack integration is different from most of the others, as it uses a third-party library for connecting to a server, normally the OptiTrack Motive application. Most of the other protocols are just connectionless UDP listeners that open a network port and listen for data on it.
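For comparison, a connectionless UDP listener of the kind most protocols use can be sketched in a few lines; the port number is hypothetical, and the decoding of each datagram depends on the specific protocol's packet format:

```python
import socket

TRACKING_PORT = 8998  # hypothetical port; the actual port is set in the configuration

# Open a UDP socket and listen for tracking packets. There is no connection
# handshake; the tracking system simply streams datagrams to this port.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", TRACKING_PORT))

while True:
    packet, sender = sock.recvfrom(2048)  # one tracking datagram
    print(f"received {len(packet)} bytes from {sender}")
```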
Also, OptiTrack is not specifically a camera tracking system; it's a full motion capture system. There are various types of tracked objects: skeletons, rigid bodies, force plates, markers, and so on. Currently, the objects implemented for camera tracking are rigid bodies. Each rigid body has a streaming ID set in the server. This streaming ID is used as a camera ID.
There seems to be a convention in OptiTrack that the coordinate system's forward direction points away from the screen (backward when standing in front of the LED wall, for example), not into the screen as in most of the other systems. This usually causes the tilt and roll to be inverted, i.e. an upward tilt is interpreted as a downward tilt. It can easily be corrected by setting a Pan of 180 degrees for both the Camera Offset and the Tracking Origin Offset, which effectively rotates the entire coordinate system 180 degrees. Note, however, that all other offsets are still relative to the native coordinate system, meaning that the forward direction looks away from the screen.
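A quick numeric check of why the 180-degree pan helps (illustrative only, assuming the screen lies along +Y): rotating the whole frame 180 degrees about the up axis flips the forward direction so it points into the screen as the other systems expect.

```python
import numpy as np

def rz(deg):
    a = np.radians(deg)
    return np.array([[np.cos(a), -np.sin(a), 0.0],
                     [np.sin(a),  np.cos(a), 0.0],
                     [0.0,        0.0,       1.0]])

# A forward vector reported in OptiTrack's convention: pointing away from
# the screen (here -Y, assuming the screen faces along +Y).
forward_reported = np.array([0.0, -1.0, 0.0])

# A 180-degree pan rotates the entire coordinate system about the up axis.
print(rz(180) @ forward_reported)  # ~[0, 1, 0], i.e. into the screen
```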
The Vicon integration is similar to OptiTrack: it uses a third-party library for connecting to a server, normally the Shogun application.
Vicon is also a full motion capture system. The subject names are used as camera IDs.
LONET 2 contains many fields for camera and lens tracking. Currently, the only data used by Arena is the Camera Transform Data – camera name, position, orientation, and timecode.
The protocol is tested only with the Lightcraft Jetset app.
The protocols are implemented as plugins, and the JSON configuration files store the plugin ID of the selected protocol.
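As an illustration, a configuration fragment selecting a protocol might look something like the following; the key names and plugin ID shown are hypothetical, not Arena's actual schema:

```python
import json

# Hypothetical configuration fragment; the real key names and plugin IDs
# are defined by Arena and its protocol plugins.
config = {
    "tracking": {
        "protocolPluginId": "ExampleTrackingProtocol",  # hypothetical plugin ID
        "trackingOriginOffset": {"position": [0, 0, 0], "rotation": [0, 0, 0]},
        "cameraOffset": {"position": [0, 0, 0], "rotation": [0, 0, 0]},
    }
}
print(json.dumps(config, indent=2))
```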
In the example below, which uses the Vive Mars tracking system, the tracker's center point is at the base of the Vive tracker, which is why the measurements start from there.
In the example below, you can see how the Tracking Origin Offset is used. It is the distance between the Tracking Origin Node (if included in the .vrscene containing the geometry of the LED wall) and the Camera Tracking System Origin (usually somewhere on the floor). This example assumes that the tracking system's forward direction points towards the screen.
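As a worked illustration with made-up measurements: if the wall pivot sits 200 cm above the floor and the tracking system origin lies on the floor 400 cm in front of the wall, the position part of the Tracking Origin Offset is simply the difference between the two points.

```python
# Hypothetical measurements in centimeters, in a frame on the stage floor
# (+X right, +Y forward, +Z up) directly below the wall pivot.
wall_pivot = (0.0, 0.0, 200.0)              # scene tracking origin, 200 cm above the floor
tracking_system_origin = (0.0, 400.0, 0.0)  # on the floor, 400 cm in front of the wall

# Tracking Origin Offset = tracking system origin relative to the scene origin.
offset = tuple(t - w for t, w in zip(tracking_system_origin, wall_pivot))
print(offset)  # (0.0, 400.0, -200.0)
```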