...

Camera tracking is described by a protocol and additional protocol-specific parameters. There are also tracking offsets that specify the center point of the tracking system and the center point of the tracker itself.

...

The currently supported protocols are:

  • FreeD
  • stYpe HF
  • EZtrack TCD
  • OptiTrack NatNet
  • Vicon
  • LONET 2


Apart from the 3D camera position and orientation, FOV tracking is also supported for stYpe and EZtrack. It can change the inner frustum size dynamically, according to the current focal length of the camera lens.

...

The tracking system should ideally provide the position of the camera's entrance pupil point (often referred to as the lens nodal point). The entrance pupil is a virtual point that's usually somewhere inside the lens and corresponds with the correct position of the virtual pinhole camera model. However, some tracking systems provide the camera sensor's position instead, or some other point relative to the actual tracker on the camera. This can be corrected using the Camera Offset parameters in the configuration. Orientation offsets of the tracker relative to the camera can also be specified if needed. Note that for stYpe and EZtrack, this offset should not be needed.


General

...


Camera ID - ID string of the tracker to be used for the camera tracking. The tracking systems can send data for multiple physical trackers at the same time and this ID can filter them. For stYpe, this should always be "0".

Protocol – The tracking system (protocol) to use.

Port – The UDP port on which to listen for tracking data. Required for stYpe, FreeD, EZtrack, and LONET 2; not used for OptiTrack and Vicon.
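Since the UDP-based protocols are simply listeners on this port, a quick way to confirm that tracking data is actually reaching the machine is to listen on the configured port yourself. A minimal sketch in Python; the port number used in the comment is only an example, not a protocol default:

```python
# Minimal sketch: check whether any tracking datagrams arrive on the
# configured UDP port. Not part of Arena; just a diagnostic aid.
import socket

def receive_datagram(port, timeout=5.0, bufsize=2048):
    """Listen on a UDP port; return (data, addr) for the first datagram,
    or None if nothing arrives within the timeout."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.bind(("0.0.0.0", port))
        sock.settimeout(timeout)
        try:
            return sock.recvfrom(bufsize)
        except socket.timeout:
            return None

# Example (port number is illustrative):
# if receive_datagram(6301) is None:
#     print("no tracking data received")
```

Stop Arena (or any other listener) first, since only one process can bind the port at a time.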

FOV – Horizontal field of view override in degrees. This option is used to override the FOV value from the tracking system or the scene. By default, it's 0.0, which means the override is disabled. When disabled, the FOV is taken from the tracking system if it supports FOV tracking and currently provides a valid positive value, or from the main camera exported in the scene.

Recording Directory – A directory for the recorded camera tracking Alembic files. By default, it's the Windows %TEMP% directory.

Sync Recording to Animation – When enabled, tracking samples are recorded only during the first animation playback with the exact animation times as rendered. This ensures that the exported Alembic file matches the scene animation when imported to the same scene. When disabled, all received samples are recorded with uniform time sampling based on the average FPS of the received data.

Unit Scale – Scaling factor for the position data received from the tracking system. Can be used for the correction of incorrectly calibrated tracking systems or scenes with a wrong scale. Normally it should not be needed.

1 button – Reset the scaling factor to 1.0.

Uniform Scaling – If enabled, the scaling is uniform on all dimensions; otherwise, it can be specified for each dimension separately.
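As a sketch of what the scaling does (illustrative only, not Arena's actual implementation): a uniform factor multiplies all three coordinates, while non-uniform scaling applies a separate factor per dimension.

```python
# Illustrative sketch of a unit-scale correction applied to incoming
# position samples; not Arena's actual implementation.

def apply_unit_scale(position, scale):
    """Scale an (x, y, z) position.

    `scale` is a single number (uniform scaling) or an
    (sx, sy, sz) tuple (per-dimension scaling).
    """
    if isinstance(scale, (int, float)):
        scale = (scale, scale, scale)
    return tuple(p * s for p, s in zip(position, scale))

# A tracking system calibrated in meters, feeding a scene that expects
# centimeters, would need a factor of 100:
print(apply_unit_scale((1.0, 0.5, 2.0), 100))        # (100.0, 50.0, 200.0)
# Per-dimension correction, e.g. only the vertical axis:
print(apply_unit_scale((1.0, 2.0, 3.0), (1, 1, 2)))  # (1.0, 2.0, 6.0)
```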

Horizon Shift – Horizon shift effect for the camera inner frustum. Similar to a vertical lens shift effect.


Note: The Camera ID field is case sensitive. Please ensure you use the correct capitalization.




Tracking Origin Offset 

...

Position and rotation offset of the tracking system relative to the current tracking origin in the configuration (the projection geometry or the tracking origin node if specified).

This is the distance between the Camera Tracking System Origin (usually somewhere on the floor) and the Tracking Origin Node (if included in the .vrscene containing the geometry of the LED wall) or the pivot point of the wall geometry (if there is no Tracking Origin Node). If the Tracking Origin Node matches the actual tracking system origin perfectly, these offsets should be zero.

Camera Offset

...

Position and rotation offset of the camera's entrance pupil relative to the tracker's center point and orientation.

This is the distance between the lens nodal point (usually inside the lens, in front of the sensor) and the pivot point of the tracker, which depends on the camera tracking solution (stYpe, Mo-Sys, Vicon, etc.).


Auto-Apply – If enabled, changes to the offsets, FOV, or the camera ID are applied immediately during rendering, without having to press the Apply Configuration button.


Coordinate System

...

The coordinate system used for all offsets is (+X right, +Y forward, +Z up). Distances are in centimeters. Rotations are described as (pan, tilt, roll) in degrees. A positive pan angle corresponds to a pan to the right. A positive tilt angle corresponds to an upwards tilt. A positive roll angle corresponds to a clockwise roll when viewed from behind.

The forward direction for the Camera Offset is the forward (look) direction of the camera.

The forward direction for the Tracking Origin Offset is the forward direction of the tracking system, which is not always pointing towards the screen. It depends on the tracking system and how it's calibrated.
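To make the sign conventions concrete, here is a small numeric sketch of the pan axis under one consistent reading of the rules above (+Y forward rotating toward +X for a positive pan). This illustrates the convention only; it is not Arena's internal math.

```python
# Numeric sketch of the offset coordinate convention:
# +X right, +Y forward, +Z up; angles in degrees.
# One consistent reading of the stated signs, not Arena's internal math.
import math

def rotate_pan(v, pan_deg):
    """Apply a positive-to-the-right pan to a vector (rotation about +Z)."""
    # Panning right turns the forward axis (+Y) toward +X, which is a
    # negative rotation about +Z under the right-hand rule.
    a = math.radians(-pan_deg)
    x, y, z = v
    return (x * math.cos(a) - y * math.sin(a),
            x * math.sin(a) + y * math.cos(a),
            z)

forward = (0.0, 1.0, 0.0)
# Panning 90 degrees to the right points the camera along +X:
print([round(c, 6) for c in rotate_pan(forward, 90.0)])   # [1.0, 0.0, 0.0]
# A 180-degree pan (the OptiTrack correction below) flips the forward axis:
print([round(c, 6) for c in rotate_pan(forward, 180.0)])  # [0.0, -1.0, 0.0]
```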

OptiTrack

...

The OptiTrack integration is different from most of the others, as it uses a third-party library for connecting to a server – normally the OptiTrack Motive application. Most of the other protocols are just connectionless UDP listeners that open a network port and listen for data on it.

Also, OptiTrack is not specifically a camera tracking system; it's a full motion capture system. There are various types of tracked objects – skeletons, rigid bodies, force plates, markers, and so on. Currently, the objects implemented for camera tracking are rigid bodies. Each rigid body has a streaming ID set in the server. This streaming ID is used as a camera ID.

Coordinate System

There seems to be a convention in OptiTrack that the coordinate system's forward direction points away from the screen (backward when standing in front of the LED wall, for example), not into the screen as in most of the other systems. This usually causes the tilt and roll to be inverted, i.e. an upward tilt is interpreted as a downward tilt. It can be easily corrected by setting Pan to 180 degrees for both the Camera Offset and the Tracking Origin Offset. This effectively rotates the entire coordinate system 180 degrees. Note, however, that all other offsets are still relative to the native coordinate system, meaning that the forward direction is looking away from the screen.

Protocol Settings

  • ConnectionType – Connection type (0 - multicast, 1 - unicast). Default - multicast.

  • ServerCommandPort – NatNet server command port. Default - 1510.

  • ServerDataPort – NatNet server data port. Default - 1511.

  • LocalAddress – IP address of the localhost where the client application is running. Determines the network to use for the connection to the server. Note that having the actual machine IP seems to be very important; otherwise, it connects to the server but doesn't seem to receive data.

  • ServerAddress – IP address of the NatNet server application (normally Motive).

  • MulticastAddress – Multicast IP address, as specified in the NatNet server. Default - "239.255.42.99".
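As a sketch, the OptiTrack settings above might look like this in a configuration file. The IP addresses are placeholders, and the exact nesting and key spelling in Arena's JSON files may differ; only the setting names and defaults come from the list above:

```json
{
  "ConnectionType": 0,
  "ServerCommandPort": 1510,
  "ServerDataPort": 1511,
  "LocalAddress": "192.168.1.20",
  "ServerAddress": "192.168.1.10",
  "MulticastAddress": "239.255.42.99"
}
```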


Vicon

...

The Vicon integration is similar to OptiTrack: it uses a third-party library for connecting to a server – normally, the Shogun application.

Vicon is also a full motion capture system. The subject names are used as camera IDs.

Protocol Settings

  • ConnectionType – Connection type (0 - unicast, 1 - multicast). Default - unicast.

  • LocalAddress – IP address of the localhost where the client application is running. Determines the network to use for the connection to the server. Used for multicast connections.

  • ServerAddress – IP address of the DataStream server application (normally Shogun).

  • ServerPort – Port of the Vicon data stream server. Default - 801.

  • MulticastAddress – IP address of the multicast group.

  • MulticastPort – Port of the multicast group. Default - 44801.


LONET 2 

...

LONET 2 contains many fields for camera and lens tracking. Currently, the only data used by Arena is the Camera Transform Data - camera name, position, orientation, and timecode.
The protocol is tested only with the Lightcraft Jetset app.

...

The protocols are implemented as plugins, and the JSON configuration files store the plugin ID of the selected protocol:

JSON plugin IDs
  • "2023120817" – stYpe

  • "2023120818" – FreeD

  • "2024020512" – EZtrack

  • "2024031564" – OptiTrack

  • "2024061116" – Vicon

  • "2024101562" – LONET 2
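For example, a configuration that selects stYpe would contain the corresponding plugin ID string. The key names in this fragment are illustrative only; the plugin ID and the camera ID value are the parts taken from this page:

```json
{
  "protocol": "2023120817",
  "cameraId": "0"
}
```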

Example

...


In the example below, using the Vive Mars tracking system, the center of the tracker is at the base of the Vive tracker, which is why the measurements start from there.

(Image)

In the example below you can see how the Tracking Origin Offset is used.

The Tracking Origin Offset is the distance between the Tracking Origin Node (if included in the .vrscene containing the geometry of the LED wall) and the Camera Tracking System Origin (usually somewhere on the floor). This example assumes that the tracking system's forward direction is pointing towards the screen.

(Image)