This page gives information on the stage geometry scene in Chaos Arena. 


Overview


Arena relies on a Stage (projection) Geometry scene, which can be created in any DCC with V-Ray support.

Creating Stage Geometry Scene


Follow the steps below to create a Stage Geometry .vrscene file that can be loaded automatically by the Arena Server application.

Create a projection mesh for the wall.

Model the wall with the correct shape and dimensions so that the mesh matches the physical screen accurately, including the scale. In DCCs with Z-up coordinate systems, such as 3ds Max, the mesh should be oriented so that the positive Y axis points towards the wall. In Y-up DCCs, such as Maya, the mesh should be oriented so that the negative Z axis points towards the wall.
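The two orientation conventions above are related by a simple axis swap. This illustrative sketch (not part of Arena) maps a direction vector between a Y-up DCC convention (Maya) and a Z-up one (3ds Max), showing that the two "towards the wall" directions are the same direction expressed in different conventions:

```python
# Illustrative only: converting a direction vector between a Y-up DCC
# convention (e.g. Maya) and a Z-up one (e.g. 3ds Max).

def yup_to_zup(v):
    """Map an (x, y, z) vector from Y-up to Z-up coordinates."""
    x, y, z = v
    return (x, -z, y)

def zup_to_yup(v):
    """Inverse mapping, from Z-up back to Y-up coordinates."""
    x, y, z = v
    return (x, z, -y)

# "Negative Z" towards the wall in a Y-up DCC ...
toward_wall_yup = (0.0, 0.0, -1.0)
# ... corresponds to "positive Y" towards the wall in a Z-up DCC.
print(yup_to_zup(toward_wall_yup))  # (0.0, 1.0, 0.0)
```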

The mesh can be split into multiple parts, each corresponding to a single render instance (a single machine can run multiple instances on multiple GPUs) and to a single screen region. Connected regions have to form a rectilinear grid to avoid edge artifacts when using the overscan and edge blending features. Alternatively, you can use a single mesh and define render regions using the Region option in the Arena Server user interface.

All parts of the mesh should have the same pivot point. It doesn't matter where exactly the pivot point is, but it should be in a fixed known position. A good approach is to have the pivot at ground level in the center of the combined mesh. This is because the camera tracking is applied relative to this position by default unless a separate tracking origin node is used.
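The suggested pivot placement can be computed from the bounds of the parts. This is a hypothetical sketch (the bounds and a Z-up convention where Z is height are assumptions for illustration), placing the shared pivot at ground level in the center of the combined mesh:

```python
# Hypothetical sketch: choosing a common pivot at ground level in the
# center of the combined mesh, given axis-aligned bounds of each part.
# Assumes a Z-up convention (Z = height); each bound is a (min, max) pair
# of 3D tuples.

def combined_ground_center(part_bounds):
    mins = [min(b[0][i] for b in part_bounds) for i in range(3)]
    maxs = [max(b[1][i] for b in part_bounds) for i in range(3)]
    # Center in X/Y, ground (minimum) level in Z.
    return ((mins[0] + maxs[0]) / 2.0,
            (mins[1] + maxs[1]) / 2.0,
            mins[2])

parts = [((-4.0, 0.0, 0.0), (0.0, 0.2, 3.0)),   # left wall panel
         (( 0.0, 0.0, 0.0), ( 4.0, 0.2, 3.0))]  # right wall panel
print(combined_ground_center(parts))  # (0.0, 0.1, 0.0)
```

Every part of the mesh would then get this same point as its pivot, so that camera tracking is applied relative to one known position.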


Add UV coordinates for the entire wall. The scale of the wall in UV space doesn't matter. Note the UV channel index for the server configuration. By default, Maya and Blender export to channel "0", while 3ds Max and Houdini export to channel "1".

Connected regions should have UV coordinates that are continuous between the connected parts of the wall (without any gaps between the parts) and should have consistent aspect ratios. This requirement is necessary for the overscan and edge blending features, and it's similar to the second UV channel in the Unreal requirements. The mesh can have other UV channels. The correct one is specified in the server configuration.
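The two UV requirements above (continuity across seams and consistent aspect ratios) can be checked numerically. This is an illustrative check, not an Arena tool; the seam data and panel sizes are made up for the example:

```python
# Illustrative check (not an Arena tool): connected wall parts should have
# continuous UVs across the seam and consistent world-to-UV aspect ratios.

def uvs_continuous(seam_uvs_a, seam_uvs_b, tol=1e-6):
    """Seam vertices of part A and part B must land on identical UVs."""
    return all(abs(ua[0] - ub[0]) <= tol and abs(ua[1] - ub[1]) <= tol
               for ua, ub in zip(seam_uvs_a, seam_uvs_b))

def uv_aspect(width_world, height_world, width_uv, height_uv):
    """World aspect divided by UV aspect; all parts should share one value."""
    return (width_world / height_world) / (width_uv / height_uv)

# Two 4x3 m panels forming an 8x3 m wall that spans the 0..1 UV square,
# sharing a seam at U = 0.5:
seam_a = [(0.5, 0.0), (0.5, 1.0)]
seam_b = [(0.5, 0.0), (0.5, 1.0)]
print(uvs_continuous(seam_a, seam_b))  # True: no gap between the parts

# One panel has the same world-to-UV ratio as the whole wall:
print(uv_aspect(8.0, 3.0, 1.0, 1.0) == uv_aspect(4.0, 3.0, 0.5, 1.0))  # True
```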

[Optional] Move the pivot point of the mesh to a known position relative to the wall that will be used as a reference point for the tracking. It can be adjusted later without changing the mesh using the tracking origin offsets in the server configuration. Alternatively, a separate tracking origin node can be used.

[Optional] Split the mesh into multiple parts, one for each render instance, depending on the stage setup. Make sure the UVs are still valid, and the pivot point is still the same. Alternatively, the mesh can be split using the Region option in the server configuration.

A correctly UV-unwrapped geometry consisting of more than one piece should look like the example below (every other mesh is selected in the image).


Environment viewpoint object - add a dummy mesh object (a simple cube or a sphere, for example) and position it according to the desired projection point on the stage. Its position relative to the screen determines the projection point for the environment part of the image.

The object should be positioned somewhere in front of the screen, in the filming space. This point on the physical stage will have the most accurate reflections from the environment parts of the screen, so it's best positioned somewhere near the filmed subjects. Note that it shouldn't intersect the projection mesh as that would produce an invalid projection.
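A simple way to reason about the "in front of the screen" requirement is a signed-distance test against the wall plane. This is a hypothetical sanity check assuming a flat wall through the origin facing the positive Y direction (Z-up); real walls may be curved, in which case the viewpoint should be on the filming side of every panel:

```python
# Hypothetical sanity check: the environment viewpoint should sit in front
# of the projection mesh (in the filming space), not on or behind it.
# Assumes a flat wall through the origin facing positive Y (Z-up convention).

def signed_distance_to_wall(point, wall_point=(0.0, 0.0, 0.0),
                            wall_normal=(0.0, 1.0, 0.0)):
    """Positive when the point is on the front (filming) side of the wall."""
    return sum((p - w) * n for p, w, n in zip(point, wall_point, wall_normal))

viewpoint = (0.0, 2.5, 1.6)  # 2.5 m in front of the screen, 1.6 m up
print(signed_distance_to_wall(viewpoint) > 0)  # True: valid projection point
```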

Note the name of the object, as it is needed when setting up the configuration in the Arena Server application.

Using mesh geometry objects is recommended, as they are reliably exported to the .vrscene, whereas some DCC dummy/helper objects are not exported at all.

[Optional] Add a tracking origin node - add a dummy mesh object and position it according to the tracking system's origin (zero point) relative to the screen.
If the projection mesh pivot point already matches the tracking origin, it can be used directly as a tracking origin node.

If there's no tracking origin node in the scene, the projection mesh pivot point and orientation are used for the tracking origin. Note that in Operator instances, the camera projection is not enabled. In order to visualize the tracked camera correctly in Operator instances, a tracking origin node should be specified.
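The role of the tracking origin can be sketched as a change of reference frame: positions reported by the tracker are relative to the tracker's zero point, and the origin node places that zero point in the scene. The sketch below is translation-only for brevity (a real setup also carries orientation), and the numbers are made up:

```python
# Illustrative sketch: tracked camera positions are reported in tracker
# space; the tracking origin node (or, by default, the projection mesh
# pivot) places that space in the scene. Translation-only for brevity;
# a real setup also applies the origin's orientation.

def camera_in_scene(tracked_pos, tracking_origin=(0.0, 0.0, 0.0)):
    """Map a tracked position (tracker space) into scene space."""
    return tuple(t + o for t, o in zip(tracked_pos, tracking_origin))

tracked = (1.0, 3.0, 1.7)   # position reported by the tracking system
origin = (0.0, 5.0, 0.0)    # tracker zero point, 5 m out from the wall
print(camera_in_scene(tracked, origin))  # (1.0, 8.0, 1.7)
```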

[Optional] Rename the objects to some known names that will be used in the configuration. Note that Maya, by default, appends "Shape" to the nodes that have to be specified in the configuration.
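When generating or checking configuration entries for a Maya-exported scene, it can help to account for that "Shape" suffix explicitly. A tiny illustrative helper, assuming the default Maya pattern where the shape node is the transform name plus "Shape":

```python
# Illustrative helper: Maya typically exposes the exported mesh as a shape
# node named "<transform>Shape", so configuration entries may need that
# suffix. The naming pattern here assumes Maya defaults (no manual renames).

def maya_shape_name(transform_name):
    return transform_name + "Shape"

print(maya_shape_name("projectionWall"))  # projectionWallShape
```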

[Optional] Group all objects so they can be moved together easily.

Export the scene to a .vrscene file.