1.4 | Setting up a Pixotope machine
Video tutorial - Director walkthrough - SETUP (Pixotope 1.1).
Planning a multi-camera or multi-machine production?
Learn more about 1.4 | Single- and multi-machine setups.
Starting Director
- If this is the first time you have started the Director, please refer to Start Director
- Log in using a Live license
- Check to make sure you are in SETUP view (you can see from the drop-down in the top left-hand corner)
Learn more about the different views in Director: 1.4 | Adjusting levels from Director.
Creating a project
- Click "Create a new project"
- Name your project
- Only alphanumeric characters are allowed, and it is recommended that the name be no longer than 20 characters (Unreal Engine restriction)
- Set your project location
- Add it to the project list
- Set it as the current project
All changes under SETUP are saved under the current project. Setting another project to be the current project will load its setup instead.
Learn more about 1.4 | Setting up a Pixotope project
Settings
Under the "Settings" item on the main menu, there are 2 pages: "General" and "Network".
General
This page allows you to change the default settings to suit your production pipeline. These are general settings related to this installation of Director.
View settings can be found in the Windows menu, in the top left-hand corner.
Start Director on startup
This option starts Director automatically when you start up your machine.
Control mode
The control mode determines whose configuration and calibration this Director is allowed to edit.
- This machine
- this Director is only allowed to change its own configuration and calibration
- use when setting up a single machine, or machines in parallel
- All machines
- this Director is allowed to change the configuration and calibration of all machines
- use when setting up multiple machines from one central machine
Learn more about parallel and central setup in multi-machine productions
Projects folder
This shows a list of projects folder paths on your local machine that are being "watched". All projects found in these paths, including in any subfolder, are listed on the "Launch" page. The demo projects folder path, chosen during installation, is listed by default.
Adding a project, level or control panel in one of the listed projects folder paths will automatically add them to the "Launch" page.
In multi-machine setups, the current projects folder must have the same local path on all machines.
Add projects folder
You can select a new projects folder path to be watched. All projects found in that folder or its subfolders will be added to the "Launch" page.
Adding a root folder such as C:\ is discouraged.
Remove a projects folder path
Clicking the trash can removes that projects folder path. All projects within this parent path are removed from the "Launch" page, unless an overlapping projects folder path still covers them.
To stop watching a parent folder while keeping some of its subfolders, first add those subfolders as their own projects folder paths, then remove the parent path.
Default settings
The default settings are used when creating a new project. The ones marked with an * can be overridden on a per-project basis.
Default Units
Choose the preferred units for the Unreal Engine and the Tracking Server.
Default Compositing Color Space*
Choose the default compositing color space to be used in Configure → Video I/O.
Default Video Input and Output Format*
Choose the default video format to be used in Configure → Video I/O.
Network adapter
Sets the network adapter to be used for communication between Pixotope machines.
Advanced settings
Startup options (Editor/LIVE)
DX 12 mode
This starts the Editor/Engine with the DX 12 flag. To fully enable ray tracing, ensure that:
- Windows 10 is running version 1809 or later
- DirectX 12 is installed
- the latest NVIDIA RTX graphics drivers are installed
- Ray Tracing is turned on in Pixotope Editor Project Settings → Rendering
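If you launch the Engine manually instead, Unreal-based engines generally accept a DX 12 flag on the command line. A minimal sketch, assuming the standard Unreal -dx12 argument; the executable and project names are placeholders:
[Pixotope installation path]\Pixotope Editor\Engine\Binaries\Win64\[EngineExecutable].exe [YourProject].uproject -dx12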
Debug
If Verbose logging is enabled, Director logs more events than usual. This can help debug errors.
Network
The "Network" page gives you control over your network setup and allows you to check that all the servers (Data hub and Tracking Server) are now up and running.
For an overview, you can use the Status bar, which is present on all pages at the bottom of the Director.
Data hub
The Data hub connects and handles all communication between the Pixotope machines of a virtual production. Each production needs just one Data hub.
Learn more about 1.4 | Single- and multi-machine setups.
Here you can:
- change who runs the Data hub
- restart the Data hub
- disconnect from a remote Data hub
- stop your local Data hub
Tracking Server
The Tracking Server receives and translates the camera and object tracking data and sends it on to the Pixotope Editor.
Here you can:
- add a Tracking Server
- 1 Tracking Server can be added/launched on each connected machine
- restart a Tracking Server
- delete a Tracking Server
Render machines
List of all machines connected to the same Data hub.
Configure
Under the "Configure" item on the main menu, there are several pages. These allow you to configure your Pixotope setup in accordance with your physical studio setup. This includes informing the system about your camera and camera tracking systems, object trackers, SDI video inputs and outputs, video and tracking routing.
Artist license
An Artist license does not include a Tracking Server. Configuration and calibration of camera systems are therefore not possible from within the Director.
This is how you can manually change the same settings in the Editor:
- Move the tracked camera
- Select the camera root in the world outliner
- Move the camera root
- Change camera settings
- Select the tracked camera in the world outliner
- Select the "CineCameraComponent" from the component list
- Find the settings under "Current Camera Settings"
Import Setup
All configuration settings are stored per project and on each Director machine individually. You can use "Import Setup from ..." to easily import setups from other projects or other machines.
To import a setup into the current project:
- Click "Import Setup from ..." in the "Launch" panel
- Select the machine you want to import a setup from
- Select the project
- Review the camera systems, media inputs and outputs and object tracker groups to import
- Camera systems, media inputs and outputs and object tracker groups with the same name will be overwritten
The "Final result" dialog shows whether any problems have occurred while importing or applying the setup. If there were, please check Configure → Routing and redo the routing.
Camera tracking
Here you configure everything related to camera tracking.
Camera system
A camera system is a camera including its lens and a tracking system. The result is a tracked camera with various parameters tracked depending on the tracking system used.
- Click "Add camera system" and give it a descriptive name
- Choose a "Camera type" from the drop-down menu under "Camera and Lens"
- If you cannot find your camera type, click "Add camera type", choose a name and specify the filmback (sensor) width and height in mm
- Change the default lens aperture if needed
Choose the assigned tracking server
A tracking server can only be controlled from one Director. When choosing to control a tracking server from a second Director, all its previous settings will be overwritten.
Choose the camera tracking protocol your camera tracking system uses
Learn more about Setting up Camera tracking
Object tracking
Here you configure everything related to object tracking.
Object tracker group
An object tracker group is a parent for object trackers which share the same:
- tracking space
- tracking protocol
- tracking server
- Click "Add object tracker group"
- Give it a descriptive name
- Choose the assigned tracking server
-
A tracking server can only be controlled from one Director. When choosing to control a tracking server from a second Director, all its previous settings will be overwritten.
- Learn more about Tracking data routing
-
- Click "Add object tracker"
- Give it a descriptive name
- Change the port if needed
Object tracking protocol
Choose the object tracking protocol your object tracking system uses.
Advanced
The "Advanced" section covers protocol-specific details and how the data should be mapped. When you calibrate tracking, you might have to come back here if the object movement is mapped wrongly.
Learn more about 1.4 | Setting up object tracking
Video I/O
Here you define compositing color space and your video inputs and outputs.
Compositing
Compositing color space
Choose the project's compositing color space.
- Linear space
- Photorealistic compositing
- For HDR and end-to-end color-managed workflows
- Video space (legacy)
- No color management or HDR output
Learn more about compositing color space, color management and linear compositing.
Video input
- Add additional media inputs if you need untracked video sources in your level, for example the input for a virtual monitor
- Name your media input
- Optionally override the default settings:
- Choose the input format (resolution and frame rate) of your camera systems and media inputs
- To activate the deinterlacer, choose an input format with the "id" suffix (for example, "HD - 1080id - 50")
- Choose the color profile
- Choose a color space from the selected color profile
- Choose the transfer type
- Choose the input type:
- For a camera system - video signal with tracking data
- Internal Keyer - the video is keyed with Pixotope's internal keyer
- External Keyer Fill/Key - the video is keyed externally and comes in using a Fill and Key signal
- Input Camera - the video is used as is
- For a media input - video signal with no tracking
- Input Media - the video is used as is
- Input Media with Key - the video is keyed externally and comes in using a Fill and Key signal
Best practice for SDR/HDR pipelines
SDR camera → SDR pipeline/display
- Compositing color space: "Video space"
- Color management settings are ignored
- Optional: use tone mapper for 3D graphics
For advanced users
- Compositing color space: "Linear space"
- Video Input: Choose "Rec.709 Input"
- Video Output: Choose "Rec.709 Output"
- Artistic choice: experiment with the Rec.709 alternatives
HDR camera → SDR pipeline/display
- Compositing color space: "Linear space"
- Video Input: Choose the HDR camera's color space
- Video Output: Choose "Rec.709 Output"
- Artistic choice: experiment with the Rec.709 alternatives
HDR camera → Output follows input
- Compositing color space: "Linear space"
- Video Input: Choose the HDR camera's color space
- Video Output: Choose the same color space as set in Video Input
The inverse of the input color space conversion is applied on output, so the output color space exactly matches the HDR camera's color space.
HDR camera → HDR pipeline/display
- Compositing color space: "Linear space"
- Video Input: Choose the HDR camera's color space
- Video Output: Choose the color space of the HDR pipeline/display
Artist with no video I/O
- Compositing color space: "Video space"
Learn more about HDR and Color management and how to create your own custom color transforms
How to work with interlaced video
In simple AR and VS compositing scenarios where the video is not altered except for being composited, interlaced video can be used straight through. Make sure the correct video input format is set in SETUP → Configure → Video I/O (for example, "HD - 1080i - 50").
In more complex scenarios, especially in virtual set scenarios where the virtual camera, reflections and other advanced effects are used, the best results will be achieved if a progressive input signal is used. If a progressive source is not available you can use Pixotope's built-in deinterlacer.
- Go to SETUP → Configure → Video I/O
- In "Video Input Format", choose an input format with the "id" suffix (for example, "HD - 1080id - 50")
For best results, choose an interlaced video format for the output
Video output
- Add one or more media outputs
- Name your media output
- Optionally override the default settings
- Choose the output format (resolution and frame rate) of your media outputs
- Choose the color profile
- Choose a color space from the selected color profile
Color management in Editor
When "OCIO Viewport Enabled" is selected in the Editor, the color space of the Editor viewport is linked to this color space selection.
This only applies if SETUP → Configure → Video I/O → Compositing color space is set to "Linear space".
- Choose the output type
- Fill only - for internal compositing
- Fill and Key - for external compositing
- Learn more about Compositing
Routing
On the "Routing" page, we need to recreate the physical routing.
This feature is in the Experimental stage and can be accessed via the command line.
We support the following DeckLink cards:
- 8K Pro
- ...
Route Input/Output via Blackmagic Video Cards
Download this blackmagic.json file
or manually create a file named blackmagic.json with the following content:

{
  "genlock": "CameraCrane1",
  "input": [
    { "id": "CameraCrane1", "manufacturer": "Blackmagic", "source": "DeckLink Input 1" },
    { "id": "Media Input 1", "manufacturer": "Blackmagic", "source": "DeckLink Input 2" }
  ],
  "output": [
    { "id": "Output HD", "manufacturer": "Blackmagic", "source": "DeckLink 8K Pro(3)" },
    { "id": "Output Small HD", "manufacturer": "Blackmagic", "source": "DeckLink 8K Pro(4)" }
  ]
}
- Save the file in
[Pixotope installation path]\Pixotope Editor\Engine\Binaries\Win64\
- Restart the Pixotope Engine
The settings in the blackmagic.json file take precedence over settings set in the Video I/O or Routing panel.
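A note on the fields in this example (an interpretation based on the sample file above, not stated elsewhere in this section): each "id" is expected to match the name of a camera system, media input or media output as configured under Video I/O, "source" names the physical DeckLink connector, and "genlock" names the input whose signal is used as the sync source.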
Camera, Media and Object routing
On this tab we set the input routing of
- Camera systems
- Object trackers
- Media inputs
and the output routing of
- Media outputs
Input routing
Under input routing:
- Set the genlock source for each machine
- There should be one common source for the genlock signal
- What is genlock?
A genlock is used to synchronize the camera and tracking data. For each machine, choose one of the following:
- External ref - genlock comes from an external ref signal
- External SDI x - genlock is used from a specified SDI input
- Internal - an internal genlock is generated
- Add all camera systems and media inputs physically routed to a specific machine
- Set the SDI Inputs they are wired to
- Add all object trackers whose tracking should be available on a specific machine
Only one camera system can be added per machine.
Output routing
Under output routing:
- Add all outputs planned for a specific machine
- Set the SDI Output they are wired to
Incoming tracking data routing
The initial incoming tracking data routing is done on the individual camera systems and object tracker groups. On this tab you:
- get a routing overview of all incoming tracking data
- can edit the routing
- Choose which Tracking Server should be assigned to which camera system and object tracker group
- Best practice for:
- camera systems: Run a separate Tracking Server for every camera system on the machine the camera system will be routed to
- object tracker groups: Use the Tracking Server on the machine most of the object trackers will be routed to
- Change the port if needed
- On your camera and object tracking system, enter the IP address and port number of the assigned Tracking Server
- This is where your tracking data should be sent to
- For the Ncam system, the IP address and port number should be entered in the Advanced section of the camera tracking protocol
- Check the status field
- Your tracking configuration is set up correctly if it shows incoming data
- You can also check the 1.4 | Network status in the Editor
Calibrate
Under the "Calibrate" item in the main menu, there are 2 pages: "Tracking" and "Syncing".
Tracking
The tracking system is calibrated so it sends position and rotation data of the physical camera or object relative to the studio origin* into the Editor.
Physical camera → "TrackedCamera" in the Editor
Studio origin → "CameraRoot" in the Editor (yellow cone)
* The studio origin is a defined and marked point in your studio. It is practical to make it a point visible to the camera.
For advanced tracking systems
Advanced tracking systems will already give you position and rotation relative to a defined studio origin. For these systems, we suggest doing the tracking calibration on their end.
If you need technical support in calibrating the tracking system, please contact the vendor of your camera tracking system.
For simple tracking systems
Simple tracking systems provide only parts of the data, for example only rotation data. In this case, use the tracking panel to manually offset the tracking data so that you end up with position and rotation data relative to the studio origin. This data offset is applied in the Tracking Server.
Camera
Depending on the selected camera mount, different offset options are provided:
Lens
Check camera tracking
- Define and mark your studio origin
- In the Editor: check that "CameraRoot" is at 0,0,0
- In the Editor: place a calibration cone at the CameraRoot
- The marked point on stage and the calibration cone should match up
- Move the camera to the outer sides of the field of view and check whether the marked point and calibration cone still match up (disregard temporal slipping; that is addressed under Syncing)
Syncing
Depending on the video pipeline, you can end up with delayed video or tracking data. To avoid 3D images slipping when the camera or object is moved, we need to synchronize video data and tracking data.
Check syncing
- Pan the physical camera quickly and stop abruptly
- Check to see whether the calibration cone slips temporarily from the marked point
- If the graphic moves first:
- Subtract video input delay
- If that is not possible, then add tracking delay
- If the camera feed moves first:
- Subtract tracking delay
- If that is not possible, then add video input delay
For example, if the cone leads the camera feed by two frames, reduce the video input delay by two frames, or add two frames of tracking delay if the input delay is already at zero.
Frame matching methods
Buffer size (no frame matching)
This keeps the buffer size stable by deleting packets on overflow and duplicating packets when the buffer runs too low.
- Requires manual syncing
- Will cause stuttering when tracking data is unstable
Timecode (experimental)
This method uses timecode to automatically lock video and tracking data together.
Tracking data with embedded timecode is currently supported by the following tracking systems:
- Ncam
- SMT
This feature is in the Experimental stage and can be accessed via the command line.
Enable Timecode frame matching method
Add the following lines to
UPROJECT_DIR/Config/DefaultEngine.ini
[SystemSettings]
TTMTextureHandler.LtcSourceNumeric = 0
FrameMatcher.Method = 2
- Restart the Pixotope Engine
Variables
- Video timecode source*: TTMTextureHandler.LtcSourceNumeric
- → See 1.4 | Useful console commands
- Frame matching method: FrameMatcher.Method
- Buffer size = 0
- Timecode = 2
- Ideal time = 3
- Tracking delay offset: FrameMatcher.TimecodeDelayOffset
- For most cases, this variable can stay unchanged (default = -2). The tracking delay can be controlled normally from the Calibrate → Syncing panel.
Set variables via command line
All variables are available as console commands from within the Engine. The variable marked with * needs a restart of the video pipeline (e.g. can be achieved via the Routing panel by changing an input spigot back and forth).
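For example, to switch the frame matching method at runtime (a minimal sketch assuming Unreal's usual "variable value" console syntax; see 1.4 | Useful console commands for the exact forms):
FrameMatcher.Method 2
FrameMatcher.TimecodeDelayOffset -2
TTMTextureHandler.LtcSourceNumeric 0
TTMTextureHandler.LtcSourceNumeric (marked with *) only takes effect after the video pipeline has been restarted.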
Reset to Buffer size frame matching method
- Remove the added lines or set
FrameMatcher.Method = 0
- Restart the Pixotope Engine
Tracking delay
This setting allows you to add a delay to the tracking data.
Video input delay
This setting allows you to add a delay to the video input.
Video output delay
This setting allows you to add a delay to the video output.
Creating your first project
Continue to 1.4 | Setting up a Pixotope project.