Calibration

In this article, we will discuss multi-camera calibration.

Sample of multi-camera setup

Set up cameras in different positions

For optimal performance, cameras should be mounted on a rigid structure. What matters is the rigidity of the structure that holds the cameras in their relative positions. For demonstration purposes, tripods can be used, but note that re-calibration is required if cameras are moved out of their calibrated positions.

In this article, we will focus on a simple two-camera setup.

Showing multi-camera calibration setup with overlapping FOV

Place calibration object in view of cameras

There are two main considerations:

  1. The entire calibration object must be visible from every camera.

  2. We must get good quality data of the calibration object.

Calibration object must be visible in overlapping FOV

Currently, Zivid uses a calibration plate. It is a checkerboard that you can buy from the Zivid WebShop or print yourself (see Calibration Object for more information). This object can only be viewed from one side, which imposes a limitation that is covered in How to Optimize the Placement of the Calibration Object.

We must get good quality data of the calibration object

To capture a good point cloud, we first have to control the environment. The two most important considerations are to:

  1. Have the calibration object in the optimal range of the cameras (see Working Distance and Camera Positioning for more details).

  2. Limit the amount of ambient light (see Dealing with Strong Ambient Light for more details).

Check out How To Get Good Quality Data On Zivid Calibration Board.
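As a quick sanity check on the first point, you can verify that most of the captured calibration-object points actually fall within your camera's optimal range. The sketch below is illustrative only: the `fraction_in_range` helper and the 0.6-1.1 m window are made-up examples, not Zivid API or Zivid specifications; check the working-distance specification for your camera model.

```python
import numpy as np

def fraction_in_range(points, near=0.6, far=1.1):
    """Fraction of 3D points (N x 3, meters, camera frame) whose Z
    coordinate lies within an assumed working-distance window
    [near, far]. The 0.6-1.1 m default is a placeholder; use the
    range specified for your camera."""
    z = points[:, 2]
    valid = ~np.isnan(z)          # captured clouds may contain NaN pixels
    if not valid.any():
        return 0.0
    in_range = (z[valid] >= near) & (z[valid] <= far)
    return float(in_range.mean())

# Example: synthetic points spread from 0.5 m to 1.2 m along Z
pts = np.column_stack([np.zeros(100), np.zeros(100), np.linspace(0.5, 1.2, 100)])
fraction = fraction_in_range(pts)  # most, but not all, points are in range
```

If the fraction is low, move the calibration object (or the camera) before capturing calibration data.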

Execute calibration

Calibration itself simply involves running our calibration software.

Tip

We recommend performing Warm-up and running Infield Correction before running multi-camera calibration. To further reduce the impact of temperature-dependent performance factors, enable Thermal Stabilization.

  1. Capture point clouds of calibration object with all cameras.

  2. Perform calibration.

  3. Output the transformation matrices.
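The Zivid calibration software performs step 2 for you. Conceptually, though, the core of step 2 is estimating the rigid transform that maps the calibration object's points as seen by one camera onto the same points as seen by another. A minimal sketch of that idea, using the Kabsch (orthogonal Procrustes) algorithm on corresponding 3D points (this is a conceptual illustration, not the Zivid implementation or API):

```python
import numpy as np

def rigid_transform(src, dst):
    """Estimate rotation R and translation t such that dst ≈ src @ R.T + t,
    given corresponding 3D points (N x 3) of the same calibration object
    observed by two cameras (Kabsch algorithm)."""
    src_c = src - src.mean(axis=0)
    dst_c = dst - dst.mean(axis=0)
    H = src_c.T @ dst_c                      # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t

def to_matrix(R, t):
    """Pack R and t into the 4x4 homogeneous transformation matrix form
    that multi-camera calibration outputs (step 3)."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T
```

In practice you never need to write this yourself; the SDK detects the calibration object and computes the transforms for you.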

This is all covered by the Multi-camera calibration sample application, which is described in detail in the Multi-camera calibration programming tutorial.

Now you can learn how to utilize the multi-camera calibration output to stitch the point clouds. Alternatively, learn about the theory behind multi-camera calibration.
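As a preview of stitching: once calibration has produced a 4x4 transformation matrix for each camera into a common frame, merging the point clouds amounts to applying each matrix and concatenating the results. A minimal numpy sketch (the function name is illustrative; the full workflow is covered in the stitching article):

```python
import numpy as np

def stitch(clouds, transforms):
    """Transform each camera's point cloud (N x 3) into the common frame
    using its 4x4 calibration matrix, then concatenate into one cloud."""
    stitched = []
    for points, T in zip(clouds, transforms):
        homog = np.hstack([points, np.ones((len(points), 1))])  # N x 4
        stitched.append((homog @ T.T)[:, :3])                   # back to N x 3
    return np.vstack(stitched)
```

With a two-camera setup, one camera's frame is typically used as the common frame, so its transform is the identity matrix.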