How to Verify Hand-Eye Calibration
After completing hand-eye calibration, you can use several methods to verify that the resulting transformation matrix is correct and within the accuracy requirements.
Verification by Residuals
Good hand-eye calibration residuals indicate that the calibration has successfully converged to a solution.
For smaller robots, we expect the translational residuals to be in the sub-millimeter range, while for larger robots they can be in the millimeter range. The rotational component of the residuals is expected to be in the sub-degree range.
Note
Good residuals do not guarantee that the hand-eye calibration is accurate. Moreover, residuals are not a direct measure of the expected accuracy of the robot’s positioning. Therefore, it is important to verify the hand-eye calibration using other methods.
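To illustrate what the residuals measure, here is a minimal numpy sketch (not the Zivid SDK implementation): for an eye-in-hand setup with a stationary calibration object, every pose pair should reconstruct the same object pose in the robot base frame, and the residuals capture the spread of those reconstructions. The `pose` helper and the frame names are illustrative assumptions.

```python
import numpy as np

def pose(rz_deg, t):
    """Illustrative 4x4 homogeneous pose: rotation about z, translation t (mm)."""
    c, s = np.cos(np.radians(rz_deg)), np.sin(np.radians(rz_deg))
    T = np.eye(4)
    T[:3, :3] = [[c, -s, 0], [s, c, 0], [0, 0, 1]]
    T[:3, 3] = t
    return T

def residuals(base_T_ee_list, cam_T_obj_list, ee_T_cam):
    """Rotation (deg) and translation (mm) spread of the reconstructed object
    pose in the base frame; with a perfect calibration, all entries are zero."""
    base_T_obj = [b @ ee_T_cam @ c
                  for b, c in zip(base_T_ee_list, cam_T_obj_list)]
    ref = base_T_obj[0]
    out = []
    for T in base_T_obj[1:]:
        d = np.linalg.inv(ref) @ T  # deviation from the first reconstruction
        rot_deg = np.degrees(np.arccos(np.clip((np.trace(d[:3, :3]) - 1) / 2, -1, 1)))
        out.append((rot_deg, np.linalg.norm(d[:3, 3])))
    return out
```

Feeding this a hand-eye transform that is a few millimeters off immediately produces millimeter-level residuals, which is why small residuals are a necessary (but, as noted above, not sufficient) sign of a good calibration.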
Verification by Touch Test (Recommended)
The recommended hand-eye calibration verification approach is to place the Zivid Calibration Object in the scene and perform a touch test with the robot. This method consists of capturing and detecting the calibration object, computing its pose, and moving the robot (equipped with a Pointed Hand-Eye Verification Tool) to touch the object at its distinct point. You can then visually inspect how well the robot reached the desired pose by checking how close the tool's pointed end is to that point: the closer it is, the better the hand-eye calibration.
Available tools for performing the touch test:
Hand-Eye GUI (Recommended)
Python script
For a detailed walkthrough, see:
Since this is a physical test that requires verification end-of-arm tooling, it is not the most accessible method for everyone. Continue reading for simpler verification methods.
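The chain of transforms behind the touch test can be sketched as follows. This is a hedged numpy illustration for an eye-in-hand setup, not Zivid sample code; the matrix names and the assumption that the distinct point is the object frame's origin are ours.

```python
import numpy as np

def touch_pose(base_T_ee, ee_T_cam, cam_T_obj, flange_T_tip):
    """Flange pose that places the tool tip at the calibration object's
    distinct point (assumed here to be the object frame's origin).

    base_T_ee    -- robot flange pose at capture time (from the controller)
    ee_T_cam     -- the hand-eye calibration result being verified
    cam_T_obj    -- detected calibration object pose in the camera frame
    flange_T_tip -- pointed verification tool, tip relative to the flange
    All matrices are 4x4 homogeneous transforms, translations in mm.
    """
    base_T_obj = base_T_ee @ ee_T_cam @ cam_T_obj  # object in robot base frame
    return base_T_obj @ np.linalg.inv(flange_T_tip)
```

Commanding the robot to the returned flange pose should bring the tool tip onto the distinct point; any visible offset is accumulated error from the detection, the robot poses, and the hand-eye transform.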
Verification by Projection
The Zivid camera has a built-in projector that can project any image onto the physical scene. For the purpose of hand-eye calibration verification, the camera can project green circles onto the known checkerboard corners of the Zivid calibration board. This verification method works by using the hand-eye transformation matrix along with known robot poses to calculate where the checkerboard corners should appear in the camera view. The system then projects these predicted positions back onto the physical scene. If the green circles align well with the actual checkerboard corners, the hand-eye calibration is accurate.
To run this verification method, see:
Hand-Eye GUI: Verification by Projection
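The core of the prediction step can be sketched in a few lines of numpy. This is an illustrative eye-in-hand version under our own naming assumptions, not the GUI's implementation: known corner positions in the robot base frame are mapped into the camera frame, where the projector can then target them.

```python
import numpy as np

def predict_in_camera_frame(points_base, base_T_ee, ee_T_cam):
    """Map known corner positions (Nx3, robot base frame, mm) into the
    camera frame using the current robot pose and the hand-eye transform.

    base_T_ee -- robot flange pose reported by the controller (4x4)
    ee_T_cam  -- hand-eye calibration result being verified (4x4)
    """
    cam_T_base = np.linalg.inv(base_T_ee @ ee_T_cam)
    pts = np.c_[points_base, np.ones(len(points_base))]  # homogeneous Nx4
    return (cam_T_base @ pts.T).T[:, :3]
```

Any error in `ee_T_cam` shifts the predicted corner positions in the camera frame, which is exactly what you see as a misalignment between the projected circles and the physical checkerboard corners.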
Verification by Stitching
Stitching the point clouds from the hand-eye calibration dataset is another useful way to verify the calibration. All point clouds can be transformed into the robot base frame using the hand-eye transform and displayed overlapping. This gives an indication of how accurate the robot poses and the hand-eye transform are: the better the hand-eye calibration, the more the point clouds overlap and the clearer the checkerboard pattern appears.
To run this verification method, see:
Hand-Eye GUI: Verification by Stitching
Python sample: verify_hand_eye_with_visualization.py
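The transformation step of the stitching check can be sketched as below. This is a minimal numpy version for an eye-in-hand setup under our own naming assumptions, not the sample's code; the sample additionally visualizes the result.

```python
import numpy as np

def stitch_to_base(clouds_cam, base_T_ee_list, ee_T_cam):
    """Transform each capture's points (Nx3, mm, camera frame) into the
    robot base frame and concatenate them for overlap inspection.

    clouds_cam     -- list of point clouds, one per robot pose
    base_T_ee_list -- matching robot flange poses (4x4 each)
    ee_T_cam       -- hand-eye calibration result being verified (4x4)
    """
    stitched = []
    for pts, base_T_ee in zip(clouds_cam, base_T_ee_list):
        T = base_T_ee @ ee_T_cam  # camera-to-base for this capture
        stitched.append(pts @ T[:3, :3].T + T[:3, 3])
    return np.vstack(stitched)
```

With an accurate calibration, points that were captured from different robot poses but belong to the same physical surface land on top of each other in the base frame; a poor calibration smears them apart.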
Next Steps
If your verification was satisfactory, check out:
If you are unsatisfied with the hand-eye calibration results, check out one of the following articles: