How to run and integrate Zivid Hand-Eye Calibration

Goal → Recommended Tool

  • Guided, no-code workflow → Hand-Eye GUI
  • Minimal integration example → Programmatic Hand-Eye Calibration
  • Existing dataset → Hand-Eye GUI
  • Any robot → Hand-Eye GUI or RoboDK Hand-Eye sample
  • UR robot → Hand-Eye GUI or UR Hand-Eye sample
RoboDK + Python (Robot-Agnostic)

For those who prefer to script the process with RoboDK and Python rather than use the GUI, see:

Features:

  • Works with any RoboDK-supported robot

  • Capture poses are manually defined in the .rdk file

  • Fully automated robot control

Universal Robots UR5 + Python

For users who want to use the UR5 driver directly instead of RoboDK or the GUI, see:

Features:

  • Designed specifically for UR robots

  • Fully automated robot control

Hand-Eye Calibration CLI Tool

For users who prefer a workflow that is not based on Python, or one with minimal dependencies, see:

This experimental CLI tool computes the hand-eye transform from a provided dataset and saves the transformation matrix and residuals to user-specified files. Use this if you:

  • Already have a dataset (robot poses + point clouds)

  • Want a command-line, batch-style workflow

Installed with:

  • Windows Zivid installer

  • tools deb package on Ubuntu

Programmatic Hand-Eye Calibration

If you prefer a more customized integration, e.g., embedding hand-eye calibration APIs directly into your own solution, follow the step-by-step process outlined below. Use this if you:

  • Want the simplest integration example

  • Are building your own calibration pipeline

Workflow:

  • User inputs robot pose in the form of a 4x4 transformation matrix (manual entry)

  • Camera captures the calibration object

  • User moves the robot to a new capture pose and enters the command to add a new pose

  • The first three steps are repeated (typically 10-20 pose pairs)

  • User enters the command to perform calibration and the application returns a Hand-Eye Transformation Matrix
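The pose-entry step above takes one line of 16 space-separated values describing a row-major 4x4 matrix. A minimal sketch of how such a line maps to a matrix (plain NumPy; the values are illustrative):

```python
import numpy as np

# One robot pose entered as 16 space-separated values (row-major 4x4),
# with translation in millimeters. Values are illustrative.
line = "0 -1 0 300.5  1 0 0 -50.2  0 0 1 825.0  0 0 0 1"
pose = np.array(line.split(), dtype=float).reshape(4, 4)

print(pose[:3, 3])  # translation vector in mm
```

The interactive samples below parse the entered line in the same spirit before wrapping it in a Zivid pose object.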

To learn how to integrate hand-eye calibration into your solution, see our interactive code sample below.

Code Sample

C++

/*
Perform Hand-Eye calibration.
*/

#include <Zivid/Application.h>
#include <Zivid/Calibration/Detector.h>
#include <Zivid/Calibration/HandEye.h>
#include <Zivid/Calibration/Pose.h>
#include <Zivid/Exception.h>
#include <Zivid/Zivid.h>

#include <clipp.h>
#include <iostream>
#include <stdexcept>

namespace
{
    std::string presetPath(const Zivid::Camera &camera)
    {
        const std::string presetsPath = std::string(ZIVID_SAMPLE_DATA_DIR) + "/Settings";

        switch(camera.info().model().value())
        {
            case Zivid::CameraInfo::Model::ValueType::zividTwo:
            {
                return presetsPath + "/Zivid_Two_M70_ManufacturingSpecular.yml";
            }
            case Zivid::CameraInfo::Model::ValueType::zividTwoL100:
            {
                return presetsPath + "/Zivid_Two_L100_ManufacturingSpecular.yml";
            }
            case Zivid::CameraInfo::Model::ValueType::zivid2PlusM130:
            {
                return presetsPath + "/Zivid_Two_Plus_M130_ConsumerGoodsQuality.yml";
            }
            case Zivid::CameraInfo::Model::ValueType::zivid2PlusM60:
            {
                return presetsPath + "/Zivid_Two_Plus_M60_ConsumerGoodsQuality.yml";
            }
            case Zivid::CameraInfo::Model::ValueType::zivid2PlusL110:
            {
                return presetsPath + "/Zivid_Two_Plus_L110_ConsumerGoodsQuality.yml";
            }
            case Zivid::CameraInfo::Model::ValueType::zivid2PlusMR130:
            {
                return presetsPath + "/Zivid_Two_Plus_MR130_ConsumerGoodsQuality.yml";
            }
            case Zivid::CameraInfo::Model::ValueType::zivid2PlusMR60:
            {
                return presetsPath + "/Zivid_Two_Plus_MR60_ConsumerGoodsQuality.yml";
            }
            case Zivid::CameraInfo::Model::ValueType::zivid2PlusLR110:
            {
                return presetsPath + "/Zivid_Two_Plus_LR110_ConsumerGoodsQuality.yml";
            }
            case Zivid::CameraInfo::Model::ValueType::zivid3XL250:
            {
                return presetsPath + "/Zivid_Three_XL250_DepalletizationQuality.yml";
            }
            case Zivid::CameraInfo::Model::ValueType::zividOnePlusSmall:
            case Zivid::CameraInfo::Model::ValueType::zividOnePlusMedium:
            case Zivid::CameraInfo::Model::ValueType::zividOnePlusLarge: break;

            default: throw std::runtime_error("Unhandled enum value '" + camera.info().model().toString() + "'");
        }
        throw std::invalid_argument("Invalid camera model");
    }

    enum class CommandType
    {
        AddPose,
        Calibrate,
        Unknown
    };

    std::string getInput()
    {
        std::string command;
        std::getline(std::cin, command);
        return command;
    }

    CommandType enterCommand()
    {
        std::cout << "Enter command, p (to add robot pose) or c (to perform calibration): ";
        const auto command = getInput();

        if(command == "P" || command == "p")
        {
            return CommandType::AddPose;
        }
        if(command == "C" || command == "c")
        {
            return CommandType::Calibrate;
        }
        return CommandType::Unknown;
    }

    Zivid::Calibration::Pose enterRobotPose(size_t index)
    {
        std::cout << "Enter pose with id (a line with 16 space separated values describing 4x4 row-major matrix) : "
                  << index << std::endl;
        std::stringstream input(getInput());
        float element{ 0 };
        std::vector<float> transformElements;
        for(size_t i = 0; i < 16 && input >> element; ++i)
        {
            transformElements.emplace_back(element);
        }

        const auto robotPose{ Zivid::Matrix4x4{ transformElements.cbegin(), transformElements.cend() } };
        std::cout << "The following pose was entered: \n" << robotPose << std::endl;

        return robotPose;
    }

    std::string markersToString(const std::vector<int> &markerIds)
    {
        std::ostringstream oss;
        for(const auto &id : markerIds)
        {
            oss << id << " ";
        }
        return oss.str();
    }

    void handleAddPose(
        size_t &currentPoseId,
        std::vector<Zivid::Calibration::HandEyeInput> &handEyeInput,
        Zivid::Camera &camera,
        const std::string &calibrationObject,
        const Zivid::Settings &settings)
    {
        const auto robotPose = enterRobotPose(currentPoseId);

        std::cout << "Detecting calibration object in point cloud" << std::endl;
        if(calibrationObject == "c")
        {
            const auto frame = camera.capture2D3D(settings);
            const auto detectionResult = Zivid::Calibration::detectCalibrationBoard(frame);

            if(detectionResult.valid())
            {
                std::cout << "Calibration board detected " << std::endl;
                handEyeInput.emplace_back(robotPose, detectionResult);
                currentPoseId++;
            }
            else
            {
                std::cout << "Failed to detect calibration board. " << detectionResult.statusDescription() << std::endl;
            }
        }
        else if(calibrationObject == "m")
        {
            const auto frame = camera.capture2D3D(settings);

            auto markerDictionary = Zivid::Calibration::MarkerDictionary::aruco4x4_50;
            std::vector<int> markerIds = { 1, 2, 3 };

            std::cout << "Detecting ArUco marker IDs " << markersToString(markerIds) << "from the dictionary "
                      << markerDictionary << std::endl;
            auto detectionResult = Zivid::Calibration::detectMarkers(frame, markerIds, markerDictionary);

            if(detectionResult.valid())
            {
                std::cout << "ArUco marker(s) detected: " << detectionResult.detectedMarkers().size() << std::endl;
                handEyeInput.emplace_back(robotPose, detectionResult);
                currentPoseId++;
            }
            else
            {
                std::cout
                    << "Failed to detect any ArUco markers, ensure that at least one ArUco marker is in the view of the camera"
                    << std::endl;
            }
        }
    }

    std::vector<Zivid::Calibration::HandEyeInput> readHandEyeInputs(
        Zivid::Camera &camera,
        const Zivid::Settings &settings)
    {
        size_t currentPoseId{ 0 };
        bool calibrate{ false };

        std::string calibrationObject;
        while(true)
        {
            std::cout
                << "Enter calibration object you are using, m (for ArUco marker(s)) or c (for Zivid checkerboard): "
                << std::endl;
            calibrationObject = getInput();
            if(calibrationObject == "m" || calibrationObject == "c")
            {
                break;
            }
        }

        std::cout << "Zivid primarily operates with a (4x4) transformation matrix. To convert" << std::endl;
        std::cout << "from axis-angle, rotation vector, roll-pitch-yaw, or quaternion, check out" << std::endl;
        std::cout << "our PoseConversions sample." << std::endl;

        std::vector<Zivid::Calibration::HandEyeInput> handEyeInput;
        do
        {
            switch(enterCommand())
            {
                case CommandType::AddPose:
                {
                    try
                    {
                        handleAddPose(currentPoseId, handEyeInput, camera, calibrationObject, settings);
                    }
                    catch(const std::exception &e)
                    {
                        std::cout << "Error: " << Zivid::toString(e) << std::endl;
                        continue;
                    }
                    break;
                }
                case CommandType::Calibrate:
                {
                    calibrate = true;
                    break;
                }
                case CommandType::Unknown:
                {
                    std::cout << "Error: Unknown command" << std::endl;
                    break;
                }
                default: throw std::runtime_error{ "Unhandled command type" };
            }
        } while(!calibrate);
        return handEyeInput;
    }

    Zivid::Calibration::HandEyeOutput performCalibration(
        const std::vector<Zivid::Calibration::HandEyeInput> &handEyeInput)
    {
        while(true)
        {
            std::cout << "Enter type of calibration, eth (for eye-to-hand) or eih (for eye-in-hand): ";
            const auto calibrationType = getInput();
            if(calibrationType == "eth" || calibrationType == "ETH")
            {
                std::cout << "Performing eye-to-hand calibration with " << handEyeInput.size() << " dataset pairs"
                          << std::endl;
                std::cout << "The resulting transform is the camera pose in robot base frame" << std::endl;
                return Zivid::Calibration::calibrateEyeToHand(handEyeInput);
            }
            if(calibrationType == "eih" || calibrationType == "EIH")
            {
                std::cout << "Performing eye-in-hand calibration with " << handEyeInput.size() << " dataset pairs"
                          << std::endl;
                std::cout << "The resulting transform is the camera pose in flange (end-effector) frame" << std::endl;
                return Zivid::Calibration::calibrateEyeInHand(handEyeInput);
            }
            std::cout << "Entered unknown method" << std::endl;
        }
    }
} // namespace

int main(int argc, char *argv[])
{
    try
    {
        std::string settingsPath;
        bool showHelp = false;

        auto cli =
            (clipp::option("-h", "--help").set(showHelp) % "Show help message",
             clipp::option("--settings-path")
                 & clipp::value("path", settingsPath) % "Path to the camera settings YML file");

        if(!clipp::parse(argc, argv, cli) || showHelp)
        {
            auto fmt = clipp::doc_formatting{}.alternatives_min_split_size(1).surround_labels("\"", "\"");
            std::cout << "USAGE:" << std::endl;
            std::cout << clipp::usage_lines(cli, argv[0], fmt) << std::endl;
            std::cout << "OPTIONS:" << std::endl;
            std::cout << clipp::documentation(cli) << std::endl;
            return showHelp ? EXIT_SUCCESS : EXIT_FAILURE;
        }

        Zivid::Application zivid;

        std::cout << "Connecting to camera" << std::endl;
        auto camera{ zivid.connectCamera() };

        if(settingsPath.empty())
        {
            settingsPath = presetPath(camera);
        }
        const auto settings = Zivid::Settings(settingsPath);

        const auto handEyeInput{ readHandEyeInputs(camera, settings) };

        const auto calibrationResult{ performCalibration(handEyeInput) };

        std::cout << "Zivid primarily operates with a (4x4) transformation matrix. To convert" << std::endl;
        std::cout << "to axis-angle, rotation vector, roll-pitch-yaw, or quaternion, check out" << std::endl;
        std::cout << "our PoseConversions sample." << std::endl;

        if(calibrationResult.valid())
        {
            std::cout << "Hand-Eye calibration OK\n"
                      << "Result:\n"
                      << calibrationResult << std::endl;
        }
        else
        {
            std::cout << "Hand-Eye calibration FAILED" << std::endl;
            return EXIT_FAILURE;
        }
    }
    catch(const std::exception &e)
    {
        std::cerr << "\nError: " << Zivid::toString(e) << std::endl;
        std::cout << "Press enter to exit." << std::endl;
        std::cin.get();
        return EXIT_FAILURE;
    }

    return EXIT_SUCCESS;
}
C#

/*
Perform Hand-Eye calibration.
*/

using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using Zivid.NET.Calibration;
using Duration = Zivid.NET.Duration;

class Program
{
    static int Main(string[] args)
    {
        try
        {
            var userOptions = ParseOptions(args);
            if (userOptions.ShowHelp)
            {
                ShowHelp();
                return 0;
            }

            var zivid = new Zivid.NET.Application();

            Console.WriteLine("Connecting to camera");
            var camera = zivid.ConnectCamera();

            var settingsPath = userOptions.SettingsPath;
            if (string.IsNullOrEmpty(settingsPath))
            {
                settingsPath = PresetPath(camera);
            }
            var settings = new Zivid.NET.Settings(settingsPath);

            var handEyeInput = readHandEyeInputs(camera, settings);

            var calibrationResult = performCalibration(handEyeInput);

            Console.WriteLine("Zivid primarily operates with a (4x4) transformation matrix. To convert");
            Console.WriteLine("to axis-angle, rotation vector, roll-pitch-yaw, or quaternion, check out");
            Console.WriteLine("our PoseConversions sample.");

            if (calibrationResult.Valid())
            {
                Console.WriteLine("{0}\n{1}\n{2}", "Hand-Eye calibration OK", "Result: ", calibrationResult);
            }
            else
            {
                Console.WriteLine("Hand-Eye calibration FAILED");
                return 1;
            }
        }
        catch (Exception ex)
        {
            Console.WriteLine("Error: {0}", ex.ToString());
            return 1;
        }
        return 0;
    }

    static (string SettingsPath, bool ShowHelp) ParseOptions(string[] args)
    {
        string settingsPath = "";
        bool showHelp = false;
        foreach (var arg in args)
        {
            if (arg.StartsWith("--settings-path="))
            {
                settingsPath = arg.Substring("--settings-path=".Length);
            }
            else if (arg.StartsWith("-h") || arg.StartsWith("--help"))
            {
                showHelp = true;
            }
        }

        return (settingsPath, showHelp);
    }

    static List<HandEyeInput> readHandEyeInputs(Zivid.NET.Camera camera, Zivid.NET.Settings settings)
    {
        var handEyeInput = new List<HandEyeInput>();
        var currentPoseId = 0U;
        var beingInput = true;

        var calibrationObject = "";
        while (true)
        {
            Console.WriteLine("Enter calibration object you are using, m (for ArUco marker(s)) or c (for Zivid checkerboard): ");
            calibrationObject = Console.ReadLine();

            if (calibrationObject.Equals("m", StringComparison.CurrentCultureIgnoreCase) ||
                calibrationObject.Equals("c", StringComparison.CurrentCultureIgnoreCase))
            {
                break;
            }
        }


        Interaction.ExtendInputBuffer(2048);

        Console.WriteLine("Zivid primarily operates with a (4x4) transformation matrix. To convert");
        Console.WriteLine("from axis-angle, rotation vector, roll-pitch-yaw, or quaternion, check out");
        Console.WriteLine("our PoseConversions sample.");

        do
        {
            switch (Interaction.EnterCommand())
            {
                case CommandType.AddPose:
                    try
                    {
                        HandleAddPose(ref currentPoseId, ref handEyeInput, camera, calibrationObject, settings);
                    }
                    catch (Exception ex)
                    {
                        Console.WriteLine("Error: {0}", ex.ToString());
                        continue;
                    }
                    break;

                case CommandType.Calibrate: beingInput = false; break;

                case CommandType.Unknown: Console.WriteLine("Error: Unknown command"); break;
            }
        } while (beingInput);
        return handEyeInput;
    }

    public static void HandleAddPose(ref uint currentPoseId, ref List<HandEyeInput> handEyeInput, Zivid.NET.Camera camera, string calibrationObject, Zivid.NET.Settings settings)
    {
        var robotPose = Interaction.EnterRobotPose(currentPoseId);

        Console.Write("Detecting calibration object in point cloud");
        if (calibrationObject.Equals("c", StringComparison.CurrentCultureIgnoreCase))
        {
            using (var frame = camera.Capture2D3D(settings))
            {
                var detectionResult = Detector.DetectCalibrationBoard(frame);

                if (detectionResult.Valid())
                {
                    Console.WriteLine("Calibration board detected");
                    handEyeInput.Add(new HandEyeInput(robotPose, detectionResult));
                    ++currentPoseId;
                }
                else
                {
                    Console.WriteLine("Failed to detect calibration board, ensure that the entire board is in the view of the camera");
                }
            }
        }
        else if (calibrationObject.Equals("m", StringComparison.CurrentCultureIgnoreCase))
        {
            using (var frame = camera.Capture2D3D(settings))
            {
                var markerDictionary = Zivid.NET.MarkerDictionary.Aruco4x4_50;
                var markerIds = new List<int> { 1, 2, 3 };

                Console.WriteLine("Detecting ArUco marker IDs " + string.Join(", ", markerIds));
                var detectionResult = Detector.DetectMarkers(frame, markerIds, markerDictionary);

                if (detectionResult.Valid())
                {
                    Console.WriteLine("ArUco marker(s) detected: " + detectionResult.DetectedMarkers().Length);
                    handEyeInput.Add(new HandEyeInput(robotPose, detectionResult));
                    ++currentPoseId;
                }
                else
                {
                    Console.WriteLine("Failed to detect any ArUco markers, ensure that at least one ArUco marker is in the view of the camera");
                }
            }
        }
    }

    static string PresetPath(Zivid.NET.Camera camera)
    {
        var presetsPath = Environment.GetFolderPath(Environment.SpecialFolder.CommonApplicationData)
            + "/Zivid/Settings/";

        switch (camera.Info.Model)
        {
            case Zivid.NET.CameraInfo.ModelOption.ZividTwo:
                {
                    return presetsPath + "Zivid_Two_M70_ManufacturingSpecular.yml";
                }
            case Zivid.NET.CameraInfo.ModelOption.ZividTwoL100:
                {
                    return presetsPath + "Zivid_Two_L100_ManufacturingSpecular.yml";
                }
            case Zivid.NET.CameraInfo.ModelOption.Zivid2PlusM130:
                {
                    return presetsPath + "Zivid_Two_Plus_M130_ConsumerGoodsQuality.yml";
                }
            case Zivid.NET.CameraInfo.ModelOption.Zivid2PlusM60:
                {
                    return presetsPath + "Zivid_Two_Plus_M60_ConsumerGoodsQuality.yml";
                }
            case Zivid.NET.CameraInfo.ModelOption.Zivid2PlusL110:
                {
                    return presetsPath + "Zivid_Two_Plus_L110_ConsumerGoodsQuality.yml";
                }
            case Zivid.NET.CameraInfo.ModelOption.Zivid2PlusMR130:
                {
                    return presetsPath + "Zivid_Two_Plus_MR130_ConsumerGoodsQuality.yml";
                }
            case Zivid.NET.CameraInfo.ModelOption.Zivid2PlusMR60:
                {
                    return presetsPath + "Zivid_Two_Plus_MR60_ConsumerGoodsQuality.yml";
                }
            case Zivid.NET.CameraInfo.ModelOption.Zivid2PlusLR110:
                {
                    return presetsPath + "Zivid_Two_Plus_LR110_ConsumerGoodsQuality.yml";
                }
            case Zivid.NET.CameraInfo.ModelOption.Zivid3XL250:
                {
                    return presetsPath + "Zivid_Three_XL250_DepalletizationQuality.yml";
                }
            default: throw new System.InvalidOperationException("Unhandled camera model: " + camera.Info.Model.ToString());
        }
    }

    static Zivid.NET.Calibration.HandEyeOutput performCalibration(List<HandEyeInput> handEyeInput)
    {
        while (true)
        {
            Console.WriteLine("Enter type of calibration, eth (for eye-to-hand) or eih (for eye-in-hand): ");
            var calibrationType = Console.ReadLine();
            if (calibrationType.Equals("eth", StringComparison.CurrentCultureIgnoreCase))
            {
                Console.WriteLine("Performing eye-to-hand calibration with " + handEyeInput.Count + " dataset pairs");
                Console.WriteLine("The resulting transform is the camera pose in robot base frame");
                return Calibrator.CalibrateEyeToHand(handEyeInput);
            }
            if (calibrationType.Equals("eih", StringComparison.CurrentCultureIgnoreCase))
            {
                Console.WriteLine("Performing eye-in-hand calibration with " + handEyeInput.Count + " dataset pairs");
                Console.WriteLine("The resulting transform is the camera pose in flange (end-effector) frame");
                return Calibrator.CalibrateEyeInHand(handEyeInput);
            }
            Console.WriteLine("Entered unknown method");
        }
    }

    static void ShowHelp()
    {
        Console.WriteLine("Usage: HandEyeCalibration.exe [options]");
        Console.WriteLine("Options:");
        Console.WriteLine("  --settings-path=<path>   Path to the camera settings YML file (default: based on camera model)");
        Console.WriteLine("  -h, --help               Show this help message");
    }
}

enum CommandType
{
    AddPose,
    Calibrate,
    Unknown
}

class Interaction
{
    // Console.ReadLine only supports reading 256 characters, by default. This limit is modified
    // by calling ExtendInputBuffer with the maximum length of characters to be read.
    public static void ExtendInputBuffer(int size)
    {
        Console.SetIn(new StreamReader(Console.OpenStandardInput(), Console.InputEncoding, false, size));
    }

    public static CommandType EnterCommand()
    {
        Console.Write("Enter command, p (to add robot pose) or c (to perform calibration): ");
        var command = Console.ReadLine().ToLower();

        switch (command)
        {
            case "p": return CommandType.AddPose;
            case "c": return CommandType.Calibrate;
            default: return CommandType.Unknown;
        }
    }

    public static Pose EnterRobotPose(ulong index)
    {
        var elementCount = 16;
        Console.WriteLine(
            "Enter pose with id (a line with {0} space separated values describing 4x4 row-major matrix) : {1}",
            elementCount,
            index);
        var input = Console.ReadLine();

        var elements = input.Split().Where(x => !string.IsNullOrEmpty(x.Trim())).Select(x => float.Parse(x)).ToArray();

        var robotPose = new Pose(elements); Console.WriteLine("The following pose was entered: \n{0}", robotPose);
        return robotPose;
    }
}
Python

"""
Perform Hand-Eye calibration.

"""

import argparse
from pathlib import Path
from typing import List, Tuple

import numpy as np
import zivid
from zividsamples.paths import get_sample_data_path
from zividsamples.save_load_matrix import assert_affine_matrix_and_save


def _options() -> argparse.Namespace:
    """Function to read user arguments.

    Returns:
        Arguments from user

    """
    parser = argparse.ArgumentParser(description=__doc__)

    parser.add_argument(
        "--settings-path",
        required=False,
        type=Path,
        help="Path to the camera settings YML file",
    )

    return parser.parse_args()


def _preset_path(camera: zivid.Camera) -> Path:
    """Get path to preset settings YML file, depending on camera model.

    Args:
        camera: Zivid camera

    Raises:
        ValueError: If unsupported camera model for this code sample

    Returns:
        Path: Zivid 2D and 3D settings YML path

    """
    presets_path = get_sample_data_path() / "Settings"

    if camera.info.model == zivid.CameraInfo.Model.zivid3XL250:
        return presets_path / "Zivid_Three_XL250_DepalletizationQuality.yml"
    if camera.info.model == zivid.CameraInfo.Model.zivid2PlusMR60:
        return presets_path / "Zivid_Two_Plus_MR60_ConsumerGoodsQuality.yml"
    if camera.info.model == zivid.CameraInfo.Model.zivid2PlusMR130:
        return presets_path / "Zivid_Two_Plus_MR130_ConsumerGoodsQuality.yml"
    if camera.info.model == zivid.CameraInfo.Model.zivid2PlusLR110:
        return presets_path / "Zivid_Two_Plus_LR110_ConsumerGoodsQuality.yml"
    if camera.info.model == zivid.CameraInfo.Model.zivid2PlusM60:
        return presets_path / "Zivid_Two_Plus_M60_ConsumerGoodsQuality.yml"
    if camera.info.model == zivid.CameraInfo.Model.zivid2PlusM130:
        return presets_path / "Zivid_Two_Plus_M130_ConsumerGoodsQuality.yml"
    if camera.info.model == zivid.CameraInfo.Model.zivid2PlusL110:
        return presets_path / "Zivid_Two_Plus_L110_ConsumerGoodsQuality.yml"
    if camera.info.model == zivid.CameraInfo.Model.zividTwo:
        return presets_path / "Zivid_Two_M70_ManufacturingSpecular.yml"
    if camera.info.model == zivid.CameraInfo.Model.zividTwoL100:
        return presets_path / "Zivid_Two_L100_ManufacturingSpecular.yml"

    raise ValueError("Invalid camera model")


def _enter_robot_pose(index: int) -> zivid.calibration.Pose:
    """Robot pose user input.

    Args:
        index: Robot pose ID

    Returns:
        robot_pose: Robot pose

    """
    inputted = input(
        f"Enter pose with id={index} (a line with 16 space separated values describing 4x4 row-major matrix): "
    )
    elements = inputted.split(maxsplit=15)
    data = np.array(elements, dtype=np.float64).reshape((4, 4))
    robot_pose = zivid.calibration.Pose(data)
    print(f"The following pose was entered:\n{robot_pose}")
    return robot_pose


def _perform_calibration(hand_eye_input: List[zivid.calibration.HandEyeInput]) -> zivid.calibration.HandEyeOutput:
    """Hand-Eye calibration type user input.

    Args:
        hand_eye_input: Hand-Eye calibration input

    Returns:
        hand_eye_output: Hand-Eye calibration result

    """
    while True:
        calibration_type = input("Enter type of calibration, eth (for eye-to-hand) or eih (for eye-in-hand): ").strip()
        if calibration_type.lower() == "eth":
            print(f"Performing eye-to-hand calibration with {len(hand_eye_input)} dataset pairs")
            print("The resulting transform is the camera pose in robot base frame")
            hand_eye_output = zivid.calibration.calibrate_eye_to_hand(hand_eye_input)
            return hand_eye_output
        if calibration_type.lower() == "eih":
            print(f"Performing eye-in-hand calibration with {len(hand_eye_input)} dataset pairs")
            print("The resulting transform is the camera pose in flange (end-effector) frame")
            hand_eye_output = zivid.calibration.calibrate_eye_in_hand(hand_eye_input)
            return hand_eye_output
        print(f"Unknown calibration type: '{calibration_type}'")


def _handle_add_pose(
    current_pose_id: int,
    hand_eye_input: List,
    camera: zivid.Camera,
    calibration_object: str,
    settings: zivid.Settings,
) -> Tuple[int, List]:
    """Acquire frame and keeps track of the robot's pose id.

    Args:
        current_pose_id: Counter of the current pose in the hand-eye calibration dataset
        hand_eye_input: List of hand-eye calibration dataset pairs (poses and point clouds)
        camera: Zivid camera
        calibration_object: m (for ArUco marker(s)) or c (for Zivid checkerboard)
        settings: Zivid camera settings

    Returns:
        Tuple[int, List]: Updated current_pose_id and hand_eye_input

    """

    robot_pose = _enter_robot_pose(current_pose_id)

    print("Detecting calibration object in point cloud")

    if calibration_object == "c":

        frame = zivid.calibration.capture_calibration_board(camera)
        detection_result = zivid.calibration.detect_calibration_board(frame)

        if detection_result.valid():
            print("Calibration board detected")
            hand_eye_input.append(zivid.calibration.HandEyeInput(robot_pose, detection_result))
            current_pose_id += 1
        else:
            print(f"Failed to detect calibration board. {detection_result.status_description()}")
    elif calibration_object == "m":
        frame = camera.capture_2d_3d(settings)

        marker_dictionary = zivid.calibration.MarkerDictionary.aruco4x4_50
        marker_ids = [1, 2, 3]

        print(f"Detecting arUco marker IDs {marker_ids} from the dictionary {marker_dictionary}")
        detection_result = zivid.calibration.detect_markers(frame, marker_ids, marker_dictionary)

        if detection_result.valid():
            print(f"ArUco marker(s) detected: {len(detection_result.detected_markers())}")
            hand_eye_input.append(zivid.calibration.HandEyeInput(robot_pose, detection_result))
            current_pose_id += 1
        else:
            print(
                "Failed to detect any ArUco markers, ensure that at least one ArUco marker is in the view of the camera"
            )
    return current_pose_id, hand_eye_input


def _main() -> None:
    user_options = _options()
    app = zivid.Application()

    print("Connecting to camera")
    camera = app.connect_camera()

    if user_options.settings_path is None:
        user_options.settings_path = _preset_path(camera)
    settings = zivid.Settings.load(user_options.settings_path)

    current_pose_id = 0
    hand_eye_input = []
    calibrate = False
    while True:
        calibration_object = input(
            "Enter calibration object you are using, m (for ArUco marker(s)) or c (for Zivid checkerboard): "
        ).strip()
        if calibration_object.lower() == "m" or calibration_object.lower() == "c":
            break

    print(
        "Zivid primarily operates with a (4x4) transformation matrix. To convert\n"
        "from axis-angle, rotation vector, roll-pitch-yaw, or quaternion, check out\n"
        "our pose_conversions sample."
    )

    while not calibrate:
        command = input("Enter command, p (to add robot pose) or c (to perform calibration): ").strip()
        if command == "p":
            try:
                current_pose_id, hand_eye_input = _handle_add_pose(
                    current_pose_id, hand_eye_input, camera, calibration_object, settings
                )
            except ValueError as ex:
                print(ex)
        elif command == "c":
            calibrate = True
        else:
            print(f"Unknown command '{command}'")

    calibration_result = _perform_calibration(hand_eye_input)
    transform = calibration_result.transform()
    transform_file_path = Path(__file__).parent / "transform.yaml"
    assert_affine_matrix_and_save(transform, transform_file_path)

    print(
        "Zivid primarily operates with a (4x4) transformation matrix. To convert\n"
        "to axis-angle, rotation vector, roll-pitch-yaw, or quaternion, check out\n"
        "our pose_conversions sample."
    )

    if calibration_result.valid():
        print("Hand-Eye calibration OK")
        print(f"Result:\n{calibration_result}")
    else:
        print("Hand-Eye calibration FAILED")


if __name__ == "__main__":
    _main()

Note

Zivid point clouds are given in millimeters (mm); therefore, ensure that robot input poses for hand-eye calibration also use mm for translation.
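For example, if a robot controller reports translation in meters, the pose must be scaled before it is used as hand-eye input. A minimal sketch using NumPy (the pose values here are illustrative, not from a real robot):

```python
import numpy as np

# Illustrative 4x4 robot pose with translation in meters.
robot_pose_m = np.array(
    [
        [1.0, 0.0, 0.0, 0.5],
        [0.0, 1.0, 0.0, -0.2],
        [0.0, 0.0, 1.0, 0.7],
        [0.0, 0.0, 0.0, 1.0],
    ]
)

# Scale only the translation column; the rotation block is unitless.
robot_pose_mm = robot_pose_m.copy()
robot_pose_mm[:3, 3] *= 1000.0
```

The rotation block is left untouched because rotation matrices carry no length unit; only the translation vector needs conversion.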

The following section is especially useful for handling different orientation representations used by various robot controllers, such as Euler angles, quaternions, or axis-angle.

Helping Tool: Pose Conversions

Zivid primarily operates with a (4x4) Transformation Matrix (Rotation Matrix + Translation Vector). Robot controllers and software typically represent poses as a combination of a position vector and an orientation representation in one of many formats. The Pose Conversions tools help to convert to and from:

  • Axis-Angle

  • Rotation Vector

  • Roll-Pitch-Yaw

  • Euler Angles

  • Quaternion

We offer a couple of tools for robot pose conversions:
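As a quick illustration of the kind of conversion these tools perform, a quaternion and a translation vector can be combined into a 4x4 transformation matrix with SciPy (the pose values below are illustrative):

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

# Illustrative robot pose: quaternion (x, y, z, w) plus translation in mm.
quaternion = np.array([0.0, 0.0, 0.7071068, 0.7071068])  # 90 deg about z
translation_mm = np.array([100.0, 0.0, 50.0])

# Build the 4x4 transformation matrix (rotation block + translation column).
transform = np.eye(4)
transform[:3, :3] = R.from_quat(quaternion).as_matrix()
transform[:3, 3] = translation_mm

# And back again: recover the quaternion from the rotation block.
recovered = R.from_matrix(transform[:3, :3]).as_quat()
```

The full code samples below implement the same round trips, including roll-pitch-yaw in several rotation conventions.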

Code Sample

C++

/*
Convert to/from Transformation Matrix (Rotation Matrix + Translation Vector)

Zivid primarily operates with a (4x4) transformation matrix. This example shows how to use Eigen to
convert to and from: AxisAngle, Rotation Vector, Roll-Pitch-Yaw, Quaternion

The convenience functions from this example can be reused in applicable applications. The YAML files for this sample can
be found under the main instructions for Zivid samples.
*/

#include <Zivid/Zivid.h>

#include <Eigen/Core>
#include <Eigen/Dense>
#include <Eigen/Geometry>

#include <iomanip>
#include <iostream>
#include <stdexcept>

namespace
{
    enum class RotationConvention
    {
        zyxIntrinsic,
        xyzExtrinsic,
        xyzIntrinsic,
        zyxExtrinsic,
        nofROT
    };
    constexpr size_t nofRotationConventions = static_cast<size_t>(RotationConvention::nofROT);

    struct RollPitchYaw
    {
        RotationConvention convention;
        Eigen::Array3f rollPitchYaw;
    };

    std::string toString(RotationConvention convention)
    {
        switch(convention)
        {
            case RotationConvention::xyzIntrinsic: return "xyzIntrinsic";
            case RotationConvention::xyzExtrinsic: return "xyzExtrinsic";
            case RotationConvention::zyxIntrinsic: return "zyxIntrinsic";
            case RotationConvention::zyxExtrinsic: return "zyxExtrinsic";
            case RotationConvention::nofROT: break;
            default: throw std::runtime_error{ "Unhandled rotation convention" };
        }

        throw std::invalid_argument("Invalid RotationConvention");
    }

    // The following function converts Roll-Pitch-Yaw angles in radians to Rotation Matrix.
    // This function takes an array of angles, and a rotation convention, as input parameters.
    // Whether the axes are moving (intrinsic) or fixed (extrinsic) is defined by the rotation convention.
    // The order of elements in the output array follow the convention, i.e., for ZYX and zyx, the first
    // element of the array is rotation around z-axis, for XYZ and xyz, the first element is rotation
    // around x-axis.
    Eigen::Matrix3f rollPitchYawToRotationMatrix(const Eigen::Array3f &rollPitchYaw, const RotationConvention &rotation)
    {
        switch(rotation)
        {
            case RotationConvention::xyzIntrinsic:
                return (Eigen::AngleAxisf(rollPitchYaw[0], Eigen::Vector3f::UnitX())
                        * Eigen::AngleAxisf(rollPitchYaw[1], Eigen::Vector3f::UnitY())
                        * Eigen::AngleAxisf(rollPitchYaw[2], Eigen::Vector3f::UnitZ()))
                    .matrix();
            case RotationConvention::zyxExtrinsic:
                return (Eigen::AngleAxisf(rollPitchYaw[2], Eigen::Vector3f::UnitX())
                        * Eigen::AngleAxisf(rollPitchYaw[1], Eigen::Vector3f::UnitY())
                        * Eigen::AngleAxisf(rollPitchYaw[0], Eigen::Vector3f::UnitZ()))
                    .matrix();
            case RotationConvention::zyxIntrinsic:
                return (Eigen::AngleAxisf(rollPitchYaw[0], Eigen::Vector3f::UnitZ())
                        * Eigen::AngleAxisf(rollPitchYaw[1], Eigen::Vector3f::UnitY())
                        * Eigen::AngleAxisf(rollPitchYaw[2], Eigen::Vector3f::UnitX()))
                    .matrix();
            case RotationConvention::xyzExtrinsic:
                return (Eigen::AngleAxisf(rollPitchYaw[2], Eigen::Vector3f::UnitZ())
                        * Eigen::AngleAxisf(rollPitchYaw[1], Eigen::Vector3f::UnitY())
                        * Eigen::AngleAxisf(rollPitchYaw[0], Eigen::Vector3f::UnitX()))
                    .matrix();
            case RotationConvention::nofROT: break;
            default: throw std::runtime_error{ "Unhandled rotation convention" };
        }

        throw std::invalid_argument("Invalid orientation");
    }

    void rollPitchYawListToRotationMatrix(const std::vector<RollPitchYaw> &rpyList)
    {
        Eigen::IOFormat matrixFmt(4, 0, ", ", "\n", "[", "]", "[", "]");
        for(const auto &rotation : rpyList)
        {
            std::cout << "Rotation Matrix from Roll-Pitch-Yaw angles (" << toString(rotation.convention)
                      << "):" << std::endl;
            const auto rotationMatrixFromRollPitchYaw =
                rollPitchYawToRotationMatrix(rotation.rollPitchYaw, rotation.convention);
            std::cout << rotationMatrixFromRollPitchYaw.format(matrixFmt) << std::endl;
        }
    }

    // The following function converts Rotation Matrix to Roll-Pitch-Yaw angles in radians.
    // Whether the axes are moving (intrinsic) or fixed (extrinsic) is defined by the rotation convention.
    // The order of elements in the output array follow the convention, i.e., for ZYX and zyx, the first
    // element of the array is rotation around z-axis, for XYZ and xyz, the first element is rotation
    // around x-axis.
    // Eigen only supports intrinsic Euler systems for simplicity. If you want to use extrinsic Euler systems,
    // Eigen recommends using the equivalent intrinsic system with the opposite order of axes and angles,
    // i.e., axes (A,B,C) become (C,B,A), and angles (a,b,c) become (c,b,a).
    Eigen::Array3f rotationMatrixToRollPitchYaw(
        const Eigen::Matrix3f &rotationMatrix,
        const RotationConvention &convention)
    {
        switch(convention)
        {
            case RotationConvention::zyxExtrinsic: return rotationMatrix.eulerAngles(0, 1, 2).reverse();
            case RotationConvention::xyzIntrinsic: return rotationMatrix.eulerAngles(0, 1, 2);
            case RotationConvention::xyzExtrinsic: return rotationMatrix.eulerAngles(2, 1, 0).reverse();
            case RotationConvention::zyxIntrinsic: return rotationMatrix.eulerAngles(2, 1, 0);
            case RotationConvention::nofROT: break;
            default: throw std::runtime_error{ "Unhandled rotation convention" };
        }

        throw std::invalid_argument("Invalid rotation");
    }

    std::vector<RollPitchYaw> rotationMatrixToRollPitchYawList(const Eigen::Matrix3f &rotationMatrix)
    {
        const Eigen::IOFormat vectorFmt(4, 0, ", ", "", "", "", "[", "]");
        std::vector<RollPitchYaw> rpyList;
        for(size_t i = 0; i < nofRotationConventions; i++)
        {
            RotationConvention convention{ static_cast<RotationConvention>(i) };
            std::cout << "Roll-Pitch-Yaw angles (" << toString(convention) << "):" << std::endl;
            rpyList.push_back({ convention, rotationMatrixToRollPitchYaw(rotationMatrix, convention) });
            std::cout << rpyList[i].rollPitchYaw.format(vectorFmt) << std::endl;
        }
        return rpyList;
    }

    Eigen::Vector3f rotationMatrixToRotationVector(const Eigen::Matrix3f &rotationMatrix)
    {
        const Eigen::AngleAxisf axisAngle(rotationMatrix);
        return axisAngle.angle() * axisAngle.axis();
    }

    Eigen::Matrix3f rotationVectorToRotationMatrix(const Eigen::Vector3f &rotationVector)
    {
        Eigen::AngleAxisf axisAngle(rotationVector.norm(), rotationVector.normalized());
        return axisAngle.toRotationMatrix();
    }

    void printHeader(const std::string &txt)
    {
        const std::string asterixLine = "****************************************************************";
        std::cout << asterixLine << "\n* " << txt << std::endl << asterixLine << std::endl;
    }

    Eigen::Affine3f zividToEigen(const Zivid::Matrix4x4 &zividMatrix)
    {
        Eigen::Matrix4f eigenMatrix;
        for(std::size_t row = 0; row < Zivid::Matrix4x4::rows; row++)
        {
            for(std::size_t column = 0; column < Zivid::Matrix4x4::cols; column++)
            {
                eigenMatrix(row, column) = zividMatrix(row, column);
            }
        }
        Eigen::Affine3f eigenTransform{ eigenMatrix };
        return eigenTransform;
    }

    Zivid::Matrix4x4 eigenToZivid(const Eigen::Affine3f &eigenTransform)
    {
        Eigen::Matrix4f eigenMatrix = eigenTransform.matrix();
        Zivid::Matrix4x4 zividMatrix;
        for(Eigen::Index row = 0; row < eigenMatrix.rows(); row++)
        {
            for(Eigen::Index column = 0; column < eigenMatrix.cols(); column++)
            {
                zividMatrix(row, column) = eigenMatrix(row, column);
            }
        }
        return zividMatrix;
    }
} // namespace

int main()
{
    try
    {
        Zivid::Application zivid;

        std::cout << std::setprecision(4);
        Eigen::IOFormat matrixFormatRules(4, 0, ", ", "\n", "[", "]", "[", "]");
        Eigen::IOFormat vectorFormatRules(4, 0, ", ", "", "", "", "[", "]");
        printHeader("This example shows conversions to/from Transformation Matrix");

        Zivid::Matrix4x4 transformationMatrixZivid(std::string(ZIVID_SAMPLE_DATA_DIR) + "/RobotTransform.yaml");
        const Eigen::Affine3f transformationMatrix = zividToEigen(transformationMatrixZivid);
        std::cout << transformationMatrix.matrix().format(matrixFormatRules) << std::endl;

        // Extract Rotation Matrix and Translation Vector from Transformation Matrix
        const Eigen::Matrix3f rotationMatrix = transformationMatrix.linear();
        const Eigen::Vector3f translationVector = transformationMatrix.translation();
        std::cout << "RotationMatrix:\n" << rotationMatrix.format(matrixFormatRules) << std::endl;
        std::cout << "TranslationVector:\n" << translationVector.format(vectorFormatRules) << std::endl;

        /*
         * Convert from Rotation Matrix (Zivid) to other representations of orientation (Robot)
         */
        printHeader("Convert from Zivid (Rotation Matrix) to Robot");
        const Eigen::AngleAxisf axisAngle(rotationMatrix);
        std::cout << "AxisAngle:\n"
                  << axisAngle.axis().format(vectorFormatRules) << ", " << axisAngle.angle() << std::endl;
        const Eigen::Vector3f rotationVector = rotationMatrixToRotationVector(rotationMatrix);
        std::cout << "Rotation Vector:\n" << rotationVector.format(vectorFormatRules) << std::endl;
        const Eigen::Quaternionf quaternion(rotationMatrix);
        std::cout << "Quaternion:\n" << quaternion.coeffs().format(vectorFormatRules) << std::endl;
        const auto rpyList = rotationMatrixToRollPitchYawList(rotationMatrix);

        /*
         * Convert to Rotation Matrix (Zivid) from other representations of orientation (Robot)
         */
        printHeader("Convert from Robot to Zivid (Rotation Matrix)");
        const Eigen::Matrix3f rotationMatrixFromAxisAngle = axisAngle.toRotationMatrix();
        std::cout << "Rotation Matrix from Axis Angle:\n"
                  << rotationMatrixFromAxisAngle.format(matrixFormatRules) << std::endl;
        const Eigen::Matrix3f rotationMatrixFromRotationVector = rotationVectorToRotationMatrix(rotationVector);
        std::cout << "Rotation Matrix from Rotation Vector:\n"
                  << rotationMatrixFromRotationVector.format(matrixFormatRules) << std::endl;
        const Eigen::Matrix3f rotationMatrixFromQuaternion = quaternion.toRotationMatrix();
        std::cout << "Rotation Matrix from Quaternion:\n"
                  << rotationMatrixFromQuaternion.format(matrixFormatRules) << std::endl;
        rollPitchYawListToRotationMatrix(rpyList);

        // Combine Rotation Matrix with Translation Vector to form Transformation Matrix
        Eigen::Affine3f transformationMatrixFromQuaternion(rotationMatrixFromQuaternion);
        transformationMatrixFromQuaternion.translation() = translationVector;
        Zivid::Matrix4x4 transformationMatrixFromQuaternionZivid = eigenToZivid(transformationMatrixFromQuaternion);
        transformationMatrixFromQuaternionZivid.save("RobotTransformOut.yaml");
    }

    catch(const std::exception &e)
    {
        std::cerr << "Error: " << e.what() << std::endl;
        std::cout << "Press enter to exit." << std::endl;
        std::cin.get();
        return EXIT_FAILURE;
    }

    return EXIT_SUCCESS;
}
C#

/*
Convert to/from Transformation Matrix (Rotation Matrix + Translation Vector)

Zivid primarily operates with a (4x4) transformation matrix. This example implements functions to convert to and from:
AxisAngle, Rotation Vector, Roll-Pitch-Yaw, Quaternion.

The convenience functions from this example can be reused in applicable applications. The YAML files for this sample
can be found under the main instructions for Zivid samples.
*/

using MathNet.Numerics.LinearAlgebra;
using MathNet.Numerics.LinearAlgebra.Double;
using System;

class Program
{
    static void Main()
    {
        try
        {
            var zivid = new Zivid.NET.Application();

            printHeader("This example shows conversions to/from Transformation Matrix");

            var transformationMatrixZivid = new Zivid.NET.Matrix4x4(Environment.GetFolderPath(Environment.SpecialFolder.CommonApplicationData) + "/Zivid/RobotTransform.yaml");
            var transformationMatrix = zividToMathDotNet(transformationMatrixZivid);
            Console.WriteLine(matrixToString(transformationMatrix));

            // Extract Rotation Matrix and Translation Vector from Transformation Matrix
            var rotationMatrix = transformationMatrix.SubMatrix(0, 3, 0, 3);
            var translationVector = transformationMatrix.SubMatrix(0, 3, 3, 1);
            Console.WriteLine("RotationMatrix:\n" + matrixToString(rotationMatrix));
            Console.WriteLine("TranslationVector:\n" + matrixToString(translationVector.Transpose()));

            /*
             * Convert from Rotation Matrix (Zivid) to other representations of orientation (Robot)
             */
            printHeader("Convert from Zivid (Rotation Matrix) to Robot");
            var axisAngle = rotationMatrixToAxisAngle(rotationMatrix);
            Console.WriteLine("AxisAngle:\n" + matrixToString(axisAngle.Axis.Transpose()) + ", " + String.Format(" {0:G4} ", axisAngle.Angle));
            var rotationVector = axisAngle.Axis * axisAngle.Angle;
            Console.WriteLine("Rotation Vector:\n" + matrixToString(rotationVector.Transpose()));
            var quaternion = rotationMatrixToQuaternion(rotationMatrix);
            Console.WriteLine("Quaternion:\n" + matrixToString(quaternion.Transpose()));
            var rpyList = rotationMatrixToRollPitchYawList(rotationMatrix);

            /*
             * Convert to Rotation Matrix (Zivid) from other representations of orientation (Robot)
             */
            printHeader("Convert from Robot to Zivid (Rotation matrix)");
            var rotationMatrixFromAxisAngle = axisAngleToRotationMatrix(axisAngle);
            Console.WriteLine("Rotation Matrix from Axis Angle:\n" + matrixToString(rotationMatrixFromAxisAngle));
            var rotationMatrixFromRotationVector = rotationVectorToRotationMatrix(rotationVector);
            Console.WriteLine("Rotation Matrix from Rotation Vector:\n" + matrixToString(rotationMatrixFromRotationVector));
            var rotationMatrixFromQuaternion = quaternionToRotationMatrix(quaternion);
            Console.WriteLine("Rotation Matrix from Quaternion:\n" + matrixToString(rotationMatrixFromQuaternion));
            rollPitchYawListToRotationMatrix(rpyList);

            // Combine Rotation Matrix with Translation Vector to form Transformation Matrix
            var transformationMatrixFromQuaternion = Matrix<double>.Build.Dense(4, 4);
            transformationMatrixFromQuaternion.SetSubMatrix(0, 3, 0, 3, rotationMatrixFromQuaternion);
            transformationMatrixFromQuaternion.SetSubMatrix(0, 3, 3, 1, translationVector);
            var transformationMatrixFromQuaternionZivid = mathDotNetToZivid(transformationMatrixFromQuaternion);
            transformationMatrixFromQuaternionZivid.Save("RobotTransformOut.yaml");
        }
        catch (Exception ex)
        {
            Console.WriteLine("Error: {0}", ex.ToString());
            Environment.ExitCode = 1;
        }
    }

    enum RotationConvention
    {
        zyxIntrinsic,
        xyzExtrinsic,
        xyzIntrinsic,
        zyxExtrinsic
    };

    public class AxisAngle
    {
        public AxisAngle(Matrix<double> axis, double angle)
        {
            this.Axis = axis;
            this.Angle = angle;
        }

        public Matrix<double> Axis { get; set; }

        public double Angle { get; set; }
    }

    static Matrix<double> skew(Matrix<double> vector)
    // Assumes vector to be [3x1]
    {
        return CreateMatrix.DenseOfArray<double>(new double[,]
        {
            { 0, -vector[2,0], vector[1,0] },
            { vector[2,0], 0, -vector[0,0] },
            { -vector[1,0], vector[0,0], 0 }
        });
    }

    static AxisAngle rotationMatrixToAxisAngle(Matrix<double> rotationMatrix)
    {
        // See Rodrigues' formula or skew-symmetric method
        var skewSymmetricMatrix = (rotationMatrix - rotationMatrix.Transpose()) / 2;
        Matrix<double> skewElements = CreateMatrix.DenseOfArray<double>(new double[,] { { skewSymmetricMatrix[2, 1] }, { skewSymmetricMatrix[0, 2] }, { skewSymmetricMatrix[1, 0] } });
        var skewNorm = skewElements.Column(0, 0, 3).L2Norm();
        Matrix<double> u = skewElements / skewNorm;
        var theta = Math.Atan2(skewNorm, (rotationMatrix.Trace() - 1) / 2);

        return new AxisAngle(u, theta);
    }

    static Matrix<double> axisAngleToRotationMatrix(AxisAngle axisAngle)
    {
        //See Rodrigues' formula or skew-symmetric method
        var u = axisAngle.Axis;
        var firstTerm = CreateMatrix.DenseIdentity<double>(3) * Math.Cos(axisAngle.Angle);
        var secondTerm = u.Multiply(u.Transpose()) * (1 - Math.Cos(axisAngle.Angle));
        var thirdTerm = skew(u) * Math.Sin(axisAngle.Angle);
        return firstTerm + secondTerm + thirdTerm;
    }

    static Matrix<double> rotationVectorToRotationMatrix(Matrix<double> rotationVector)
    {
        double theta = rotationVector.L2Norm();
        return axisAngleToRotationMatrix(new AxisAngle(rotationVector / theta, theta));
    }

    static Matrix<double> rotationMatrixToQuaternion(Matrix<double> rotationMatrix)
    {
        var qw = (Math.Sqrt(1 + rotationMatrix[0, 0] + rotationMatrix[1, 1] + rotationMatrix[2, 2])) / 2;
        Matrix<double> quaternion = CreateMatrix.DenseOfArray<double>(new double[,]
        {
            { (rotationMatrix[2,1] - rotationMatrix[1,2])/(4*qw) },
            { (rotationMatrix[0,2] - rotationMatrix[2,0])/(4*qw)  },
            { (rotationMatrix[1,0] - rotationMatrix[0,1])/(4*qw) },
            { qw } // REAL PART
        });

        return quaternion;
    }

    static Matrix<double> quaternionToRotationMatrix(Matrix<double> quaternion)
    {
        // Normalize quaternion
        var nQ = quaternion / quaternion.L2Norm();
        var firstTerm = CreateMatrix.DenseIdentity<double>(3);
        var secondTerm = 2 * skew(nQ.SubMatrix(0, 3, 0, 1)) * skew(nQ.SubMatrix(0, 3, 0, 1));
        var thirdTerm = 2 * nQ[3, 0] * skew(nQ.SubMatrix(0, 3, 0, 1));

        return firstTerm + secondTerm + thirdTerm;
    }

    static Matrix createRotationMatrix(string axis, double angle)
    {
        var matrix = DenseMatrix.CreateIdentity(3);
        var cosAngle = Math.Cos(angle);
        var sinAngle = Math.Sin(angle);
        switch (axis)
        {
            case ("x"):
                matrix[1, 1] = cosAngle;
                matrix[2, 2] = cosAngle;
                matrix[1, 2] = -sinAngle;
                matrix[2, 1] = sinAngle;
                break;
            case ("y"):
                matrix[0, 0] = cosAngle;
                matrix[2, 2] = cosAngle;
                matrix[0, 2] = sinAngle;
                matrix[2, 0] = -sinAngle;
                break;
            case ("z"):
                matrix[0, 0] = cosAngle;
                matrix[1, 1] = cosAngle;
                matrix[0, 1] = -sinAngle;
                matrix[1, 0] = sinAngle;
                break;
            default: throw new Exception("Wrong axis; options are x, y, z.");
        }
        return matrix;
    }

    static Matrix createXRotation(double angle)
    {
        return createRotationMatrix("x", angle);
    }

    static Matrix createYRotation(double angle)
    {
        return createRotationMatrix("y", angle);
    }

    static Matrix createZRotation(double angle)
    {
        return createRotationMatrix("z", angle);
    }

    static Vector<double> rotationMatrixToRollPitchYaw(Matrix<double> rotationMatrix, RotationConvention convention)
    {
        double roll = -1;
        double pitch = -1;
        double yaw = -1;

        switch (convention)
        {
            case RotationConvention.zyxExtrinsic:
            case RotationConvention.xyzIntrinsic:
                if (rotationMatrix[0, 2] < 1)
                {
                    if (rotationMatrix[0, 2] > -1)
                    {
                        roll = Math.Atan2(-rotationMatrix[1, 2], rotationMatrix[2, 2]);
                        pitch = Math.Asin(rotationMatrix[0, 2]);
                        yaw = Math.Atan2(-rotationMatrix[0, 1], rotationMatrix[0, 0]);
                    }
                    else // R02 = −1
                    {
                        // Not a unique solution: yaw − roll = atan2(R01, R11)
                        roll = -Math.Atan2(rotationMatrix[1, 0], rotationMatrix[1, 1]);
                        pitch = -Math.PI / 2;
                        yaw = 0;
                    }

                }
                else // R02 = +1
                {
                    // Not a unique solution: yaw + roll = atan2(R10, R11)
                    roll = Math.Atan2(rotationMatrix[1, 0], rotationMatrix[1, 1]);
                    pitch = Math.PI / 2;
                    yaw = 0;
                }
                break;
            case RotationConvention.xyzExtrinsic:
            case RotationConvention.zyxIntrinsic:
                if (rotationMatrix[2, 0] < 1)
                {
                    if (rotationMatrix[2, 0] > -1)
                    {
                        roll = Math.Atan2(rotationMatrix[1, 0], rotationMatrix[0, 0]);
                        pitch = Math.Asin(-rotationMatrix[2, 0]);
                        yaw = Math.Atan2(rotationMatrix[2, 1], rotationMatrix[2, 2]);
                    }
                    else // R20 = −1
                    {
                        // Not a unique solution: yaw − roll = atan2(−R12, R11)
                        roll = -Math.Atan2(rotationMatrix[1, 2], rotationMatrix[1, 1]);
                        pitch = Math.PI / 2;
                        yaw = 0;
                    }

                }
                else // R20 = +1
                {
                    // Not a unique solution: yaw + roll = atan2(−R12, R11)
                    roll = Math.Atan2(rotationMatrix[1, 2], rotationMatrix[1, 1]);
                    pitch = -Math.PI / 2;
                    yaw = 0;
                }
                break;
        }
        switch (convention)
        {
            case RotationConvention.zyxExtrinsic:
            case RotationConvention.xyzExtrinsic:
                return CreateVector.DenseOfArray<double>(new double[] { yaw, pitch, roll });
            case RotationConvention.xyzIntrinsic:
            case RotationConvention.zyxIntrinsic:
                return CreateVector.DenseOfArray<double>(new double[] { roll, pitch, yaw });
        }
        throw new ArgumentException("Invalid RotationConvention");
    }

    static Vector<double>[] rotationMatrixToRollPitchYawList(Matrix<double> rotationMatrix)
    {
        Vector<double>[] rpyList = new Vector<double>[Enum.GetValues(typeof(RotationConvention)).Length];
        foreach (int i in Enum.GetValues(typeof(RotationConvention)))
        {
            RotationConvention convention = (RotationConvention)i;
            Console.WriteLine("Roll-Pitch-Yaw angles (" + convention + "):");
            rpyList[i] = rotationMatrixToRollPitchYaw(rotationMatrix, convention);
            Console.WriteLine(vectorToString(rpyList[i]));
        }
        return rpyList;
    }

    static Matrix<double> rollPitchYawToRotationMatrix(Vector<double> rollPitchYaw, RotationConvention convention)
    {
        switch (convention)
        {
            case RotationConvention.xyzIntrinsic:
                return createXRotation(rollPitchYaw[0]) * createYRotation(rollPitchYaw[1]) * createZRotation(rollPitchYaw[2]);
            case RotationConvention.zyxIntrinsic:
                return createZRotation(rollPitchYaw[0]) * createYRotation(rollPitchYaw[1]) * createXRotation(rollPitchYaw[2]);
            case RotationConvention.zyxExtrinsic:
                return createXRotation(rollPitchYaw[2]) * createYRotation(rollPitchYaw[1]) * createZRotation(rollPitchYaw[0]);
            case RotationConvention.xyzExtrinsic:
                return createZRotation(rollPitchYaw[2]) * createYRotation(rollPitchYaw[1]) * createXRotation(rollPitchYaw[0]);
            default: throw new ArgumentException("Invalid RotationConvention");
        }
    }

    static void rollPitchYawListToRotationMatrix(Vector<double>[] rpyList)
    {
        foreach (int i in Enum.GetValues(typeof(RotationConvention)))
        {
            RotationConvention convention = (RotationConvention)i;
            Console.WriteLine("Rotation Matrix from Roll-Pitch-Yaw angles (" + convention + "):");
            Console.WriteLine(matrixToString(rollPitchYawToRotationMatrix(rpyList[i], convention)));
        }
    }

    static Matrix<double> zividToMathDotNet(Zivid.NET.Matrix4x4 zividMatrix)
    {
        return CreateMatrix.DenseOfArray(zividMatrix.ToArray()).ToDouble();
    }
    static Zivid.NET.Matrix4x4 mathDotNetToZivid(Matrix<double> mathNetMatrix)
    {
        return new Zivid.NET.Matrix4x4(mathNetMatrix.ToSingle().ToArray());
    }

    static void printHeader(string text)
    {
        string asterixLine = "****************************************************************";
        Console.WriteLine(asterixLine + "\n* " + text + "\n" + asterixLine);
    }

    static string matrixToString(Matrix<double> matrix)
    {
        string matrixString = "";
        if (matrix.RowCount != 1)
        {
            matrixString = "[";
        }
        for (var i = 0; i < matrix.RowCount; i++)
        {
            matrixString += "[";
            for (var j = 0; j < matrix.ColumnCount; j++)
            {
                matrixString += String.Format(" {0,9:G4} ", matrix[i, j]);
            }
            matrixString += "]\n ";
        }
        matrixString = matrixString.TrimEnd(' ');
        matrixString = matrixString.TrimEnd('\n');
        if (matrix.RowCount != 1)
        {
            matrixString += "]";
        }
        return matrixString;
    }

    static string vectorToString(Vector<double> vector)
    {
        string vectorString = "[";
        for (var i = 0; i < vector.Count; i++)
        {
            vectorString += String.Format("{0,9:G4}", vector[i]);
        }
        vectorString += "]";

        return vectorString;
    }
}
Python

"""
Convert to/from Transformation Matrix (Rotation Matrix + Translation Vector).

Zivid primarily operates with a (4x4) transformation matrix. This example shows how to use SciPy to convert to and from:
AxisAngle, Rotation Vector, Roll-Pitch-Yaw, Quaternion.

The convenience functions from this example can be reused in applicable applications. The YAML files for this sample
can be found under the main instructions for Zivid samples.

"""

import enum
from dataclasses import dataclass, field
from pathlib import Path
from typing import Dict, List, Optional

import numpy as np
import zivid
from scipy.spatial.transform import Rotation as R
from zividsamples.paths import get_sample_data_path
from zividsamples.save_load_matrix import assert_affine_matrix_and_save, load_and_assert_affine_matrix


class RotationConvention(enum.Enum):
    """Convenience enum class to list rotation conventions for Roll Pitch Yaw."""

    ZYX_INTRINSIC = "ZYX"
    XYZ_EXTRINSIC = "xyz"
    XYZ_INTRINSIC = "XYZ"
    ZYX_EXTRINSIC = "zyx"


class AxisAngle:
    """Convenience class to access rotation axis and angle."""

    axis: np.ndarray
    angle: np.floating

    def __init__(self, axis: np.ndarray = np.array([0, 0, 1]), angle: Optional[float] = None):
        """Initialize class and its variables.

        Can be initialized with a unit vector and an angle, or only a rotation vector.

        Args:
            axis: Rotation axis
            angle: Rotation angle

        Raises:
            ValueError: If angle is provided, but axis is not a unit vector

        """
        self.axis = axis
        if angle is None:
            self.angle = np.linalg.norm(axis)
            self.axis = axis / self.angle
        elif not np.isclose(np.linalg.norm(axis), 1.0):
            raise ValueError("Angle provided, but axis is not a unit vector")
        else:
            self.angle = np.floating(angle)

    def as_rotvec(self) -> np.ndarray:
        """Return rotation vector from axis angle.

        Returns:
            Rotation vector

        """
        return self.axis * self.angle

    def as_quaternion(self) -> np.ndarray:
        """Return quaternion from axis angle.

        Returns:
            Quaternion

        """
        return R.from_rotvec(self.as_rotvec()).as_quat()


@dataclass
class Representations:
    """Class to hold various transformation representations."""

    axis_angle: AxisAngle = field(default_factory=AxisAngle)
    rotation_vector: np.ndarray = field(default_factory=lambda: np.zeros(3))
    quaternion: np.ndarray = field(default_factory=lambda: np.zeros(4))
    rotations: list = field(default_factory=list)


def rotation_matrix_to_axis_angle(rotation_matrix: np.ndarray) -> AxisAngle:
    """Convert from Rotation Matrix --> Axis Angle.

    Args:
        rotation_matrix: A numpy array (3x3)

    Returns:
        AxisAngle

    """
    rotation = R.from_matrix(rotation_matrix)
    return AxisAngle(rotation.as_rotvec())


def rotation_matrix_to_rotation_vector(rotation_matrix: np.ndarray) -> np.ndarray:
    """Convert from Rotation Matrix --> Rotation Vector.

    Args:
        rotation_matrix: A numpy array (3x3)

    Returns:
        Rotation Vector

    """
    rotation = R.from_matrix(rotation_matrix)
    return rotation.as_rotvec()


def rotation_matrix_to_quaternion(rotation_matrix: np.ndarray) -> np.ndarray:
    """Convert from Rotation Matrix --> Quaternion.

    Args:
        rotation_matrix: A numpy array (3x3)

    Returns:
        Quaternion

    """
    rotation = R.from_matrix(rotation_matrix)
    return rotation.as_quat()


def rotation_matrix_to_roll_pitch_yaw(rotation_matrix: np.ndarray) -> List[Dict]:
    """Convert from Rotation Matrix --> Roll Pitch Yaw.

    Args:
        rotation_matrix: A numpy array (3x3)

    Returns:
        rpy_list: List of Roll Pitch Yaw angles in radians

    """
    rpy_list = []
    rotation = R.from_matrix(rotation_matrix)
    for convention in RotationConvention:
        roll_pitch_yaw = rotation.as_euler(convention.value)
        print(f"Roll-Pitch-Yaw angles ({convention.name}):")
        print(f"{roll_pitch_yaw}")
        rpy_list.append({"convention": convention, "roll_pitch_yaw": roll_pitch_yaw})
    return rpy_list


def axis_angle_to_rotation_matrix(axis_angle: AxisAngle) -> np.ndarray:
    """Convert from AxisAngle --> Rotation Matrix.

    Args:
        axis_angle: An AxisAngle object with axis and angle

    Returns:
        Rotation Matrix (3x3 numpy array)

    """
    return R.from_quat(axis_angle.as_quaternion()).as_matrix()


def rotation_vector_to_rotation_matrix(rotvec: np.ndarray) -> np.ndarray:
    """Convert from Rotation Vector --> Rotation Matrix.

    Args:
        rotvec: A 3x1 numpy array

    Returns:
        Rotation Matrix (3x3 numpy array)

    """
    return R.from_rotvec(rotvec).as_matrix()


def quaternion_to_rotation_matrix(quaternion: np.ndarray) -> np.ndarray:
    """Convert from Quaternion --> Rotation Matrix.

    Args:
        quaternion: A 4x1 numpy array

    Returns:
        Rotation Matrix (3x3 numpy array)

    """
    return R.from_quat(quaternion).as_matrix()


def roll_pitch_yaw_to_rotation_matrix(rpy_list: List[Dict]) -> None:
    """Convert from Roll Pitch Yaw --> Rotation Matrix.

    Args:
        rpy_list: List of Roll Pitch Yaw angles in radians

    """
    for rotation in rpy_list:
        rotation_matrix = R.from_euler(rotation["convention"].value, rotation["roll_pitch_yaw"]).as_matrix()
        print(f"Rotation Matrix from Roll-Pitch-Yaw angles ({rotation['convention'].name}):")
        print(f"{rotation_matrix}")


def print_header(txt: str) -> None:
    """Print decorated header.

    Args:
        txt: Text to be printed in header

    """
    terminal_width = 70
    print()
    print(f"{'*' * terminal_width}")
    print(f"* {txt} {' ' * (terminal_width - len(txt) - 4)}*")
    print(f"{'*' * terminal_width}")


def _main() -> None:
    # Application class must be initialized before using other Zivid classes.
    app = zivid.Application()  # noqa: F841  # pylint: disable=unused-variable

    np.set_printoptions(precision=4, suppress=True)
    print_header("This example shows conversions to/from Transformation Matrix")

    transformation_matrix = load_and_assert_affine_matrix(get_sample_data_path() / "RobotTransform.yaml")
    print(f"Transformation Matrix:\n{transformation_matrix}")

    # Extract Rotation Matrix and Translation Vector from Transformation Matrix
    rotation_matrix = transformation_matrix[:3, :3]
    translation_vector = transformation_matrix[:-1, -1]
    print(f"Rotation Matrix:\n{rotation_matrix}")
    print(f"Translation Vector:\n{translation_vector}")

    ###
    # Convert from Zivid to Robot (Transformation Matrix --> any format)
    ###
    print_header("Convert from Zivid (Rotation Matrix) to Robot")
    axis_angle = rotation_matrix_to_axis_angle(rotation_matrix)
    print(f"AxisAngle:\n{axis_angle.axis}, {axis_angle.angle:.4f}")
    rotation_vector = rotation_matrix_to_rotation_vector(rotation_matrix)
    print(f"Rotation Vector:\n{rotation_vector}")
    quaternion = rotation_matrix_to_quaternion(rotation_matrix)
    print(f"Quaternion:\n{quaternion}")
    rpy_list = rotation_matrix_to_roll_pitch_yaw(rotation_matrix)

    ###
    # Convert from Robot to Zivid (any format --> Rotation Matrix (part of Transformation Matrix))
    ###
    print_header("Convert from Robot to Zivid (Rotation Matrix)")
    rotation_matrix_from_axis_angle = axis_angle_to_rotation_matrix(axis_angle)
    print(f"Rotation Matrix from Axis Angle:\n{rotation_matrix_from_axis_angle}")
    rotation_matrix_from_rotation_vector = rotation_vector_to_rotation_matrix(rotation_vector)
    print(f"Rotation Matrix from Rotation Vector:\n{rotation_matrix_from_rotation_vector}")
    rotation_matrix_from_quaternion = quaternion_to_rotation_matrix(quaternion)
    print(f"Rotation Matrix from Quaternion:\n{rotation_matrix_from_quaternion}")
    roll_pitch_yaw_to_rotation_matrix(rpy_list)

    # Replace rotation matrix in transformation matrix
    transformation_matrix_from_quaternion = np.eye(4)
    transformation_matrix_from_quaternion[:3, :3] = rotation_matrix_from_quaternion
    transformation_matrix_from_quaternion[:-1, -1] = translation_vector
    # Save transformation matrix which has passed through quaternion representation
    assert_affine_matrix_and_save(
        transformation_matrix_from_quaternion, Path(__file__).parent / "RobotTransformOut.yaml"
    )


if __name__ == "__main__":
    _main()
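
The conversion functions in the sample come in inverse pairs, so a quick way to validate your own integration is to round-trip a rotation through another representation and compare it to the original. A minimal sketch using SciPy (the same library the sample uses), independent of the sample's helper classes:

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

# Build a known rotation matrix from intrinsic ZYX Euler angles
rotation_matrix = R.from_euler("ZYX", [30, 20, 10], degrees=True).as_matrix()

# Round-trip: rotation matrix -> quaternion -> rotation matrix
quaternion = R.from_matrix(rotation_matrix).as_quat()
recovered = R.from_quat(quaternion).as_matrix()

# The recovered matrix should match the original to numerical precision,
# and a quaternion from SciPy is always unit length
assert np.allclose(recovered, rotation_matrix)
assert np.isclose(np.linalg.norm(quaternion), 1.0)
```

The same round-trip check works for rotation vectors (`as_rotvec`/`from_rotvec`) and Euler angles, as long as the same convention string is used in both directions.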

Now that you have chosen a hand-eye calibration approach, continue with How to get Good Dataset for Hand-Eye Calibration.