Capture Tutorial

Introduction

This tutorial describes how to use the Zivid SDK to capture point clouds and 2D images.

Prerequisites

Initialize

Calling any of the APIs in the Zivid SDK requires initializing the Zivid application and keeping it alive while the program runs.

Note

Zivid::Application must be kept alive while operating the Zivid camera. This is essentially the Zivid driver.

C++:

Zivid::Application zivid;

C#:

var zivid = new Zivid.NET.Application();

Python:

app = zivid.Application()

Connect

Now we can connect to the camera.

C++:

auto camera = zivid.connectCamera();

C#:

var camera = zivid.ConnectCamera();

Python:

camera = app.connect_camera()

Specific Camera

When multiple cameras are connected to the same computer and you need to work with a specific camera in the code, this can be done by providing the camera's serial number.

C++:

auto camera = zivid.connectCamera(Zivid::CameraInfo::SerialNumber{ "2020C0DE" });

C#:

var camera = zivid.ConnectCamera(new Zivid.NET.CameraInfo.SerialNumber("2020C0DE"));

Python:

camera = app.connect_camera(serial_number="2020C0DE")

Note

The serial number of the camera is shown in Zivid Studio.

You can also list all cameras connected to the computer and view their serial numbers as follows:

C++:

auto cameras = zivid.cameras();
std::cout << "Found " << cameras.size() << " cameras" << std::endl;
for(auto &camera : cameras)
{
    std::cout << "Camera Info: " << camera.info() << std::endl;
    std::cout << "Camera State: " << camera.state() << std::endl;
}

C#:

var cameras = zivid.Cameras;
Console.WriteLine("Number of cameras found: {0}", cameras.Count);
foreach (var camera in cameras)
{
    Console.WriteLine("Camera Info: {0}", camera.Info);
    Console.WriteLine("Camera State: {0}", camera.State);
}

Python:

cameras = app.cameras()
for camera in cameras:
    print(f"Camera Info:  {camera.info}")
    print(f"Camera State: {camera.state}")

File Camera

The file camera option allows you to experiment with the SDK without access to a physical camera. The file cameras can be found in Sample Data, where there are multiple file cameras to choose from. Each file camera demonstrates a use case within one of the main applications of the respective camera model. The example below shows how to create a file camera using the Zivid 2+ MR60 file camera from Sample Data.

C++:

const auto fileCamera =
    userInput ? fileCameraPath : std::string(ZIVID_SAMPLE_DATA_DIR) + "/FileCameraZivid2PlusMR60.zfc";

C#:

fileCamera = Environment.GetFolderPath(Environment.SpecialFolder.CommonApplicationData) + "/Zivid/FileCameraZivid2PlusMR60.zfc";

Python:

default=get_sample_data_path() / "FileCameraZivid2PlusMR60.zfc",

C++:

auto camera = zivid.createFileCamera(fileCamera);

C#:

var camera = zivid.CreateFileCamera(fileCamera);

Python:

camera = app.create_file_camera(file_camera)

The acquisition settings should be initialized as shown below, but you are free to alter the processing settings.

C++:

Zivid::Settings settings{
    Zivid::Settings::Acquisitions{ Zivid::Settings::Acquisition{} },
    Zivid::Settings::Processing::Filters::Smoothing::Gaussian::Enabled::yes,
    Zivid::Settings::Processing::Filters::Smoothing::Gaussian::Sigma{ 1.5 },
    Zivid::Settings::Processing::Filters::Reflection::Removal::Enabled::yes,
    Zivid::Settings::Processing::Filters::Reflection::Removal::Mode::global,
};
Zivid::Settings2D settings2D{ Zivid::Settings2D::Acquisitions{ Zivid::Settings2D::Acquisition{} },
                              Zivid::Settings2D::Processing::Color::Balance::Red{ 1 },
                              Zivid::Settings2D::Processing::Color::Balance::Green{ 1 },
                              Zivid::Settings2D::Processing::Color::Balance::Blue{ 1 } };

settings.color() = Zivid::Settings::Color{ settings2D };

C#:

var settings2D = new Zivid.NET.Settings2D
{
    Acquisitions = { new Zivid.NET.Settings2D.Acquisition { } },
    Processing =
    {
        Color =
        {
            Balance = { Red = 1.0, Green = 1.0, Blue = 1.0 }
        }
    }
};
var settings = new Zivid.NET.Settings
{
    Acquisitions = { new Zivid.NET.Settings.Acquisition { } },
    Processing =
    {
        Filters =
        {
            Smoothing =
            {
                Gaussian = { Enabled = true, Sigma = 1.5 }
            },
            Reflection =
            {
                Removal = { Enabled = true, Mode = ReflectionFilterModeOption.Global}
            }
        }
    }
};
settings.Color = settings2D;

Python:

settings = zivid.Settings()
settings.acquisitions.append(zivid.Settings.Acquisition())
settings.processing.filters.smoothing.gaussian.enabled = True
settings.processing.filters.smoothing.gaussian.sigma = 1
settings.processing.filters.reflection.removal.enabled = True
settings.processing.filters.reflection.removal.mode = "global"

settings_2d = zivid.Settings2D()
settings_2d.acquisitions.append(zivid.Settings2D.Acquisition())
settings_2d.processing.color.balance.blue = 1.0
settings_2d.processing.color.balance.green = 1.0
settings_2d.processing.color.balance.red = 1.0

settings.color = settings_2d

You can read more about the file camera option in File Camera.

Configure

As with all cameras, there are settings that can be configured.

Presets

The recommendation is to use the Presets available in Zivid Studio and as .yml files (see below). Presets are designed to work well for most cases right away, making them a great starting point. If needed, you can easily fine-tune the settings for better results. You can edit the YAML files in any text editor or code the settings manually.
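
To tie this together, here is a minimal C++ sketch that loads a preset exported from Zivid Studio and captures with it. The file name is only a placeholder; loading and saving settings files are covered in detail in the next subsections.

// Minimal sketch: load a preset exported from Zivid Studio and capture with it.
// "MyPreset.yml" is a placeholder; use the path of the preset you exported.
const auto presetFile = std::string{ "MyPreset.yml" };
const auto presetSettings = Zivid::Settings(presetFile);
const auto frame = camera.capture2D3D(presetSettings);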

Load

You can export camera settings to .yml files from Zivid Studio. These can be loaded and applied in the API.

C++:

const auto settingsFile = "Settings.yml";
std::cout << "Loading settings from file: " << settingsFile << std::endl;
const auto settingsFromFile = Zivid::Settings(settingsFile);

C#:

var settingsFile = "Settings.yml";
Console.WriteLine("Loading settings from file: " + settingsFile);
var settingsFromFile = new Zivid.NET.Settings(settingsFile);

Python:

settings_file = "Settings.yml"
print(f"Loading settings from file: {settings_file}")
settings_from_file = zivid.Settings.load(settings_file)

Save

You can also save settings to a .yml file.

C++:

const auto settingsFile = "Settings.yml";
std::cout << "Saving settings to file: " << settingsFile << std::endl;
settings.save(settingsFile);

C#:

var settingsFile = "Settings.yml";
Console.WriteLine("Saving settings to file: " + settingsFile);
settings.Save(settingsFile);

Python:

settings_file = "Settings.yml"
print(f"Saving settings to file: {settings_file}")
settings.save(settings_file)

Manual configuration

Another option is to configure the settings manually. For more information about what each setting does, see Camera Settings. Then, the next step is Capturing High Quality Point Clouds.

Single 2D and 3D Acquisition - Default settings

We can create settings for a single acquisition capture.

C++:

const auto settings = Zivid::Settings(
    Zivid::Settings::Acquisitions{ Zivid::Settings::Acquisition{} },
    Zivid::Settings::Color(
        Zivid::Settings2D(Zivid::Settings2D::Acquisitions{ Zivid::Settings2D::Acquisition{} })));

C#:

var settings = new Zivid.NET.Settings
{
    Acquisitions = { new Zivid.NET.Settings.Acquisition { } },
    Color = new Zivid.NET.Settings2D { Acquisitions = { new Zivid.NET.Settings2D.Acquisition { } } }
};

Python:

settings = zivid.Settings(
    acquisitions=[zivid.Settings.Acquisition()],
    color=zivid.Settings2D(acquisitions=[zivid.Settings2D.Acquisition()]),
)

Multi Acquisition HDR

We can also create settings to be used in a multi-acquisition HDR capture.

C++:

Zivid::Settings settings;
for(const auto aperture : { 9.57, 4.76, 2.59 })
{
    std::cout << "Adding acquisition with aperture = " << aperture << std::endl;
    const auto acquisitionSettings = Zivid::Settings::Acquisition{
        Zivid::Settings::Acquisition::Aperture{ aperture },
    };
    settings.acquisitions().emplaceBack(acquisitionSettings);
}

C#:

var settings = new Zivid.NET.Settings();
foreach (var aperture in new double[] { 9.57, 4.76, 2.59 })
{
    Console.WriteLine("Adding acquisition with aperture = " + aperture);
    var acquisitionSettings = new Zivid.NET.Settings.Acquisition { Aperture = aperture };
    settings.Acquisitions.Add(acquisitionSettings);
}

Python:

settings = zivid.Settings(acquisitions=[zivid.Settings.Acquisition(aperture=fnum) for fnum in (11.31, 5.66, 2.83)])

Fully configured settings are demonstrated below.

C++:

std::cout << "Configuring settings for capture:" << std::endl;
Zivid::Settings2D settings2D{
    Zivid::Settings2D::Sampling::Color::rgb,
    Zivid::Settings2D::Sampling::Pixel::all,

    Zivid::Settings2D::Processing::Color::Balance::Blue{ 1.0 },
    Zivid::Settings2D::Processing::Color::Balance::Green{ 1.0 },
    Zivid::Settings2D::Processing::Color::Balance::Red{ 1.0 },
    Zivid::Settings2D::Processing::Color::Gamma{ 1.0 },

    Zivid::Settings2D::Processing::Color::Experimental::Mode::automatic,
};

Zivid::Settings settings{
    Zivid::Settings::Color{ settings2D },

    Zivid::Settings::Engine::phase,

    Zivid::Settings::RegionOfInterest::Box::Enabled::yes,
    Zivid::Settings::RegionOfInterest::Box::PointO{ 1000, 1000, 1000 },
    Zivid::Settings::RegionOfInterest::Box::PointA{ 1000, -1000, 1000 },
    Zivid::Settings::RegionOfInterest::Box::PointB{ -1000, 1000, 1000 },
    Zivid::Settings::RegionOfInterest::Box::Extents{ -1000, 1000 },

    Zivid::Settings::RegionOfInterest::Depth::Enabled::yes,
    Zivid::Settings::RegionOfInterest::Depth::Range{ 200, 2000 },

    Zivid::Settings::Processing::Filters::Cluster::Removal::Enabled::yes,
    Zivid::Settings::Processing::Filters::Cluster::Removal::MaxNeighborDistance{ 10 },
    Zivid::Settings::Processing::Filters::Cluster::Removal::MinArea{ 100 },

    Zivid::Settings::Processing::Filters::Hole::Repair::Enabled::yes,
    Zivid::Settings::Processing::Filters::Hole::Repair::HoleSize{ 0.2 },
    Zivid::Settings::Processing::Filters::Hole::Repair::Strictness{ 1 },

    Zivid::Settings::Processing::Filters::Noise::Removal::Enabled::yes,
    Zivid::Settings::Processing::Filters::Noise::Removal::Threshold{ 7.0 },

    Zivid::Settings::Processing::Filters::Noise::Suppression::Enabled::yes,
    Zivid::Settings::Processing::Filters::Noise::Repair::Enabled::yes,

    Zivid::Settings::Processing::Filters::Outlier::Removal::Enabled::yes,
    Zivid::Settings::Processing::Filters::Outlier::Removal::Threshold{ 5.0 },

    Zivid::Settings::Processing::Filters::Reflection::Removal::Enabled::yes,
    Zivid::Settings::Processing::Filters::Reflection::Removal::Mode::global,

    Zivid::Settings::Processing::Filters::Smoothing::Gaussian::Enabled::yes,
    Zivid::Settings::Processing::Filters::Smoothing::Gaussian::Sigma{ 1.5 },

    Zivid::Settings::Processing::Filters::Experimental::ContrastDistortion::Correction::Enabled::yes,
    Zivid::Settings::Processing::Filters::Experimental::ContrastDistortion::Correction::Strength{ 0.4 },

    Zivid::Settings::Processing::Filters::Experimental::ContrastDistortion::Removal::Enabled::no,
    Zivid::Settings::Processing::Filters::Experimental::ContrastDistortion::Removal::Threshold{ 0.5 },

    Zivid::Settings::Processing::Resampling::Mode::upsample2x2,

    Zivid::Settings::Diagnostics::Enabled::no,
};

setSamplingPixel(settings, camera);
std::cout << settings << std::endl;
std::cout << "Configuring base acquisition with settings same for all HDR acquisitions:" << std::endl;
const auto baseAcquisition = Zivid::Settings::Acquisition{};
std::cout << baseAcquisition << std::endl;
const auto baseAcquisition2D = Zivid::Settings2D::Acquisition{};

std::cout << "Configuring acquisition settings different for all HDR acquisitions" << std::endl;
auto exposureValues = getExposureValues(camera);
const std::vector<double> aperture = std::get<0>(exposureValues);
const std::vector<double> gain = std::get<1>(exposureValues);
const std::vector<std::chrono::microseconds> exposureTime = std::get<2>(exposureValues);
const std::vector<double> brightness = std::get<3>(exposureValues);
for(size_t i = 0; i < aperture.size(); ++i)
{
    std::cout << "Acquisition " << i + 1 << ":" << std::endl;
    std::cout << "  Exposure Time: " << exposureTime.at(i).count() << std::endl;
    std::cout << "  Aperture: " << aperture.at(i) << std::endl;
    std::cout << "  Gain: " << gain.at(i) << std::endl;
    std::cout << "  Brightness: " << brightness.at(i) << std::endl;
    const auto acquisitionSettings = baseAcquisition.copyWith(
        Zivid::Settings::Acquisition::Aperture{ aperture.at(i) },
        Zivid::Settings::Acquisition::Gain{ gain.at(i) },
        Zivid::Settings::Acquisition::ExposureTime{ exposureTime.at(i) },
        Zivid::Settings::Acquisition::Brightness{ brightness.at(i) });
    settings.acquisitions().emplaceBack(acquisitionSettings);
}
const auto acquisitionSettings2D = baseAcquisition2D.copyWith(
    Zivid::Settings2D::Acquisition::Aperture{ 2.83 },
    Zivid::Settings2D::Acquisition::ExposureTime{ microseconds{ 10000 } },
    Zivid::Settings2D::Acquisition::Brightness{ 1.8 },
    Zivid::Settings2D::Acquisition::Gain{ 1.0 });
settings.color().value().acquisitions().emplaceBack(acquisitionSettings2D);

C#:

Console.WriteLine("Configuring settings for capture:");
var settings2D = new Zivid.NET.Settings2D()
{
    Sampling =
    {
        Color = Zivid.NET.Settings2D.SamplingGroup.ColorOption.Rgb,
        Pixel = Zivid.NET.Settings2D.SamplingGroup.PixelOption.All,
    },
    Processing =
    {
        Color =
        {
            Balance =
            {
                Blue = 1.0,
                Green = 1.0,
                Red = 1.0,
            },
            Gamma = 1.0,
            Experimental = { Mode = Zivid.NET.Settings2D.ProcessingGroup.ColorGroup.ExperimentalGroup.ModeOption.Automatic },
        },
    },
};
var settings = new Zivid.NET.Settings()
{
    Engine = Zivid.NET.Settings.EngineOption.Phase,

    RegionOfInterest =
    {
        Box = {
            Enabled = true,
            PointO = new Zivid.NET.PointXYZ{ x = 1000, y = 1000, z = 1000 },
            PointA = new Zivid.NET.PointXYZ{ x = 1000, y = -1000, z = 1000 },
            PointB = new Zivid.NET.PointXYZ{ x = -1000, y = 1000, z = 1000 },
            Extents = new Zivid.NET.Range<double>(-1000, 1000),
        },
        Depth =
        {
            Enabled = true,
            Range = new Zivid.NET.Range<double>(200, 2000),
        },
    },
    Processing =
    {
        Filters =
        {
            Cluster =
            {
                Removal = { Enabled = true, MaxNeighborDistance = 10, MinArea = 100}
            },
            Hole =
            {
                Repair = { Enabled = true, HoleSize = 0.2, Strictness = 1 },
            },
            Noise =
            {
                Removal = { Enabled = true, Threshold = 7.0 },
                Suppression = { Enabled = true },
                Repair = { Enabled = true },
            },
            Outlier =
            {
                Removal = { Enabled = true, Threshold = 5.0 },
            },
            Reflection =
            {
                Removal = { Enabled = true, Mode = ReflectionFilterModeOption.Global },
            },
            Smoothing =
            {
                Gaussian = { Enabled = true, Sigma = 1.5 },
            },
            Experimental =
            {
                ContrastDistortion =
                {
                    Correction = { Enabled = true, Strength = 0.4 },
                    Removal = { Enabled = true, Threshold = 0.5 },
                },
            },
        },
        Resampling = { Mode = Zivid.NET.Settings.ProcessingGroup.ResamplingGroup.ModeOption.Upsample2x2 },
    },
    Diagnostics = { Enabled = false },
};

settings.Color = settings2D;

SetSamplingPixel(ref settings, camera);
Console.WriteLine(settings);
Console.WriteLine("Configuring base acquisition with settings same for all HDR acquisitions:");
var baseAcquisition = new Zivid.NET.Settings.Acquisition { };
Console.WriteLine(baseAcquisition);
var baseAcquisition2D = new Zivid.NET.Settings2D.Acquisition { };

Console.WriteLine("Configuring acquisition settings different for all HDR acquisitions:");
Tuple<double[], Duration[], double[], double[]> exposureValues = GetExposureValues(camera);
double[] aperture = exposureValues.Item1;
Duration[] exposureTime = exposureValues.Item2;
double[] gain = exposureValues.Item3;
double[] brightness = exposureValues.Item4;
for (int i = 0; i < aperture.Length; i++)
{
    Console.WriteLine("Acquisition {0}:", i + 1);
    Console.WriteLine("  Exposure Time: {0}", exposureTime[i].Microseconds);
    Console.WriteLine("  Aperture: {0}", aperture[i]);
    Console.WriteLine("  Gain: {0}", gain[i]);
    Console.WriteLine("  Brightness: {0}", brightness[i]);
    var acquisitionSettings = baseAcquisition.CopyWith(s =>
    {
        s.Aperture = aperture[i];
        s.ExposureTime = exposureTime[i];
        s.Gain = gain[i];
        s.Brightness = brightness[i];
    });
    settings.Acquisitions.Add(acquisitionSettings);
}
var acquisitionSettings2D = baseAcquisition2D.CopyWith(s =>
{
    s.Aperture = 2.83;
    s.ExposureTime = Duration.FromMicroseconds(1000);
    s.Gain = 1.0;
    s.Brightness = 1.8;
});
settings.Color.Acquisitions.Add(acquisitionSettings2D);

Python:

print("Configuring settings for capture:")
settings_2d = zivid.Settings2D()

settings_2d.sampling.color = zivid.Settings2D.Sampling.Color.rgb
settings_2d.sampling.pixel = zivid.Settings2D.Sampling.Pixel.all

settings_2d.processing.color.balance.red = 1.0
settings_2d.processing.color.balance.blue = 1.0
settings_2d.processing.color.balance.green = 1.0
settings_2d.processing.color.gamma = 1.0

settings_2d.processing.color.experimental.mode = zivid.Settings2D.Processing.Color.Experimental.Mode.automatic

settings = zivid.Settings()
settings.engine = zivid.Settings.Engine.phase

settings.region_of_interest.box.enabled = True
settings.region_of_interest.box.point_o = [1000, 1000, 1000]
settings.region_of_interest.box.point_a = [1000, -1000, 1000]
settings.region_of_interest.box.point_b = [-1000, 1000, 1000]
settings.region_of_interest.box.extents = [-1000, 1000]

settings.region_of_interest.depth.enabled = True
settings.region_of_interest.depth.range = [200, 2000]

settings.processing.filters.cluster.removal.enabled = True
settings.processing.filters.cluster.removal.max_neighbor_distance = 10
settings.processing.filters.cluster.removal.min_area = 100

settings.processing.filters.hole.repair.enabled = True
settings.processing.filters.hole.repair.hole_size = 0.2
settings.processing.filters.hole.repair.strictness = 1

settings.processing.filters.noise.removal.enabled = True
settings.processing.filters.noise.removal.threshold = 7.0

settings.processing.filters.noise.suppression.enabled = True
settings.processing.filters.noise.repair.enabled = True

settings.processing.filters.outlier.removal.enabled = True
settings.processing.filters.outlier.removal.threshold = 5.0

settings.processing.filters.reflection.removal.enabled = True
settings.processing.filters.reflection.removal.mode = (
    zivid.Settings.Processing.Filters.Reflection.Removal.Mode.global_
)

settings.processing.filters.smoothing.gaussian.enabled = True
settings.processing.filters.smoothing.gaussian.sigma = 1.5

settings.processing.filters.experimental.contrast_distortion.correction.enabled = True
settings.processing.filters.experimental.contrast_distortion.correction.strength = 0.4

settings.processing.filters.experimental.contrast_distortion.removal.enabled = False
settings.processing.filters.experimental.contrast_distortion.removal.threshold = 0.5

settings.processing.resampling.mode = zivid.Settings.Processing.Resampling.Mode.upsample2x2

settings.diagnostics.enabled = False

settings.color = settings_2d

_set_sampling_pixel(settings, camera)
print(settings)
print("Configuring acquisition settings different for all HDR acquisitions")
exposure_values = _get_exposure_values(camera)
for aperture, gain, exposure_time, brightness in exposure_values:
    settings.acquisitions.append(
        zivid.Settings.Acquisition(
            aperture=aperture,
            exposure_time=exposure_time,
            brightness=brightness,
            gain=gain,
        )
    )

settings_2d.acquisitions.append(
    zivid.Settings2D.Acquisition(
        aperture=2.83,
        exposure_time=timedelta(microseconds=10000),
        brightness=1.8,
        gain=1.0,
    )
)

Capture 2D3D

Now we can capture a 2D and 3D image (point cloud with color). Whether there is a single acquisition or multiple acquisitions (HDR) is given by the number of acquisitions in settings.

C++:

const auto frame = camera.capture2D3D(settings);

C#:

using (var frame = camera.Capture2D3D(settings))

Python:

with camera.capture_2d_3d(settings) as frame:

The Zivid::Frame (Zivid.NET.Frame in C#, zivid.Frame in Python) contains the point cloud, the color image, and the capture and camera information (all of which are stored on the compute device memory).
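
For illustration, here is a minimal C++ sketch of accessing these members of a frame obtained from capture2D3D(); the variable names are just examples.

// Accessing the contents of a frame (minimal sketch).
const auto pointCloud = frame.pointCloud();   // the 3D point cloud
const auto frame2D = frame.frame2D().value(); // the 2D color frame (present for 2D+3D captures)
const auto usedSettings = frame.settings();   // the settings used for the capture
const auto cameraInfo = frame.cameraInfo();   // information about the camera that took the capture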

Capture 3D

If we only want to capture 3D, that is, the point cloud without color, we can do so via the capture3D API.

C++:

const auto frame3D = camera.capture3D(settings);

C#:

using (var frame3D = camera.Capture3D(settings))

Python:

with camera.capture_3d(settings) as frame_3d:

Capture 2D

If we only want to capture a 2D image, which is faster than 3D, we can do so via the capture2D API.

C++:

const auto frame2D = camera.capture2D(settings);

C#:

using (var frame2D = camera.Capture2D(settings))

Python:

with camera.capture_2d(settings) as frame_2d:

Save

We can now save the results.

C++:

const auto dataFile = "Frame.zdf";
frame.save(dataFile);

C#:

var dataFile = "Frame.zdf";
frame.Save(dataFile);

Python:

data_file = "Frame.zdf"
frame.save(data_file)

Tip

You can open and view the Frame.zdf file in Zivid Studio.

Export

In the next code example, the point cloud is exported to the .ply format. For other exporting options, see Point Cloud for a list of supported formats.

C++:

const auto dataFilePLY = "PointCloud.ply";
frame.save(dataFilePLY);

C#:

var dataFilePLY = "PointCloud.ply";
frame.Save(dataFilePLY);

Python:

data_file_ply = "PointCloud.ply"
frame.save(data_file_ply)

Load

Once saved, the frame can be loaded from a ZDF file.

C++:

const auto dataFile = std::string(ZIVID_SAMPLE_DATA_DIR) + "/Zivid3D.zdf";
std::cout << "Reading ZDF frame from file: " << dataFile << std::endl;
const auto frame = Zivid::Frame(dataFile);

C#:

var dataFile =
    Environment.GetFolderPath(Environment.SpecialFolder.CommonApplicationData) + "/Zivid/Zivid3D.zdf";
Console.WriteLine("Reading ZDF frame from file: " + dataFile);
var frame = new Zivid.NET.Frame(dataFile);

Python:

data_file = get_sample_data_path() / "Zivid3D.zdf"
print(f"Reading point cloud from file: {data_file}")
frame = zivid.Frame(data_file)

Save 2D

We can get the 2D color image from Frame2D, which is part of the Frame object obtained from capture2D3D().

C++:

const auto image2D = frame.frame2D().value().imageBGRA();

C#:

var image2D = frame.Frame2D.ImageBGRA();

Python:

image_2d = frame.frame_2d().image_bgra()

We can also get the 2D color image directly from the point cloud. This image will have the same resolution as the point cloud.

C++:

const auto pointCloud = frame.pointCloud();
const auto image2DInPointCloudResolution = pointCloud.copyImageRGBA();

C#:

var pointCloud = frame.PointCloud;
var image2DInPointCloudResolution = pointCloud.CopyImageRGBA();

Python:

point_cloud = frame.point_cloud()
image_2d_in_point_cloud_resolution = point_cloud.copy_image("bgra")

2D captures also produce 2D color images in linear RGB and sRGB color space.

C++:

const auto imageRGBA = frame.frame2D().value().imageRGBA();

C#:

var imageRGBA = frame.Frame2D.ImageRGBA();

Python:

image_rgba = frame.frame_2d().image_rgba()

C++:

const auto imageSRGB = frame2D.imageSRGB();

C#:

var imageSRGB = frame2D.ImageSRGB();

Python:

image_srgb = frame_2d.image_srgb()

Then, we can save the 2D image in linear RGB or sRGB color space.

C++:

const auto imageFile = "ImageRGB.png";
std::cout << "Saving 2D color image (linear RGB color space) to file: " << imageFile << std::endl;
imageRGBA.save(imageFile);

C#:

var imageFile = "ImageRGB.png";
Console.WriteLine("Saving 2D color image (linear RGB color space) to file: " + imageFile);
imageRGBA.Save(imageFile);

Python:

image_file = "ImageRGBA.png"
print(f"Saving 2D color image (linear RGB color space) to file: {image_file}")
image_rgba.save(image_file)

C++:

const auto imageFile = "ImageSRGB.png";
std::cout << "Saving 2D color image (sRGB color space) to file: " << imageFile << std::endl;
imageSRGB.save(imageFile);

C#:

var imageFile = "ImageSRGB.png";
Console.WriteLine("Saving 2D color image (sRGB color space) to file: " + imageFile);
imageSRGB.Save(imageFile);

Python:

image_file = "ImageSRGB.png"
print(f"Saving 2D color image (sRGB color space) to file: {image_file}")
image_srgb.save(image_file)

Multithreading

Operations on the camera object are thread-safe, but other operations, such as listing cameras and connecting to cameras, should be executed sequentially. You can find more information in Multithreading.
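
As an illustration, here is a hedged C++ sketch of that pattern: cameras are listed and connected one after the other, and the capture calls on the already-connected camera objects then run in parallel threads. The settings and file names used here are assumptions made for the example.

#include <Zivid/Zivid.h>

#include <string>
#include <thread>
#include <vector>

int main()
{
    Zivid::Application zivid;

    // Listing cameras and connecting to them is done sequentially.
    auto cameras = zivid.cameras();
    for(auto &camera : cameras)
    {
        camera.connect();
    }

    // Assumed settings for this example: a single acquisition with default values.
    const Zivid::Settings settings{ Zivid::Settings::Acquisitions{ Zivid::Settings::Acquisition{} } };

    // Operations on each connected camera object are thread-safe,
    // so the captures can run in parallel, one thread per camera.
    std::vector<std::thread> threads;
    for(auto &camera : cameras)
    {
        threads.emplace_back([&camera, &settings]() {
            const auto frame = camera.capture3D(settings);
            frame.save("Frame_" + camera.info().serialNumber().value() + ".zdf");
        });
    }
    for(auto &thread : threads)
    {
        thread.join();
    }

    return 0;
}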

Conclusion

This tutorial showed how to use the Zivid SDK to connect to the camera, configure settings, capture, and save the results to file.