Capture Tutorial

Introduction

This tutorial describes how to use the Zivid SDK to capture point clouds and 2D images.

For MATLAB, see the Zivid Capture Tutorial for MATLAB.

Tip

If you prefer watching a video, our webinar Making 3D captures easy - A tour of Zivid Studio and Zivid SDK covers the same content as this Capture Tutorial.

Prerequisites

Initialize

Calling any of the APIs in the Zivid SDK requires initializing the Zivid application and keeping it alive while the program runs.

Note

Zivid::Application must be kept alive while operating the Zivid camera. This is essentially the Zivid driver.

Go to source (C++)

Zivid::Application zivid;
Go to source (C#)

var zivid = new Zivid.NET.Application();
Go to source (Python)

app = zivid.Application()

Connect

Now we can connect to the camera.

Go to source (C++)

auto camera = zivid.connectCamera();
Go to source (C#)

var camera = zivid.ConnectCamera();
Go to source (Python)

camera = app.connect_camera()

Specify Camera

If multiple cameras are connected to the same computer and you need to use a specific one in the code, you can select it by its serial number.

auto camera = zivid.connectCamera(Zivid::CameraInfo::SerialNumber{ "2020C0DE" });
var camera = zivid.ConnectCamera(new Zivid.NET.CameraInfo.SerialNumber("2020C0DE"));
camera = app.connect_camera(serial_number="2020C0DE")

Note

The camera serial number is shown in Zivid Studio.

You can also list all cameras connected to the computer and view their serial numbers as follows:

Go to source (C++)

auto cameras = zivid.cameras();
std::cout << "Found " << cameras.size() << " cameras" << std::endl;
for(auto &camera : cameras)
{
    std::cout << "Camera Info: " << camera.info() << std::endl;
}
Go to source (C#)

var cameras = zivid.Cameras;
Console.WriteLine("Number of cameras found: {0}", cameras.Count);
foreach (var camera in cameras)
{
    Console.WriteLine("Camera Info: {0}", camera.Info);
}
Go to source (Python)

cameras = app.cameras()
for camera in cameras:
    print(f"Camera Info:  {camera}")

File Camera

The file camera option allows you to experiment with the SDK without access to a physical camera. The file cameras can be found in Sample Data, where there are multiple file cameras to choose from. Each file camera demonstrates a use case within one of the main applications of the respective camera model. The example below shows how to create a file camera using the Zivid 2 M70 file camera from Sample Data.

Go to source (C++)

const auto fileCamera =
    userInput ? fileCameraPath : std::string(ZIVID_SAMPLE_DATA_DIR) + "/FileCameraZivid2M70.zfc";
Go to source (C#)

var fileCamera = Environment.GetFolderPath(Environment.SpecialFolder.CommonApplicationData) + "/Zivid/FileCameraZivid2M70.zfc";
Go to source (Python)

file_camera = get_sample_data_path() / "FileCameraZivid2M70.zfc"

Go to source (C++)

auto camera = zivid.createFileCamera(fileCamera);
Go to source (C#)

var camera = zivid.CreateFileCamera(fileCamera);
Go to source (Python)

camera = app.create_file_camera(file_camera)

The acquisition settings should be initialized as shown below, but you are free to alter the processing settings.

Go to source (C++)

const auto settings =
    Zivid::Settings{ Zivid::Settings::Acquisitions{ Zivid::Settings::Acquisition{} },
                     Zivid::Settings::Processing::Filters::Smoothing::Gaussian::Enabled::yes,
                     Zivid::Settings::Processing::Filters::Smoothing::Gaussian::Sigma{ 1.5 },
                     Zivid::Settings::Processing::Filters::Reflection::Removal::Enabled::yes,
                     Zivid::Settings::Processing::Filters::Reflection::Removal::Experimental::Mode::global,
                     Zivid::Settings::Processing::Color::Balance::Red{ 1 },
                     Zivid::Settings::Processing::Color::Balance::Green{ 1 },
                     Zivid::Settings::Processing::Color::Balance::Blue{ 1 } };
Go to source (C#)

var settings = new Zivid.NET.Settings
{
    Acquisitions = { new Zivid.NET.Settings.Acquisition { } },
    Processing = { Filters = { Smoothing = { Gaussian = { Enabled = true, Sigma = 1.5 } },
                               Reflection = { Removal = { Enabled = true, Experimental = { Mode = ReflectionFilterModeOption.Global} } } },
                   Color = { Balance = { Red = 1.0, Green = 1.0, Blue = 1.0 } } }
};
Go to source (Python)

settings = zivid.Settings()
settings.acquisitions.append(zivid.Settings.Acquisition())
settings.processing.filters.smoothing.gaussian.enabled = True
settings.processing.filters.smoothing.gaussian.sigma = 1.5
settings.processing.filters.reflection.removal.enabled = True
settings.processing.filters.reflection.removal.experimental.mode = "global"
settings.processing.color.balance.red = 1.0
settings.processing.color.balance.green = 1.0
settings.processing.color.balance.blue = 1.0

You can read more about the file camera option in File Camera.

Configuration

As with all cameras, there are settings that can be configured.

Presets

The recommendation is to use Presets, available in Zivid Studio and as .yml files (see Load below). Alternatively, you can use our Capture Assistant, or you may configure the settings manually.

Capture Assistant

It can be difficult to know which settings to choose. The Capture Assistant, which is also available in the Zivid SDK, helps you configure the camera settings.

Go to source (C++)

const auto suggestSettingsParameters = Zivid::CaptureAssistant::SuggestSettingsParameters{
    Zivid::CaptureAssistant::SuggestSettingsParameters::AmbientLightFrequency::none,
    Zivid::CaptureAssistant::SuggestSettingsParameters::MaxCaptureTime{ std::chrono::milliseconds{ 1200 } }
};

std::cout << "Running Capture Assistant with parameters:\n" << suggestSettingsParameters << std::endl;
auto settings = Zivid::CaptureAssistant::suggestSettings(camera, suggestSettingsParameters);
Go to source (C#)

var suggestSettingsParameters = new Zivid.NET.CaptureAssistant.SuggestSettingsParameters
{
    AmbientLightFrequency =
        Zivid.NET.CaptureAssistant.SuggestSettingsParameters.AmbientLightFrequencyOption.none,
    MaxCaptureTime = Duration.FromMilliseconds(1200)
};

Console.WriteLine("Running Capture Assistant with parameters:\n{0}", suggestSettingsParameters);
var settings = Zivid.NET.CaptureAssistant.Assistant.SuggestSettings(camera, suggestSettingsParameters);
Go to source (Python)

suggest_settings_parameters = zivid.capture_assistant.SuggestSettingsParameters(
    max_capture_time=datetime.timedelta(milliseconds=1200),
    ambient_light_frequency=zivid.capture_assistant.SuggestSettingsParameters.AmbientLightFrequency.none,
)

print(f"Running Capture Assistant with parameters: {suggest_settings_parameters}")
settings = zivid.capture_assistant.suggest_settings(camera, suggest_settings_parameters)

There are only two parameters to configure with the Capture Assistant:

  1. Maximum Capture Time (in milliseconds).

    1. The minimum capture time is 200 ms. This allows only one acquisition.

    2. The algorithm will combine multiple acquisitions if the time budget allows.

    3. The algorithm will attempt to cover as much of the dynamic range in the scene as possible.

    4. A maximum capture time of more than 1 second gives good results in most scenarios.

  2. Ambient Light Compensation

    1. The Capture Assistant restricts the exposure times to multiples of the ambient light period (see the sketch after this list).

    2. 60 Hz power grids are used in, for example, Japan, the Americas, Taiwan, South Korea, and the Philippines.

    3. 50 Hz power grids are common in the rest of the world.
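
For example, on a 50 Hz power grid the ambient light frequency can be set accordingly. A minimal C++ sketch (the earlier example used none; hz60 would be the choice on a 60 Hz grid):

const auto suggestSettingsParameters = Zivid::CaptureAssistant::SuggestSettingsParameters{
    Zivid::CaptureAssistant::SuggestSettingsParameters::AmbientLightFrequency::hz50,
    Zivid::CaptureAssistant::SuggestSettingsParameters::MaxCaptureTime{ std::chrono::milliseconds{ 1200 } }
};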

Manual configuration

Another option is to configure the settings manually. For more information about what each setting does, see Camera Settings. Note that Zivid 2 has a set of standard settings available.

Single Acquisition

We can create settings for a single acquisition capture.

Go to source (C++)

const auto settings =
    Zivid::Settings{ Zivid::Settings::Experimental::Engine::phase,
                     Zivid::Settings::Acquisitions{ Zivid::Settings::Acquisition{
                         Zivid::Settings::Acquisition::Aperture{ 5.66 },
                         Zivid::Settings::Acquisition::ExposureTime{ std::chrono::microseconds{ 6500 } } } },
                     Zivid::Settings::Processing::Filters::Outlier::Removal::Enabled::yes,
                     Zivid::Settings::Processing::Filters::Outlier::Removal::Threshold{ 5.0 } };
Go to source (C#)

var settings = new Zivid.NET.Settings
{
    Acquisitions = { new Zivid.NET.Settings.Acquisition { Aperture = 5.66,
                                                          ExposureTime =
                                                              Duration.FromMicroseconds(6500) } },
    Processing = { Filters = { Outlier = { Removal = { Enabled = true, Threshold = 5.0 } } } }
};
Go to source (Python)

settings = zivid.Settings()
settings.experimental.engine = "phase"
settings.acquisitions.append(zivid.Settings.Acquisition())
settings.acquisitions[0].aperture = 5.66
settings.acquisitions[0].exposure_time = datetime.timedelta(microseconds=6500)
settings.processing.filters.outlier.removal.enabled = True
settings.processing.filters.outlier.removal.threshold = 5.0

Multi Acquisition HDR

We can also create settings to be used in a multi-acquisition HDR capture.

Go to source (C++)

Zivid::Settings settings;
for(const auto aperture : { 11.31, 5.66, 2.83 })
{
    std::cout << "Adding acquisition with aperture = " << aperture << std::endl;
    const auto acquisitionSettings = Zivid::Settings::Acquisition{
        Zivid::Settings::Acquisition::Aperture{ aperture },
    };
    settings.acquisitions().emplaceBack(acquisitionSettings);
}
Go to source (C#)

var settings = new Zivid.NET.Settings();
foreach (var aperture in new double[] { 9.57, 4.76, 2.59 })
{
    Console.WriteLine("Adding acquisition with aperture = " + aperture);
    var acquisitionSettings = new Zivid.NET.Settings.Acquisition { Aperture = aperture };
    settings.Acquisitions.Add(acquisitionSettings);
}
Go to source (Python)

settings = zivid.Settings(
    acquisitions=[zivid.Settings.Acquisition(aperture=fnum) for fnum in (11.31, 5.66, 2.83)]
)

A fully configured set of settings is shown below.

Go to source (C++)

std::cout << "Configuring settings for capture:" << std::endl;
Zivid::Settings settings{
    Zivid::Settings::Experimental::Engine::phase,
    Zivid::Settings::Sampling::Color::rgb,
    Zivid::Settings::Sampling::Pixel::all,
    Zivid::Settings::RegionOfInterest::Box::Enabled::yes,
    Zivid::Settings::RegionOfInterest::Box::PointO{ 1000, 1000, 1000 },
    Zivid::Settings::RegionOfInterest::Box::PointA{ 1000, -1000, 1000 },
    Zivid::Settings::RegionOfInterest::Box::PointB{ -1000, 1000, 1000 },
    Zivid::Settings::RegionOfInterest::Box::Extents{ -1000, 1000 },
    Zivid::Settings::RegionOfInterest::Depth::Enabled::yes,
    Zivid::Settings::RegionOfInterest::Depth::Range{ 200, 2000 },
    Zivid::Settings::Processing::Filters::Smoothing::Gaussian::Enabled::yes,
    Zivid::Settings::Processing::Filters::Smoothing::Gaussian::Sigma{ 1.5 },
    Zivid::Settings::Processing::Filters::Noise::Removal::Enabled::yes,
    Zivid::Settings::Processing::Filters::Noise::Removal::Threshold{ 7.0 },
    Zivid::Settings::Processing::Filters::Noise::Suppression::Enabled::yes,
    Zivid::Settings::Processing::Filters::Noise::Repair::Enabled::yes,
    Zivid::Settings::Processing::Filters::Outlier::Removal::Enabled::yes,
    Zivid::Settings::Processing::Filters::Outlier::Removal::Threshold{ 5.0 },
    Zivid::Settings::Processing::Filters::Reflection::Removal::Enabled::yes,
    Zivid::Settings::Processing::Filters::Reflection::Removal::Experimental::Mode::global,
    Zivid::Settings::Processing::Filters::Cluster::Removal::Enabled::yes,
    Zivid::Settings::Processing::Filters::Cluster::Removal::MaxNeighborDistance{ 10 },
    Zivid::Settings::Processing::Filters::Cluster::Removal::MinArea{ 100 },
    Zivid::Settings::Processing::Filters::Experimental::ContrastDistortion::Correction::Enabled::yes,
    Zivid::Settings::Processing::Filters::Experimental::ContrastDistortion::Correction::Strength{ 0.4 },
    Zivid::Settings::Processing::Filters::Experimental::ContrastDistortion::Removal::Enabled::no,
    Zivid::Settings::Processing::Filters::Experimental::ContrastDistortion::Removal::Threshold{ 0.5 },
    Zivid::Settings::Processing::Filters::Experimental::HoleFilling::Enabled::yes,
    Zivid::Settings::Processing::Filters::Experimental::HoleFilling::HoleSize{ 0.2 },
    Zivid::Settings::Processing::Filters::Experimental::HoleFilling::Strictness{ 1 },
    Zivid::Settings::Processing::Color::Balance::Red{ 1.0 },
    Zivid::Settings::Processing::Color::Balance::Green{ 1.0 },
    Zivid::Settings::Processing::Color::Balance::Blue{ 1.0 },
    Zivid::Settings::Processing::Color::Gamma{ 1.0 },
    Zivid::Settings::Processing::Color::Experimental::Mode::automatic
};
std::cout << settings << std::endl;

std::cout << "Configuring base acquisition with settings same for all HDR acquisition:" << std::endl;
const auto baseAcquisition = Zivid::Settings::Acquisition{};
std::cout << baseAcquisition << std::endl;

std::cout << "Configuring acquisition settings different for all HDR acquisitions" << std::endl;
auto exposureValues = getExposureValues(camera);
const std::vector<double> aperture = std::get<0>(exposureValues);
const std::vector<double> gain = std::get<1>(exposureValues);
const std::vector<std::chrono::microseconds> exposureTime = std::get<2>(exposureValues);
const std::vector<double> brightness = std::get<3>(exposureValues);
for(size_t i = 0; i < aperture.size(); ++i)
{
    std::cout << "Acquisition " << i + 1 << ":" << std::endl;
    std::cout << "  Exposure Time: " << exposureTime.at(i).count() << std::endl;
    std::cout << "  Aperture: " << aperture.at(i) << std::endl;
    std::cout << "  Gain: " << gain.at(i) << std::endl;
    std::cout << "  Brightness: " << brightness.at(i) << std::endl;
    const auto acquisitionSettings = baseAcquisition.copyWith(
        Zivid::Settings::Acquisition::Aperture{ aperture.at(i) },
        Zivid::Settings::Acquisition::Gain{ gain.at(i) },
        Zivid::Settings::Acquisition::ExposureTime{ exposureTime.at(i) },
        Zivid::Settings::Acquisition::Brightness{ brightness.at(i) });
    settings.acquisitions().emplaceBack(acquisitionSettings);
}
Go to source (C#)

Console.WriteLine("Configuring settings for capture:");
var settings = new Zivid.NET.Settings()
{
    Experimental = { Engine = Zivid.NET.Settings.ExperimentalGroup.EngineOption.Phase },
    Sampling = { Color = Zivid.NET.Settings.SamplingGroup.ColorOption.Rgb, Pixel = Zivid.NET.Settings.SamplingGroup.PixelOption.All },
    RegionOfInterest = { Box = {
                            Enabled = true,
                            PointO = new Zivid.NET.PointXYZ{ x = 1000, y = 1000, z = 1000 },
                            PointA = new Zivid.NET.PointXYZ{ x = 1000, y = -1000, z = 1000 },
                            PointB = new Zivid.NET.PointXYZ{ x = -1000, y = 1000, z = 1000 },
                            Extents = new Zivid.NET.Range<double>(-1000, 1000),
                        },
                        Depth =
                        {
                            Enabled = true,
                            Range = new Zivid.NET.Range<double>(200, 2000),
                        }
    },
    Processing = { Filters = { Smoothing = { Gaussian = { Enabled = true, Sigma = 1.5 } },
                               Noise = { Removal = { Enabled = true, Threshold = 7.0 },
                                         Suppression = { Enabled = true },
                                         Repair ={ Enabled = true } },
                               Outlier = { Removal = { Enabled = true, Threshold = 5.0 } },
                               Reflection = { Removal = { Enabled = true, Experimental = { Mode = ReflectionFilterModeOption.Global} } },
                               Cluster = { Removal = { Enabled = true, MaxNeighborDistance = 10, MinArea = 100} },
                               Experimental = { ContrastDistortion = { Correction = { Enabled = true,
                                                                                      Strength = 0.4 },
                                                                       Removal = { Enabled = true,
                                                                                   Threshold = 0.5 } },
                                                HoleFilling = { Enabled = true,
                                                                HoleSize = 0.2,
                                                                Strictness = 1 } } },
                   Color = { Balance = { Red = 1.0, Green = 1.0, Blue = 1.0 },
                             Gamma = 1.0,
                             Experimental = { Mode = ColorModeOption.Automatic } } }
};
Console.WriteLine(settings);

Console.WriteLine("Configuring base acquisition with settings same for all HDR acquisitions:");
var baseAcquisition = new Zivid.NET.Settings.Acquisition { };
Console.WriteLine(baseAcquisition);

Console.WriteLine("Configuring acquisition settings different for all HDR acquisitions:");
Tuple<double[], Duration[], double[], double[]> exposureValues = GetExposureValues(camera);
double[] aperture = exposureValues.Item1;
Duration[] exposureTime = exposureValues.Item2;
double[] gain = exposureValues.Item3;
double[] brightness = exposureValues.Item4;
for (int i = 0; i < aperture.Length; i++)
{
    Console.WriteLine("Acquisition {0}:", i + 1);
    Console.WriteLine("  Exposure Time: {0}", exposureTime[i].Microseconds);
    Console.WriteLine("  Aperture: {0}", aperture[i]);
    Console.WriteLine("  Gain: {0}", gain[i]);
    Console.WriteLine("  Brightness: {0}", brightness[i]);
    var acquisitionSettings = baseAcquisition.CopyWith(s =>
                                                       {
                                                           s.Aperture = aperture[i];
                                                           s.ExposureTime = exposureTime[i];
                                                           s.Gain = gain[i];
                                                           s.Brightness = brightness[i];
                                                       });
    settings.Acquisitions.Add(acquisitionSettings);
}
Go to source (Python)

print("Configuring settings for capture:")
settings = zivid.Settings()
settings.experimental.engine = "phase"
settings.sampling.color = "rgb"
settings.sampling.pixel = "all"
settings.region_of_interest.box.enabled = True
settings.region_of_interest.box.point_o = [1000, 1000, 1000]
settings.region_of_interest.box.point_a = [1000, -1000, 1000]
settings.region_of_interest.box.point_b = [-1000, 1000, 1000]
settings.region_of_interest.box.extents = [-1000, 1000]
settings.region_of_interest.depth.enabled = True
settings.region_of_interest.depth.range = [200, 2000]
filters = settings.processing.filters
filters.smoothing.gaussian.enabled = True
filters.smoothing.gaussian.sigma = 1.5
filters.noise.removal.enabled = True
filters.noise.removal.threshold = 7.0
filters.noise.suppression.enabled = True
filters.noise.repair.enabled = True
filters.outlier.removal.enabled = True
filters.outlier.removal.threshold = 5.0
filters.reflection.removal.enabled = True
filters.reflection.removal.experimental.mode = "global"
filters.cluster.removal.enabled = True
filters.cluster.removal.max_neighbor_distance = 10
filters.cluster.removal.min_area = 100
filters.experimental.contrast_distortion.correction.enabled = True
filters.experimental.contrast_distortion.correction.strength = 0.4
filters.experimental.contrast_distortion.removal.enabled = False
filters.experimental.contrast_distortion.removal.threshold = 0.5
filters.experimental.hole_filling.enabled = True
filters.experimental.hole_filling.hole_size = 0.2
filters.experimental.hole_filling.strictness = 1
color = settings.processing.color
color.balance.red = 1.0
color.balance.blue = 1.0
color.balance.green = 1.0
color.gamma = 1.0
settings.processing.color.experimental.mode = "automatic"
print(settings)

print("Configuring acquisition settings different for all HDR acquisitions")
exposure_values = _get_exposure_values(camera)
for aperture, gain, exposure_time, brightness in exposure_values:
    settings.acquisitions.append(
        zivid.Settings.Acquisition(
            aperture=aperture,
            exposure_time=exposure_time,
            brightness=brightness,
            gain=gain,
        )
    )

2D Settings

It is possible to only capture a 2D image, which is faster than a 3D capture. 2D settings are configured as shown below.

Go to source (C++)

const auto settings2D =
    Zivid::Settings2D{ Zivid::Settings2D::Acquisitions{ Zivid::Settings2D::Acquisition{
                           Zivid::Settings2D::Acquisition::ExposureTime{ std::chrono::microseconds{ 30000 } },
                           Zivid::Settings2D::Acquisition::Aperture{ 11.31 },
                           Zivid::Settings2D::Acquisition::Brightness{ 1.80 },
                           Zivid::Settings2D::Acquisition::Gain{ 2.0 } } },
                       Zivid::Settings2D::Processing::Color::Balance::Red{ 1 },
                       Zivid::Settings2D::Processing::Color::Balance::Green{ 1 },
                       Zivid::Settings2D::Processing::Color::Balance::Blue{ 1 } };
Go to source (C#)

var settings2D = new Zivid.NET.Settings2D
{
    Acquisitions = { new Zivid.NET.Settings2D.Acquisition {
        Aperture = 11.31, ExposureTime = Duration.FromMicroseconds(30000), Gain = 2.0, Brightness = 1.80
    } },
    Processing = { Color = { Balance = { Red = 1.0, Blue = 1.0, Green = 1.0 } } }
};
Go to source (Python)

settings_2d = zivid.Settings2D()
settings_2d.acquisitions.append(zivid.Settings2D.Acquisition())
settings_2d.acquisitions[0].exposure_time = datetime.timedelta(microseconds=30000)
settings_2d.acquisitions[0].aperture = 11.31
settings_2d.acquisitions[0].brightness = 1.80
settings_2d.acquisitions[0].gain = 2.0
settings_2d.processing.color.balance.red = 1.0
settings_2d.processing.color.balance.green = 1.0
settings_2d.processing.color.balance.blue = 1.0
settings_2d.processing.color.gamma = 1.0

Load

Zivid Studio can store the current settings to .yml files. These can be read and applied in the API. You may find it easier to modify the settings in these human-readable YAML files in your preferred editor. Check out Presets for recommended .yml files tuned for your application.

Go to source (C++)

const auto settingsFile = "Settings.yml";
std::cout << "Loading settings from file: " << settingsFile << std::endl;
const auto settingsFromFile = Zivid::Settings(settingsFile);
Go to source (C#)

var settingsFile = "Settings.yml";
Console.WriteLine("Loading settings from file: " + settingsFile);
var settingsFromFile = new Zivid.NET.Settings(settingsFile);
Go to source (Python)

settings_file = "Settings.yml"
print(f"Loading settings from file: {settings_file}")
settings_from_file = zivid.Settings.load(settings_file)

Save

You can also save settings to a .yml file.

Go to source (C++)

const auto settingsFile = "Settings.yml";
std::cout << "Saving settings to file: " << settingsFile << std::endl;
settings.save(settingsFile);
Go to source (C#)

var settingsFile = "Settings.yml";
Console.WriteLine("Saving settings to file: " + settingsFile);
settings.Save(settingsFile);
Go to source (Python)

settings_file = "Settings.yml"
print(f"Saving settings to file: {settings_file}")
settings.save(settings_file)

Caution

Zivid settings files must use the .yml file extension (not .yaml).

Capture

Now we can capture a 3D image. Whether it is a single acquisition or a multi-acquisition (HDR) capture is determined by the number of acquisitions in settings.

Go to source (C++)

const auto frame = camera.capture(settings);
Go to source (C#)

using (var frame = camera.Capture(settings))
Go to source (Python)

with camera.capture(settings) as frame:

The frame (Zivid::Frame in C++, Zivid.NET.Frame in C#, zivid.Frame in Python) contains the point cloud and the color image (stored on compute device memory) as well as capture and camera information.
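
The point cloud itself is retrieved from the frame. A minimal C++ sketch, assuming the frame captured above (the pointCloud variable used in Save 2D below is obtained this way):

// Get the point cloud from the frame and print its resolution
const auto pointCloud = frame.pointCloud();
std::cout << "Point cloud size: " << pointCloud.width() << " x " << pointCloud.height() << std::endl;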

Load

Once saved, the frame can be loaded from a ZDF file.

Go to source (C++)

const auto dataFile = std::string(ZIVID_SAMPLE_DATA_DIR) + "/Zivid3D.zdf";
std::cout << "Reading ZDF frame from file: " << dataFile << std::endl;
const auto frame = Zivid::Frame(dataFile);
Go to source (C#)

var dataFile =
    Environment.GetFolderPath(Environment.SpecialFolder.CommonApplicationData) + "/Zivid/Zivid3D.zdf";
Console.WriteLine("Reading ZDF frame from file: " + dataFile);
var frame = new Zivid.NET.Frame(dataFile);
Go to source (Python)

data_file = get_sample_data_path() / "Zivid3D.zdf"
print(f"Reading point cloud from file: {data_file}")
frame = zivid.Frame(data_file)

Saving to a ZDF file is addressed later in this tutorial.

Capture 2D

If we only want to capture a 2D image, which is faster than a 3D capture, we can do so via the 2D API.

Go to source (C++)

const auto frame2D = camera.capture(settings2D);
Go to source (C#)

using (var frame2D = camera.Capture(settings2D))
Go to source (Python)

with camera.capture(settings_2d) as frame_2d:

Caution

If the 2D capture settings use Brightness > 0, the Zivid One+ camera incurs a time penalty when switching between capture modes (2D and 3D). See 2D and 3D switching limitation for more information.

Save

We can now save the results.

Go to source (C++)

const auto dataFile = "Frame.zdf";
frame.save(dataFile);
Go to source (C#)

var dataFile = "Frame.zdf";
frame.Save(dataFile);
Go to source (Python)

data_file = "Frame.zdf"
frame.save(data_file)

Tip

You can open and view the Frame.zdf file in Zivid Studio.

Export

The API detects which format to use. See Point Cloud for a list of supported formats. For example, we can export the point cloud to the .ply format.

Go to source (C++)

const auto dataFilePLY = "PointCloud.ply";
frame.save(dataFilePLY);
Go to source (C#)

var dataFilePLY = "PointCloud.ply";
frame.Save(dataFilePLY);
Go to source (Python)

data_file_ply = "PointCloud.ply"
frame.save(data_file_ply)

Save 2D Image

We can get the 2D color image from a 3D capture.

Go to source (C++)

const auto image = pointCloud.copyImageRGBA();

The 2D color image is also available from a separate 2D capture.

Go to source (C++)

const auto image = frame2D.imageRGBA();
Go to source (C#)

var image = frame2D.ImageRGBA();
Go to source (Python)

image = frame_2d.image_rgba()

Then, we can save the 2D image.

Go to source (C++)

const auto imageFile = "Image.png";
std::cout << "Saving 2D color image to file: " << imageFile << std::endl;
image.save(imageFile);
Go to source (C#)

var imageFile = "Image.png";
Console.WriteLine("Saving 2D color image to file: {0}", imageFile);
image.Save(imageFile);
Go to source (Python)

image_file = "Image.png"
print(f"Saving 2D color image to file: {image_file}")
image.save(image_file)

Multithreading

Operations on camera objects are thread-safe, but other operations, such as listing cameras and connecting to cameras, should be executed in sequence. You can find more information in Multithreading.
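
As a minimal illustration, captures on separate camera objects can run in parallel threads, while connecting is done sequentially beforehand. A C++ sketch (requires <thread>; the second serial number and the settings object are hypothetical and assumed to be configured as shown earlier):

// Connect sequentially: listing and connecting are not thread-safe
auto camera1 = zivid.connectCamera(Zivid::CameraInfo::SerialNumber{ "2020C0DE" });
auto camera2 = zivid.connectCamera(Zivid::CameraInfo::SerialNumber{ "2021C0DE" });

// Capture in parallel: operations on each camera object are thread-safe
std::thread thread1([&]() { camera1.capture(settings).save("Frame1.zdf"); });
std::thread thread2([&]() { camera2.capture(settings).save("Frame2.zdf"); });
thread1.join();
thread2.join();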

Conclusion

This tutorial showed how to use the Zivid SDK to connect to, configure, and capture with the camera, and how to save the result to file.