Capture Tutorial

Introduction

This tutorial describes how to use the Zivid SDK to capture point clouds and 2D images.

Prerequisites

Initialize

Calling any of the APIs in the Zivid SDK requires initializing the Zivid application and keeping it alive while the program runs.

Note

Zivid::Application must be kept alive while operating the Zivid camera. It is essentially the Zivid driver.

C++

Zivid::Application zivid;

C#

var zivid = new Zivid.NET.Application();

Python

app = zivid.Application()

Connect

Now we can connect to the camera.

C++

auto camera = zivid.connectCamera();

C#

var camera = zivid.ConnectCamera();

Python

camera = app.connect_camera()

Specify Camera

When multiple cameras are connected to the same computer and you need to work with a specific camera in the code, you can do so by providing that camera's serial number.

auto camera = zivid.connectCamera(Zivid::CameraInfo::SerialNumber{ "2020C0DE" });
var camera = zivid.ConnectCamera(new Zivid.NET.CameraInfo.SerialNumber("2020C0DE"));
camera = app.connect_camera(serial_number="2020C0DE")

Note

The serial number of your camera is shown in Zivid Studio.

You can also list all cameras connected to the computer and view their serial numbers:

C++

auto cameras = zivid.cameras();
std::cout << "Found " << cameras.size() << " cameras" << std::endl;
for(auto &camera : cameras)
{
    std::cout << "Camera Info: " << camera.info() << std::endl;
    std::cout << "Camera State: " << camera.state() << std::endl;
}

C#

var cameras = zivid.Cameras;
Console.WriteLine("Number of cameras found: {0}", cameras.Count);
foreach (var camera in cameras)
{
    Console.WriteLine("Camera Info: {0}", camera.Info);
    Console.WriteLine("Camera State: {0}", camera.State);
}

Python

cameras = app.cameras()
for camera in cameras:
    print(f"Camera Info:  {camera.info}")
    print(f"Camera State: {camera.state}")

File Camera

The file camera option allows you to experiment with the SDK without access to a physical camera. The file cameras can be found in Sample Data, where multiple file cameras are available to choose from. Each file camera demonstrates a use case within one of the main applications of the respective camera model. The example below shows how to create a file camera using the Zivid 2+ MR60 file camera from Sample Data.

C++

const auto fileCamera =
    userInput ? fileCameraPath : std::string(ZIVID_SAMPLE_DATA_DIR) + "/FileCameraZivid2PlusMR60.zfc";

C#

fileCamera = Environment.GetFolderPath(Environment.SpecialFolder.CommonApplicationData) + "/Zivid/FileCameraZivid2PlusMR60.zfc";

Python

default=get_sample_data_path() / "FileCameraZivid2PlusMR60.zfc",

C++

auto camera = zivid.createFileCamera(fileCamera);

C#

var camera = zivid.CreateFileCamera(fileCamera);

Python

camera = app.create_file_camera(file_camera)

The acquisition settings should be initialized as shown below, but you are free to alter the processing settings.

C++

Zivid::Settings settings{
    Zivid::Settings::Acquisitions{ Zivid::Settings::Acquisition{} },
    Zivid::Settings::Processing::Filters::Smoothing::Gaussian::Enabled::yes,
    Zivid::Settings::Processing::Filters::Smoothing::Gaussian::Sigma{ 1.5 },
    Zivid::Settings::Processing::Filters::Reflection::Removal::Enabled::yes,
    Zivid::Settings::Processing::Filters::Reflection::Removal::Mode::global,
};
Zivid::Settings2D settings2D{ Zivid::Settings2D::Acquisitions{ Zivid::Settings2D::Acquisition{} },
                              Zivid::Settings2D::Processing::Color::Balance::Red{ 1 },
                              Zivid::Settings2D::Processing::Color::Balance::Green{ 1 },
                              Zivid::Settings2D::Processing::Color::Balance::Blue{ 1 } };

settings.color() = Zivid::Settings::Color{ settings2D };

C#

var settings2D = new Zivid.NET.Settings2D
{
    Acquisitions = { new Zivid.NET.Settings2D.Acquisition { } },
    Processing =
    {
        Color =
        {
            Balance = { Red = 1.0, Green = 1.0, Blue = 1.0 }
        }
    }
};
var settings = new Zivid.NET.Settings
{
    Acquisitions = { new Zivid.NET.Settings.Acquisition { } },
    Processing =
    {
        Filters =
        {
            Smoothing =
            {
                Gaussian = { Enabled = true, Sigma = 1.5 }
            },
            Reflection =
            {
                Removal = { Enabled = true, Mode = ReflectionFilterModeOption.Global}
            }
        }
    }
};
settings.Color = settings2D;

Python

settings = zivid.Settings()
settings.acquisitions.append(zivid.Settings.Acquisition())
settings.processing.filters.smoothing.gaussian.enabled = True
settings.processing.filters.smoothing.gaussian.sigma = 1.5
settings.processing.filters.reflection.removal.enabled = True
settings.processing.filters.reflection.removal.mode = "global"

settings_2d = zivid.Settings2D()
settings_2d.acquisitions.append(zivid.Settings2D.Acquisition())
settings_2d.processing.color.balance.blue = 1.0
settings_2d.processing.color.balance.green = 1.0
settings_2d.processing.color.balance.red = 1.0

settings.color = settings_2d

You can read more about the file camera option in File Camera.

Configure

As with all cameras, there are settings that can be configured.

Presets

We recommend using the Presets available in Zivid Studio and as .yml files (see below). The presets work well for most scenarios and are therefore a great starting point. If needed, you can easily fine-tune the settings for better results. You can edit the YAML files with any text editor, or write the settings in code.

Load

You can export camera settings from Zivid Studio to .yml files. These files can be loaded and applied through the API.

C++

const auto settingsFile = "Settings.yml";
std::cout << "Loading settings from file: " << settingsFile << std::endl;
const auto settingsFromFile = Zivid::Settings(settingsFile);

C#

var settingsFile = "Settings.yml";
Console.WriteLine("Loading settings from file: " + settingsFile);
var settingsFromFile = new Zivid.NET.Settings(settingsFile);

Python

settings_file = "Settings.yml"
print(f"Loading settings from file: {settings_file}")
settings_from_file = zivid.Settings.load(settings_file)
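
The loaded settings can then be passed straight to a capture call, for example (a sketch assuming the Python API used above, and that the exported file contains both the 3D and the color settings):

frame = camera.capture_2d_3d(settings_from_file)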

Save

You can also save the settings to a .yml file.

C++

const auto settingsFile = "Settings.yml";
std::cout << "Saving settings to file: " << settingsFile << std::endl;
settings.save(settingsFile);

C#

var settingsFile = "Settings.yml";
Console.WriteLine("Saving settings to file: " + settingsFile);
settings.Save(settingsFile);

Python

settings_file = "Settings.yml"
print(f"Saving settings to file: {settings_file}")
settings.save(settings_file)

Manual configuration

Another option is to configure the settings manually. For more information about what each setting does, see Camera Settings. The next step after that is Capturing High Quality Point Clouds.

Single 2D and 3D Acquisition - Default settings

We can create settings for a single acquisition capture.

C++

const auto settings =
    Zivid::Settings{ Zivid::Settings::Acquisitions{ Zivid::Settings::Acquisition{} },
                     Zivid::Settings::Color{ Zivid::Settings2D{
                         Zivid::Settings2D::Acquisitions{ Zivid::Settings2D::Acquisition{} } } } };

C#

var settings = new Zivid.NET.Settings
{
    Acquisitions = { new Zivid.NET.Settings.Acquisition { } },
    Color = new Zivid.NET.Settings2D { Acquisitions = { new Zivid.NET.Settings2D.Acquisition { } } }
};

Python

settings = zivid.Settings(
    acquisitions=[zivid.Settings.Acquisition()],
    color=zivid.Settings2D(acquisitions=[zivid.Settings2D.Acquisition()]),
)

Multi Acquisition HDR

We can also create settings to be used in a multi-acquisition HDR capture.

C++

Zivid::Settings settings;
for(const auto aperture : { 5.66, 4.00, 2.59 })
{
    std::cout << "Adding acquisition with aperture = " << aperture << std::endl;
    const auto acquisitionSettings = Zivid::Settings::Acquisition{
        Zivid::Settings::Acquisition::Aperture{ aperture },
    };
    settings.acquisitions().emplaceBack(acquisitionSettings);
}

C#

var settings = new Zivid.NET.Settings();
foreach (var aperture in new double[] { 5.66, 4.00, 2.59 })
{
    Console.WriteLine("Adding acquisition with aperture = " + aperture);
    var acquisitionSettings = new Zivid.NET.Settings.Acquisition { Aperture = aperture };
    settings.Acquisitions.Add(acquisitionSettings);
}
settings.Color = new Zivid.NET.Settings2D { Acquisitions = { new Zivid.NET.Settings2D.Acquisition { } } };

Python

settings = zivid.Settings(acquisitions=[zivid.Settings.Acquisition(aperture=fnum) for fnum in (5.66, 4.00, 2.83)])

A complete settings configuration is shown below.

C++

std::cout << "Configuring settings for capture:" << std::endl;
Zivid::Settings2D settings2D{
    Zivid::Settings2D::Sampling::Color::rgb,
    Zivid::Settings2D::Sampling::Pixel::all,

    Zivid::Settings2D::Processing::Color::Balance::Blue{ 1.0 },
    Zivid::Settings2D::Processing::Color::Balance::Green{ 1.0 },
    Zivid::Settings2D::Processing::Color::Balance::Red{ 1.0 },
    Zivid::Settings2D::Processing::Color::Gamma{ 1.0 },

    Zivid::Settings2D::Processing::Color::Experimental::Mode::automatic,
};

Zivid::Settings settings{
    Zivid::Settings::Color{ settings2D },

    Zivid::Settings::Engine::phase,

    Zivid::Settings::RegionOfInterest::Box::Enabled::yes,
    Zivid::Settings::RegionOfInterest::Box::PointO{ 1000, 1000, 1000 },
    Zivid::Settings::RegionOfInterest::Box::PointA{ 1000, -1000, 1000 },
    Zivid::Settings::RegionOfInterest::Box::PointB{ -1000, 1000, 1000 },
    Zivid::Settings::RegionOfInterest::Box::Extents{ -1000, 1000 },

    Zivid::Settings::RegionOfInterest::Depth::Enabled::yes,
    Zivid::Settings::RegionOfInterest::Depth::Range{ 200, 2000 },

    Zivid::Settings::Processing::Filters::Cluster::Removal::Enabled::yes,
    Zivid::Settings::Processing::Filters::Cluster::Removal::MaxNeighborDistance{ 10 },
    Zivid::Settings::Processing::Filters::Cluster::Removal::MinArea{ 100 },

    Zivid::Settings::Processing::Filters::Hole::Repair::Enabled::yes,
    Zivid::Settings::Processing::Filters::Hole::Repair::HoleSize{ 0.2 },
    Zivid::Settings::Processing::Filters::Hole::Repair::Strictness{ 1 },

    Zivid::Settings::Processing::Filters::Noise::Removal::Enabled::yes,
    Zivid::Settings::Processing::Filters::Noise::Removal::Threshold{ 7.0 },

    Zivid::Settings::Processing::Filters::Noise::Suppression::Enabled::yes,
    Zivid::Settings::Processing::Filters::Noise::Repair::Enabled::yes,

    Zivid::Settings::Processing::Filters::Outlier::Removal::Enabled::yes,
    Zivid::Settings::Processing::Filters::Outlier::Removal::Threshold{ 5.0 },

    Zivid::Settings::Processing::Filters::Reflection::Removal::Enabled::yes,
    Zivid::Settings::Processing::Filters::Reflection::Removal::Mode::global,

    Zivid::Settings::Processing::Filters::Smoothing::Gaussian::Enabled::yes,
    Zivid::Settings::Processing::Filters::Smoothing::Gaussian::Sigma{ 1.5 },

    Zivid::Settings::Processing::Filters::Experimental::ContrastDistortion::Correction::Enabled::yes,
    Zivid::Settings::Processing::Filters::Experimental::ContrastDistortion::Correction::Strength{ 0.4 },

    Zivid::Settings::Processing::Filters::Experimental::ContrastDistortion::Removal::Enabled::no,
    Zivid::Settings::Processing::Filters::Experimental::ContrastDistortion::Removal::Threshold{ 0.5 },

    Zivid::Settings::Processing::Resampling::Mode::upsample2x2,

    Zivid::Settings::Diagnostics::Enabled::no,
};

setSamplingPixel(settings, camera);
std::cout << settings << std::endl;
std::cout << "Configuring base acquisition with settings same for all HDR acquisition:" << std::endl;
const auto baseAcquisition = Zivid::Settings::Acquisition{};
std::cout << baseAcquisition << std::endl;
const auto baseAcquisition2D = Zivid::Settings2D::Acquisition{};

std::cout << "Configuring acquisition settings different for all HDR acquisitions" << std::endl;
auto exposureValues = getExposureValues(camera);
const std::vector<double> aperture = std::get<0>(exposureValues);
const std::vector<double> gain = std::get<1>(exposureValues);
const std::vector<std::chrono::microseconds> exposureTime = std::get<2>(exposureValues);
const std::vector<double> brightness = std::get<3>(exposureValues);
for(size_t i = 0; i < aperture.size(); ++i)
{
    std::cout << "Acquisition " << i + 1 << ":" << std::endl;
    std::cout << "  Exposure Time: " << exposureTime.at(i).count() << std::endl;
    std::cout << "  Aperture: " << aperture.at(i) << std::endl;
    std::cout << "  Gain: " << gain.at(i) << std::endl;
    std::cout << "  Brightness: " << brightness.at(i) << std::endl;
    const auto acquisitionSettings = baseAcquisition.copyWith(
        Zivid::Settings::Acquisition::Aperture{ aperture.at(i) },
        Zivid::Settings::Acquisition::Gain{ gain.at(i) },
        Zivid::Settings::Acquisition::ExposureTime{ exposureTime.at(i) },
        Zivid::Settings::Acquisition::Brightness{ brightness.at(i) });
    settings.acquisitions().emplaceBack(acquisitionSettings);
}
const auto acquisitionSettings2D = baseAcquisition2D.copyWith(
    Zivid::Settings2D::Acquisition::Aperture{ 2.83 },
    Zivid::Settings2D::Acquisition::ExposureTime{ microseconds{ 10000 } },
    Zivid::Settings2D::Acquisition::Brightness{ 1.8 },
    Zivid::Settings2D::Acquisition::Gain{ 1.0 });
settings.color().value().acquisitions().emplaceBack(acquisitionSettings2D);

C#

Console.WriteLine("Configuring settings for capture:");
var settings2D = new Zivid.NET.Settings2D()
{
    Sampling =
    {
        Color = Zivid.NET.Settings2D.SamplingGroup.ColorOption.Rgb,
        Pixel = Zivid.NET.Settings2D.SamplingGroup.PixelOption.All,
    },
    Processing =
    {
        Color =
        {
            Balance =
            {
                Blue = 1.0,
                Green = 1.0,
                Red = 1.0,
            },
            Gamma = 1.0,
            Experimental = { Mode = Zivid.NET.Settings2D.ProcessingGroup.ColorGroup.ExperimentalGroup.ModeOption.Automatic },
        },
    },
};
var settings = new Zivid.NET.Settings()
{
    Engine = Zivid.NET.Settings.EngineOption.Phase,

    RegionOfInterest =
    {
        Box = {
            Enabled = true,
            PointO = new Zivid.NET.PointXYZ{ x = 1000, y = 1000, z = 1000 },
            PointA = new Zivid.NET.PointXYZ{ x = 1000, y = -1000, z = 1000 },
            PointB = new Zivid.NET.PointXYZ{ x = -1000, y = 1000, z = 1000 },
            Extents = new Zivid.NET.Range<double>(-1000, 1000),
        },
        Depth =
        {
            Enabled = true,
            Range = new Zivid.NET.Range<double>(200, 2000),
        },
    },
    Processing =
    {
        Filters =
        {
            Cluster =
            {
                Removal = { Enabled = true, MaxNeighborDistance = 10, MinArea = 100}
            },
            Hole =
            {
                Repair = { Enabled = true, HoleSize = 0.2, Strictness = 1 },
            },
            Noise =
            {
                Removal = { Enabled = true, Threshold = 7.0 },
                Suppression = { Enabled = true },
                Repair = { Enabled = true },
            },
            Outlier =
            {
                Removal = { Enabled = true, Threshold = 5.0 },
            },
            Reflection =
            {
                Removal = { Enabled = true, Mode = ReflectionFilterModeOption.Global },
            },
            Smoothing =
            {
                Gaussian = { Enabled = true, Sigma = 1.5 },
            },
            Experimental =
            {
                ContrastDistortion =
                {
                    Correction = { Enabled = true, Strength = 0.4 },
                    Removal = { Enabled = true, Threshold = 0.5 },
                },
            },
        },
        Resampling = { Mode = Zivid.NET.Settings.ProcessingGroup.ResamplingGroup.ModeOption.Upsample2x2 },
    },
    Diagnostics = { Enabled = false },
};

settings.Color = settings2D;

SetSamplingPixel(ref settings, camera);
Console.WriteLine(settings);
Console.WriteLine("Configuring base acquisition with settings same for all HDR acquisitions:");
var baseAcquisition = new Zivid.NET.Settings.Acquisition { };
Console.WriteLine(baseAcquisition);
var baseAcquisition2D = new Zivid.NET.Settings2D.Acquisition { };

Console.WriteLine("Configuring acquisition settings different for all HDR acquisitions:");
Tuple<double[], Duration[], double[], double[]> exposureValues = GetExposureValues(camera);
double[] aperture = exposureValues.Item1;
Duration[] exposureTime = exposureValues.Item2;
double[] gain = exposureValues.Item3;
double[] brightness = exposureValues.Item4;
for (int i = 0; i < aperture.Length; i++)
{
    Console.WriteLine("Acquisition {0}:", i + 1);
    Console.WriteLine("  Exposure Time: {0}", exposureTime[i].Microseconds);
    Console.WriteLine("  Aperture: {0}", aperture[i]);
    Console.WriteLine("  Gain: {0}", gain[i]);
    Console.WriteLine("  Brightness: {0}", brightness[i]);
    var acquisitionSettings = baseAcquisition.CopyWith(s =>
    {
        s.Aperture = aperture[i];
        s.ExposureTime = exposureTime[i];
        s.Gain = gain[i];
        s.Brightness = brightness[i];
    });
    settings.Acquisitions.Add(acquisitionSettings);
}
var acquisitionSettings2D = baseAcquisition2D.CopyWith(s =>
{
    s.Aperture = 2.83;
    s.ExposureTime = Duration.FromMicroseconds(1000);
    s.Gain = 1.0;
    s.Brightness = 1.8;
});
settings.Color.Acquisitions.Add(acquisitionSettings2D);

Python

print("Configuring settings for capture:")
settings_2d = zivid.Settings2D()

settings_2d.sampling.color = zivid.Settings2D.Sampling.Color.rgb
settings_2d.sampling.pixel = zivid.Settings2D.Sampling.Pixel.all

settings_2d.processing.color.balance.red = 1.0
settings_2d.processing.color.balance.blue = 1.0
settings_2d.processing.color.balance.green = 1.0
settings_2d.processing.color.gamma = 1.0

settings_2d.processing.color.experimental.mode = zivid.Settings2D.Processing.Color.Experimental.Mode.automatic

settings = zivid.Settings()
settings.engine = zivid.Settings.Engine.phase

settings.region_of_interest.box.enabled = True
settings.region_of_interest.box.point_o = [1000, 1000, 1000]
settings.region_of_interest.box.point_a = [1000, -1000, 1000]
settings.region_of_interest.box.point_b = [-1000, 1000, 1000]
settings.region_of_interest.box.extents = [-1000, 1000]

settings.region_of_interest.depth.enabled = True
settings.region_of_interest.depth.range = [200, 2000]

settings.processing.filters.cluster.removal.enabled = True
settings.processing.filters.cluster.removal.max_neighbor_distance = 10
settings.processing.filters.cluster.removal.min_area = 100

settings.processing.filters.hole.repair.enabled = True
settings.processing.filters.hole.repair.hole_size = 0.2
settings.processing.filters.hole.repair.strictness = 1

settings.processing.filters.noise.removal.enabled = True
settings.processing.filters.noise.removal.threshold = 7.0

settings.processing.filters.noise.suppression.enabled = True
settings.processing.filters.noise.repair.enabled = True

settings.processing.filters.outlier.removal.enabled = True
settings.processing.filters.outlier.removal.threshold = 5.0

settings.processing.filters.reflection.removal.enabled = True
settings.processing.filters.reflection.removal.mode = (
    zivid.Settings.Processing.Filters.Reflection.Removal.Mode.global_
)

settings.processing.filters.smoothing.gaussian.enabled = True
settings.processing.filters.smoothing.gaussian.sigma = 1.5

settings.processing.filters.experimental.contrast_distortion.correction.enabled = True
settings.processing.filters.experimental.contrast_distortion.correction.strength = 0.4

settings.processing.filters.experimental.contrast_distortion.removal.enabled = False
settings.processing.filters.experimental.contrast_distortion.removal.threshold = 0.5

settings.processing.resampling.mode = zivid.Settings.Processing.Resampling.Mode.upsample2x2

settings.diagnostics.enabled = False

settings.color = settings_2d

_set_sampling_pixel(settings, camera)
print(settings)
print("Configuring acquisition settings different for all HDR acquisitions")
exposure_values = _get_exposure_values(camera)
for aperture, gain, exposure_time, brightness in exposure_values:
    settings.acquisitions.append(
        zivid.Settings.Acquisition(
            aperture=aperture,
            exposure_time=exposure_time,
            brightness=brightness,
            gain=gain,
        )
    )

settings_2d.acquisitions.append(
    zivid.Settings2D.Acquisition(
        aperture=2.83,
        exposure_time=timedelta(microseconds=10000),
        brightness=1.8,
        gain=1.0,
    )
)

Capture 2D3D

Now we can capture 2D and 3D (a point cloud with color). Whether the capture uses a single acquisition or multiple acquisitions (HDR) is determined by the number of acquisitions in settings.

C++

const auto frame = camera.capture2D3D(settings);

C#

using (var frame = camera.Capture2D3D(settings))

Python

frame = camera.capture_2d_3d(settings)

Zivid::Frame (Zivid.NET.Frame in C#, zivid.Frame in Python) contains the point cloud, the color image, and the capture and camera information (all of which are stored in compute device memory).
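
As an illustration, here is a minimal Python sketch (assuming the zivid-python API used in the snippets above) of how these parts of the frame can be accessed; copy_data() is the call that copies the point cloud from compute device memory to host memory:

point_cloud = frame.point_cloud()  # handle to the point cloud, still in compute device memory
xyzrgba = point_cloud.copy_data("xyzrgba")  # copy points and colors to a NumPy array on the host
print(frame.info)  # capture information, such as the time stamp and software version
print(frame.camera_info)  # information about the camera that produced the frame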

Capture 3D

If we only want to capture 3D, that is, the point cloud without color, we can do so via the capture3D API.

C++

const auto frame3D = camera.capture3D(settings);

C#

using (var frame3D = camera.Capture3D(settings))

Python

frame_3d = camera.capture_3d(settings)

Capture 2D

If we only want to capture a 2D image, which is faster than 3D, we can do so via the capture2D API.

C++

const auto frame2D = camera.capture2D(settings);

C#

using (var frame2D = camera.Capture2D(settings))

Python

frame_2d = camera.capture_2d(settings)

Save

We can now save our results.

C++

const auto dataFile = "Frame.zdf";
frame.save(dataFile);

C#

var dataFile = "Frame.zdf";
frame.Save(dataFile);

Python

data_file = "Frame.zdf"
frame.save(data_file)

Tip

You can open and view the Frame.zdf file in Zivid Studio.

Export

In the next code example, the point cloud is exported to the .ply format. For other export options, see Point Cloud for a list of supported formats.

C++

const auto dataFilePLY = "PointCloud.ply";
frame.save(dataFilePLY);

C#

var dataFilePLY = "PointCloud.ply";
frame.Save(dataFilePLY);

Python

data_file_ply = "PointCloud.ply"
frame.save(data_file_ply)

Load

Once saved, the frame can be loaded from the ZDF file.

C++

const auto dataFile = std::string(ZIVID_SAMPLE_DATA_DIR) + "/Zivid3D.zdf";
std::cout << "Reading ZDF frame from file: " << dataFile << std::endl;
const auto frame = Zivid::Frame(dataFile);

C#

var dataFile =
    Environment.GetFolderPath(Environment.SpecialFolder.CommonApplicationData) + "/Zivid/Zivid3D.zdf";
Console.WriteLine("Reading ZDF frame from file: " + dataFile);

using (var frame = new Zivid.NET.Frame(dataFile))
{

Python

data_file = get_sample_data_path() / "Zivid3D.zdf"
print(f"Reading point cloud from file: {data_file}")

frame = zivid.Frame(data_file)

Save 2D

capture2D() returns a Frame2D object. Two color spaces are available for 2D images: linear RGB and sRGB. imageRGBA() returns the image in the linear RGB color space. If _SRGB is appended to the function name, the returned image is in the sRGB color space instead.

C++

const auto imageRGBA = frame2D.imageRGBA();

C#

var imageRGBA = frame2D.ImageRGBA();

Python

image_rgba = frame_2d.image_rgba()

C++

const auto imageSRGB = frame2D.imageRGBA_SRGB();

C#

var imageSRGB = frame2D.ImageRGBA_SRGB();

Python

image_srgb = frame_2d.image_rgba_srgb()

Then we can save the 2D image in the linear RGB or sRGB color space.

C++

const auto imageFile = "ImageRGBA_linear.png";
std::cout << "Saving 2D color image (Linear RGB) to file: " << imageFile << std::endl;
imageRGBA.save(imageFile);

C#

var imageFile = "ImageRGBA_linear.png";
Console.WriteLine($"Saving 2D color image (Linear RGB) to file: {imageFile}");
imageRGBA.Save(imageFile);

Python

image_file = "ImageRGBA_linear.png"
print(f"Saving 2D color image (sRGB color space) to file: {image_file}")
image_rgba.save(image_file)

C++

const auto imageFile = "ImageRGBA_sRGB.png";
std::cout << "Saving 2D color image (sRGB color space) to file: " << imageFile << std::endl;
imageSRGB.save(imageFile);

C#

var imageFile = "ImageRGBA_sRGB.png";
Console.WriteLine($"Saving 2D color image (sRGB color space) to file: {imageFile}");
imageSRGB.Save(imageFile);

Python

image_file = "ImageRGBA_sRGB.png"
print(f"Saving 2D color image (sRGB color space) to file: {image_file}")
image_srgb.save(image_file)

We can get the 2D color image directly from the point cloud. This image has the same resolution as the point cloud and is in the sRGB color space.

const auto pointCloud = frame.pointCloud();
const auto image2DInPointCloudResolution = pointCloud.copyImageRGBA_SRGB();
var pointCloud = frame.PointCloud;
var image2DInPointCloudResolution = pointCloud.CopyImageRGBA_SRGB();
point_cloud = frame.point_cloud()
image_2d_in_point_cloud_resolution = point_cloud.copy_image("bgra_srgb")

We can get the 2D color image from the Frame2D, which is part of the Frame object obtained from capture2D3D(). The resolution of this image is given by the 2D settings within the 2D3D settings.

const auto image2D = frame.frame2D().value().imageBGRA_SRGB();
var image2D = frame.Frame2D.ImageBGRA_SRGB();
image_2d = frame.frame_2d().image_bgra_srgb()

Multithreading

Operations on the camera object are thread-safe, but other operations, such as listing cameras and connecting to cameras, should be executed sequentially. You can find more information in Multithreading.
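
To make this concrete, here is a minimal Python sketch (not part of the original tutorial) of the pattern described above; it assumes the zivid-python API used in the snippets in this tutorial, plus Camera.connect() for connecting to a listed camera. Cameras are listed and connected sequentially in the main thread, and the captures on the already-connected camera objects then run in parallel threads.

import threading

import zivid


def capture_and_save(camera, settings, file_name):
    # Operations on an already-connected camera object are thread-safe.
    frame = camera.capture_2d_3d(settings)
    frame.save(file_name)


app = zivid.Application()
settings = zivid.Settings(
    acquisitions=[zivid.Settings.Acquisition()],
    color=zivid.Settings2D(acquisitions=[zivid.Settings2D.Acquisition()]),
)

# Listing and connecting should be done sequentially, in a single thread.
cameras = app.cameras()
for camera in cameras:
    camera.connect()

# Capturing from different connected cameras can run in parallel.
threads = [
    threading.Thread(target=capture_and_save, args=(camera, settings, f"Frame_{i}.zdf"))
    for i, camera in enumerate(cameras)
]
for thread in threads:
    thread.start()
for thread in threads:
    thread.join()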

Conclusion

This tutorial showed how to use the Zivid SDK to connect to, configure, and capture with the camera, and how to save the result to file.