Capture Tutorial
Introduction
This tutorial describes how to use the Zivid SDK to capture point clouds and 2D images.
Prerequisites
Install the Zivid Software.
For Python: install zivid-python.
Initialization
Calling any of the APIs in the Zivid SDK requires initializing the Zivid application and keeping it alive while the program runs.
Note
Zivid::Application must be kept alive while operating the Zivid camera. This is essentially the Zivid driver.
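In C++, this amounts to constructing a single application object at startup; a minimal sketch (assuming the Zivid headers are included) looks like this:
// Keep this object alive for as long as any Zivid camera is in use.
Zivid::Application zivid;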
Connect
Now we can connect to the camera.
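For example, a minimal C++ sketch of connecting to the first available camera (connectCamera() without arguments picks an available camera):
auto camera = zivid.connectCamera();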
Specific Camera
When multiple cameras are connected to the same computer, you can select a specific camera in code by using its serial number.
auto camera = zivid.connectCamera(Zivid::CameraInfo::SerialNumber{ "2020C0DE" });
var camera = zivid.ConnectCamera(new Zivid.NET.CameraInfo.SerialNumber("2020C0DE"));
camera = app.connect_camera(serial_number="2020C0DE")
Note
The serial number of the camera is shown in Zivid Studio.
You can also list all cameras connected to the computer and view their serial numbers as follows:
auto cameras = zivid.cameras();
std::cout << "Found " << cameras.size() << " cameras" << std::endl;
for(auto &camera : cameras)
{
std::cout << "Camera Info: " << camera.info() << std::endl;
std::cout << "Camera State: " << camera.state() << std::endl;
}
Configure
As with all cameras, there are settings that can be configured.
Presets
The recommendation is to use the Presets available in Zivid Studio and as .yml files (see below). Presets cover the most common use cases and are therefore a good starting point. If needed, you can easily fine-tune the settings for better results. You can edit the YAML files in any text editor or write the settings in code manually.
Load
You can export camera settings from Zivid Studio to .yml files. These files can be loaded and applied in the API.
const auto settingsFile = "Settings.yml";
std::cout << "Loading settings from file: " << settingsFile << std::endl;
const auto settingsFromFile = Zivid::Settings(settingsFile);
Save
You can also save settings to a .yml file.
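For example, in C++ the settings object can be written back to a file (a minimal sketch; the file name is arbitrary):
const auto settingsFile = "Settings.yml";
std::cout << "Saving settings to file: " << settingsFile << std::endl;
settingsFromFile.save(settingsFile);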
Manual configuration
Another option is to configure the settings manually. For more information about what each setting does, see Camera Settings. Then, the next step is Capturing High Quality Point Clouds.
Single 2D and 3D Acquisition - Default settings
We can create settings for a single acquisition capture.
const auto settings =
Zivid::Settings{ Zivid::Settings::Acquisitions{ Zivid::Settings::Acquisition{} },
Zivid::Settings::Color{ Zivid::Settings2D{
Zivid::Settings2D::Acquisitions{ Zivid::Settings2D::Acquisition{} } } } };
Multi Acquisition HDR
We may also create settings with multiple acquisitions for an HDR capture.
using std::chrono::microseconds;
Zivid::Settings settings;
for(const auto exposure : { microseconds{ 1000 }, microseconds{ 10000 } })
{
std::cout << "Adding acquisition with exposure time of " << exposure.count() << " microseconds "
<< std::endl;
const auto acquisitionSettings = Zivid::Settings::Acquisition{
Zivid::Settings::Acquisition::ExposureTime{ exposure },
};
settings.acquisitions().emplaceBack(acquisitionSettings);
}
var settings = new Zivid.NET.Settings();
foreach (var exposure in new Duration[] { Duration.FromMicroseconds(1000), Duration.FromMicroseconds(10000) })
{
Console.WriteLine("Adding acquisition with exposure time of " + exposure.Microseconds + " microseconds");
var acquisitionSettings = new Zivid.NET.Settings.Acquisition { ExposureTime = exposure };
settings.Acquisitions.Add(acquisitionSettings);
}
settings.Color = new Zivid.NET.Settings2D { Acquisitions = { new Zivid.NET.Settings2D.Acquisition { } } };
settings = zivid.Settings()
for exposure in [1000, 10000]:
print(f"Adding acquisition with exposure time of {exposure} microseconds")
settings.acquisitions.append(
zivid.Settings.Acquisition(exposure_time=datetime.timedelta(microseconds=exposure))
)
Fully Configured Settings
2D Settings, such as color balance and gamma, configured manually:
std::cout << "Configuring settings for capture:" << std::endl;
Zivid::Settings2D settings2D{
Zivid::Settings2D::Sampling::Color::rgb,
Zivid::Settings2D::Sampling::Pixel::all,
Zivid::Settings2D::Sampling::Interval::Enabled::no,
Zivid::Settings2D::Sampling::Interval::Duration{ microseconds{ 10000 } },
Zivid::Settings2D::Processing::Color::Balance::Blue{ 1.0 },
Zivid::Settings2D::Processing::Color::Balance::Green{ 1.0 },
Zivid::Settings2D::Processing::Color::Balance::Red{ 1.0 },
Zivid::Settings2D::Processing::Color::Gamma{ 1.0 },
Zivid::Settings2D::Processing::Color::Experimental::Mode::automatic,
};
Console.WriteLine("Configuring settings for capture:");
var settings2D = new Zivid.NET.Settings2D()
{
Sampling =
{
Color = Zivid.NET.Settings2D.SamplingGroup.ColorOption.Rgb,
Pixel = Zivid.NET.Settings2D.SamplingGroup.PixelOption.All,
Interval =
{
Enabled = false,
Duration = Duration.FromMicroseconds(10000),
},
},
Processing =
{
Color =
{
Balance =
{
Blue = 1.0,
Green = 1.0,
Red = 1.0,
},
Gamma = 1.0,
Experimental = { Mode = Zivid.NET.Settings2D.ProcessingGroup.ColorGroup.ExperimentalGroup.ModeOption.Automatic },
},
},
};
print("Configuring settings for capture:")
settings_2d = zivid.Settings2D()
settings_2d.sampling.color = zivid.Settings2D.Sampling.Color.rgb
settings_2d.sampling.pixel = zivid.Settings2D.Sampling.Pixel.all
settings_2d.sampling.interval.enabled = False
settings_2d.sampling.interval.duration = timedelta(microseconds=10000)
settings_2d.processing.color.balance.red = 1.0
settings_2d.processing.color.balance.blue = 1.0
settings_2d.processing.color.balance.green = 1.0
settings_2d.processing.color.gamma = 1.0
settings_2d.processing.color.experimental.mode = zivid.Settings2D.Processing.Color.Experimental.Mode.automatic
Manually configured 3D settings such as engine, region of interest, filter settings and more:
Zivid::Settings settings{
Zivid::Settings::Color{ settings2D },
Zivid::Settings::Engine::stripe,
Zivid::Settings::RegionOfInterest::Box::Enabled::yes,
Zivid::Settings::RegionOfInterest::Box::PointO{ 1000, 1000, 1000 },
Zivid::Settings::RegionOfInterest::Box::PointA{ 1000, -1000, 1000 },
Zivid::Settings::RegionOfInterest::Box::PointB{ -1000, 1000, 1000 },
Zivid::Settings::RegionOfInterest::Box::Extents{ -1000, 1000 },
Zivid::Settings::RegionOfInterest::Depth::Enabled::yes,
Zivid::Settings::RegionOfInterest::Depth::Range{ 200, 2000 },
Zivid::Settings::Processing::Filters::Cluster::Removal::Enabled::yes,
Zivid::Settings::Processing::Filters::Cluster::Removal::MaxNeighborDistance{ 10 },
Zivid::Settings::Processing::Filters::Cluster::Removal::MinArea{ 100 },
Zivid::Settings::Processing::Filters::Hole::Repair::Enabled::yes,
Zivid::Settings::Processing::Filters::Hole::Repair::HoleSize{ 0.2 },
Zivid::Settings::Processing::Filters::Hole::Repair::Strictness{ 1 },
Zivid::Settings::Processing::Filters::Noise::Removal::Enabled::yes,
Zivid::Settings::Processing::Filters::Noise::Removal::Threshold{ 7.0 },
Zivid::Settings::Processing::Filters::Noise::Suppression::Enabled::yes,
Zivid::Settings::Processing::Filters::Noise::Repair::Enabled::yes,
Zivid::Settings::Processing::Filters::Outlier::Removal::Enabled::yes,
Zivid::Settings::Processing::Filters::Outlier::Removal::Threshold{ 5.0 },
Zivid::Settings::Processing::Filters::Reflection::Removal::Enabled::yes,
Zivid::Settings::Processing::Filters::Reflection::Removal::Mode::global,
Zivid::Settings::Processing::Filters::Smoothing::Gaussian::Enabled::yes,
Zivid::Settings::Processing::Filters::Smoothing::Gaussian::Sigma{ 1.5 },
Zivid::Settings::Processing::Filters::Experimental::ContrastDistortion::Correction::Enabled::yes,
Zivid::Settings::Processing::Filters::Experimental::ContrastDistortion::Correction::Strength{ 0.4 },
Zivid::Settings::Processing::Filters::Experimental::ContrastDistortion::Removal::Enabled::no,
Zivid::Settings::Processing::Filters::Experimental::ContrastDistortion::Removal::Threshold{ 0.5 },
Zivid::Settings::Processing::Resampling::Mode::upsample2x2,
Zivid::Settings::Diagnostics::Enabled::no,
};
setSamplingPixel(settings, camera);
std::cout << settings << std::endl;
var settings = new Zivid.NET.Settings()
{
Engine = Zivid.NET.Settings.EngineOption.Stripe,
RegionOfInterest =
{
Box = {
Enabled = true,
PointO = new Zivid.NET.PointXYZ{ x = 1000, y = 1000, z = 1000 },
PointA = new Zivid.NET.PointXYZ{ x = 1000, y = -1000, z = 1000 },
PointB = new Zivid.NET.PointXYZ{ x = -1000, y = 1000, z = 1000 },
Extents = new Zivid.NET.Range<double>(-1000, 1000),
},
Depth =
{
Enabled = true,
Range = new Zivid.NET.Range<double>(200, 2000),
},
},
Processing =
{
Filters =
{
Cluster =
{
Removal = { Enabled = true, MaxNeighborDistance = 10, MinArea = 100}
},
Hole =
{
Repair = { Enabled = true, HoleSize = 0.2, Strictness = 1 },
},
Noise =
{
Removal = { Enabled = true, Threshold = 7.0 },
Suppression = { Enabled = true },
Repair = { Enabled = true },
},
Outlier =
{
Removal = { Enabled = true, Threshold = 5.0 },
},
Reflection =
{
Removal = { Enabled = true, Mode = ReflectionFilterModeOption.Global },
},
Smoothing =
{
Gaussian = { Enabled = true, Sigma = 1.5 },
},
Experimental =
{
ContrastDistortion =
{
Correction = { Enabled = true, Strength = 0.4 },
Removal = { Enabled = true, Threshold = 0.5 },
},
},
},
Resampling = { Mode = Zivid.NET.Settings.ProcessingGroup.ResamplingGroup.ModeOption.Upsample2x2 },
},
Diagnostics = { Enabled = false },
};
settings.Color = settings2D;
SetSamplingPixel(ref settings, camera);
Console.WriteLine(settings);
settings = zivid.Settings()
settings.engine = zivid.Settings.Engine.stripe
settings.region_of_interest.box.enabled = True
settings.region_of_interest.box.point_o = [1000, 1000, 1000]
settings.region_of_interest.box.point_a = [1000, -1000, 1000]
settings.region_of_interest.box.point_b = [-1000, 1000, 1000]
settings.region_of_interest.box.extents = [-1000, 1000]
settings.region_of_interest.depth.enabled = True
settings.region_of_interest.depth.range = [200, 2000]
settings.processing.filters.cluster.removal.enabled = True
settings.processing.filters.cluster.removal.max_neighbor_distance = 10
settings.processing.filters.cluster.removal.min_area = 100
settings.processing.filters.hole.repair.enabled = True
settings.processing.filters.hole.repair.hole_size = 0.2
settings.processing.filters.hole.repair.strictness = 1
settings.processing.filters.noise.removal.enabled = True
settings.processing.filters.noise.removal.threshold = 7.0
settings.processing.filters.noise.suppression.enabled = True
settings.processing.filters.noise.repair.enabled = True
settings.processing.filters.outlier.removal.enabled = True
settings.processing.filters.outlier.removal.threshold = 5.0
settings.processing.filters.reflection.removal.enabled = True
settings.processing.filters.reflection.removal.mode = (
zivid.Settings.Processing.Filters.Reflection.Removal.Mode.global_
)
settings.processing.filters.smoothing.gaussian.enabled = True
settings.processing.filters.smoothing.gaussian.sigma = 1.5
settings.processing.filters.experimental.contrast_distortion.correction.enabled = True
settings.processing.filters.experimental.contrast_distortion.correction.strength = 0.4
settings.processing.filters.experimental.contrast_distortion.removal.enabled = False
settings.processing.filters.experimental.contrast_distortion.removal.threshold = 0.5
settings.processing.resampling.mode = zivid.Settings.Processing.Resampling.Mode.upsample2x2
settings.diagnostics.enabled = False
settings.color = settings_2d
_set_sampling_pixel(settings, camera)
print(settings)
Different values per acquisition are also possible:
std::cout << "Configuring acquisition settings different for all HDR acquisitions" << std::endl;
const auto baseAcquisition = Zivid::Settings::Acquisition{};
std::cout << baseAcquisition << std::endl;
auto exposureValues = getExposureValues(camera);
const std::vector<double> aperture = std::get<0>(exposureValues);
const std::vector<double> gain = std::get<1>(exposureValues);
const std::vector<std::chrono::microseconds> exposureTime = std::get<2>(exposureValues);
const std::vector<double> brightness = std::get<3>(exposureValues);
for(size_t i = 0; i < aperture.size(); ++i)
{
std::cout << "Acquisition " << i + 1 << ":" << std::endl;
std::cout << " Exposure Time: " << exposureTime.at(i).count() << std::endl;
std::cout << " Aperture: " << aperture.at(i) << std::endl;
std::cout << " Gain: " << gain.at(i) << std::endl;
std::cout << " Brightness: " << brightness.at(i) << std::endl;
const auto acquisitionSettings = baseAcquisition.copyWith(
Zivid::Settings::Acquisition::Aperture{ aperture.at(i) },
Zivid::Settings::Acquisition::Gain{ gain.at(i) },
Zivid::Settings::Acquisition::ExposureTime{ exposureTime.at(i) },
Zivid::Settings::Acquisition::Brightness{ brightness.at(i) });
settings.acquisitions().emplaceBack(acquisitionSettings);
}
const auto aquisitionSettings2D = makeSettings2D(camera).acquisitions();
settings.color().value().set(aquisitionSettings2D);
Console.WriteLine("Configuring acquisition settings different for all HDR acquisitions:");
var baseAcquisition = new Zivid.NET.Settings.Acquisition { };
Console.WriteLine(baseAcquisition);
var baseAcquisition2D = new Zivid.NET.Settings2D.Acquisition { };
Tuple<double[], Duration[], double[], double[]> exposureValues = GetExposureValues(camera);
double[] aperture = exposureValues.Item1;
Duration[] exposureTime = exposureValues.Item2;
double[] gain = exposureValues.Item3;
double[] brightness = exposureValues.Item4;
for (int i = 0; i < aperture.Length; i++)
{
Console.WriteLine("Acquisition {0}:", i + 1);
Console.WriteLine(" Exposure Time: {0}", exposureTime[i].Microseconds);
Console.WriteLine(" Aperture: {0}", aperture[i]);
Console.WriteLine(" Gain: {0}", gain[i]);
Console.WriteLine(" Brightness: {0}", brightness[i]);
var acquisitionSettings = baseAcquisition.CopyWith(s =>
{
s.Aperture = aperture[i];
s.ExposureTime = exposureTime[i];
s.Gain = gain[i];
s.Brightness = brightness[i];
});
settings.Acquisitions.Add(acquisitionSettings);
}
var aquisitionSettings2D = MakeSettings2D(camera);
settings.Color.Acquisitions = aquisitionSettings2D.Acquisitions;
print("Configuring acquisition settings different for all HDR acquisitions")
exposure_values = _get_exposure_values(camera)
for aperture, gain, exposure_time, brightness in exposure_values:
settings.acquisitions.append(
zivid.Settings.Acquisition(
aperture=aperture,
exposure_time=exposure_time,
brightness=brightness,
gain=gain,
)
)
acquisition_settings_2d = make_settings_2d(camera).Acquisition()
settings.color.acquisitions.append(acquisition_settings_2d)
Capture 2D3D
Now we can capture 2D and 3D (a point cloud with color). Whether a single acquisition or multiple acquisitions (HDR) is performed is determined by the number of acquisitions in settings.
The Zivid::Frame contains the point cloud, the color image, the capture, and the camera information (all of this data is stored on the compute device memory).
The Zivid.NET.Frame contains the point cloud, the color image, the capture, and the camera information (all of this data is stored on the compute device memory).
The zivid.Frame contains the point cloud, the color image, the capture, and the camera information (all of this data is stored on the compute device memory).
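A minimal C++ sketch of such a capture, assuming the camera and settings objects configured above:
// Capture a 2D3D frame (point cloud with color) using the configured settings.
const auto frame = camera.capture2D3D(settings);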
Capture 3D
If we only want to capture 3D, that is, the point cloud without color, we can do so via the capture3D API.
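For example, a minimal C++ sketch (assuming the same camera and settings as above):
// Capture only the 3D data, i.e. the point cloud without color.
const auto frame3D = camera.capture3D(settings);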
Capture 2D
If we only want to capture a 2D image, which is faster than capturing 3D, we can do so via the capture2D API.
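For example, a minimal C++ sketch, assuming capture2D accepts the combined 2D3D settings and uses their 2D (color) part:
// Capture only a 2D color image.
const auto frame2D = camera.capture2D(settings);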
Save
We can now save our results.
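For example, saving the frame to a ZDF file in C++ (a minimal sketch; the file name is arbitrary):
const auto dataFile = "Frame.zdf";
std::cout << "Saving frame to file: " << dataFile << std::endl;
frame.save(dataFile);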
Tip
You can open and view the Frame.zdf file in Zivid Studio.
Export
In the next code example, the point cloud is exported to the .ply format. For other export options, see Point Cloud for a list of supported formats.
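A minimal C++ sketch of such an export, assuming the frame from the capture above (the file name is arbitrary):
const auto dataFilePLY = "PointCloud.ply";
std::cout << "Exporting point cloud to file: " << dataFilePLY << std::endl;
frame.save(dataFilePLY);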
Load
Once saved, the frame can be loaded from a ZDF file.
const auto dataFile = std::string(ZIVID_SAMPLE_DATA_DIR) + "/Zivid3D.zdf";
std::cout << "Reading ZDF frame from file: " << dataFile << std::endl;
const auto frame = Zivid::Frame(dataFile);
Save 2D
The capture2D() function returns a Frame2D object. There are two color spaces available for 2D images: linear RGB and sRGB. imageRGBA() returns the image in the linear RGB color space. If _SRGB is appended to the function name, the returned image is in the sRGB color space instead.
We can then save the 2D image in the linear RGB or the sRGB color space.
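The snippets below assume that the two images have first been retrieved from the Frame2D, for example like this in C++:
const auto imageRGBA = frame2D.imageRGBA();
const auto imageSRGB = frame2D.imageRGBA_SRGB();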
const auto imageFile = "ImageRGBA_linear.png"; std::cout << "Saving 2D color image (Linear RGB) to file: " << imageFile << std::endl; imageRGBA.save(imageFile);
const auto imageFile = "ImageRGBA_sRGB.png"; std::cout << "Saving 2D color image (sRGB color space) to file: " << imageFile << std::endl; imageSRGB.save(imageFile);
We can get the 2D color image directly from the point cloud. This image will have the same resolution as the point cloud and will be in the sRGB color space.
const auto pointCloud = frame.pointCloud();
const auto image2DInPointCloudResolution = pointCloud.copyImageRGBA_SRGB();
var pointCloud = frame.PointCloud;
var image2DInPointCloudResolution = pointCloud.CopyImageRGBA_SRGB();
point_cloud = frame.point_cloud()
image_2d_in_point_cloud_resolution = point_cloud.copy_image("bgra_srgb")
We can get the 2D color image from Frame2D, which is part of the Frame object obtained from capture2D3D(). This image has the resolution given by the 2D settings inside the 2D3D settings.
const auto image2D = frame.frame2D().value().imageBGRA_SRGB();
var image2D = frame.Frame2D.ImageBGRA_SRGB();
image_2d = frame.frame_2d().image_bgra_srgb()
File Camera
A file camera allows you to experiment with the SDK without access to a physical camera. The file cameras can be found in Sample Data, where there are multiple file cameras to choose from.
const auto fileCamera =
userInput ? fileCameraPath : std::string(ZIVID_SAMPLE_DATA_DIR) + "/FileCameraZivid2PlusMR60.zfc";
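The file camera is then created from this file instead of connecting to a physical camera; a minimal C++ sketch:
auto camera = zivid.createFileCamera(fileCamera);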
The acquisition settings should be initialized as shown below, but you are free to alter the processing settings.
Zivid::Settings settings{
Zivid::Settings::Acquisitions{ Zivid::Settings::Acquisition{} },
Zivid::Settings::Processing::Filters::Smoothing::Gaussian::Enabled::yes,
Zivid::Settings::Processing::Filters::Smoothing::Gaussian::Sigma{ 1.5 },
Zivid::Settings::Processing::Filters::Reflection::Removal::Enabled::yes,
Zivid::Settings::Processing::Filters::Reflection::Removal::Mode::global,
};
Zivid::Settings2D settings2D{ Zivid::Settings2D::Acquisitions{ Zivid::Settings2D::Acquisition{} },
Zivid::Settings2D::Processing::Color::Balance::Red{ 1 },
Zivid::Settings2D::Processing::Color::Balance::Green{ 1 },
Zivid::Settings2D::Processing::Color::Balance::Blue{ 1 } };
settings.color() = Zivid::Settings::Color{ settings2D };
var settings2D = new Zivid.NET.Settings2D
{
Acquisitions = { new Zivid.NET.Settings2D.Acquisition { } },
Processing =
{
Color =
{
Balance = { Red = 1.0, Green = 1.0, Blue = 1.0 }
}
}
};
var settings = new Zivid.NET.Settings
{
Acquisitions = { new Zivid.NET.Settings.Acquisition { } },
Processing =
{
Filters =
{
Smoothing =
{
Gaussian = { Enabled = true, Sigma = 1.5 }
},
Reflection =
{
Removal = { Enabled = true, Mode = ReflectionFilterModeOption.Global}
}
}
}
};
settings.Color = settings2D;
settings = zivid.Settings()
settings.acquisitions.append(zivid.Settings.Acquisition())
settings.processing.filters.smoothing.gaussian.enabled = True
settings.processing.filters.smoothing.gaussian.sigma = 1.5
settings.processing.filters.reflection.removal.enabled = True
settings.processing.filters.reflection.removal.mode = "global"
settings_2d = zivid.Settings2D()
settings_2d.acquisitions.append(zivid.Settings2D.Acquisition())
settings_2d.processing.color.balance.blue = 1.0
settings_2d.processing.color.balance.green = 1.0
settings_2d.processing.color.balance.red = 1.0
settings.color = settings_2d
You can read more about the file camera options in File Camera.
Multithreading
Operations on camera objects are thread-safe, but other operations like listing cameras and connecting to cameras should be executed sequentially. Find out more in Multithreading.
Conclusion
This tutorial shows how to use the Zivid SDK to connect to, configure, and capture with the camera, and how to save the results to file.