Capture Tutorial
Introduction
This tutorial describes how to use the Zivid SDK to capture point clouds and 2D images.
For MATLAB, see Zivid Capture Tutorial for MATLAB.
Tip
If you prefer watching a video tutorial, our webinar Making 3D captures easy - A tour of Zivid Studio and Zivid SDK covers the same content as the Capture Tutorial.
Prerequisites
Install the Zivid Software.
For Python: install zivid-python.
Initialize
Calling any API in the Zivid SDK requires initializing the Zivid application and keeping it alive while the program runs.
Note
Zivid::Application
must be kept alive while operating the Zivid camera. This is essentially the Zivid driver.
Zivid::Application zivid;
var zivid = new Zivid.NET.Application();
app = zivid.Application()
Connect
Now we can connect to the camera.
auto camera = zivid.connectCamera();
var camera = zivid.ConnectCamera();
camera = app.connect_camera()
Specific Camera
When multiple cameras are connected to the same computer, and you need to work with a specific one in your code, you can connect to it by its serial number.
auto camera = zivid.connectCamera(Zivid::CameraInfo::SerialNumber{ "2020C0DE" });
var camera = zivid.ConnectCamera(new Zivid.NET.CameraInfo.SerialNumber("2020C0DE"));
camera = app.connect_camera(serial_number="2020C0DE")
Note
The camera serial number is shown in Zivid Studio.
You can also list all cameras connected to the computer and view their serial numbers as follows:
auto cameras = zivid.cameras();
std::cout << "Found " << cameras.size() << " cameras" << std::endl;
for(auto &camera : cameras)
{
std::cout << "Camera Info: " << camera.info() << std::endl;
std::cout << "Camera State: " << camera.state() << std::endl;
}
var cameras = zivid.Cameras;
Console.WriteLine("Number of cameras found: {0}", cameras.Count);
foreach (var camera in cameras)
{
Console.WriteLine("Camera Info: {0}", camera.Info);
Console.WriteLine("Camera State: {0}", camera.State);
}
cameras = app.cameras()
for camera in cameras:
print(f"Camera Info: {camera.info}")
print(f"Camera State: {camera.state}")
File Camera
The file camera option allows you to experiment with the SDK without access to a physical camera. The file cameras can be found in Sample Data, where there are multiple file cameras to choose from. Each file camera demonstrates a use case within one of the main applications of the respective camera model. The example below shows how to create a file camera using the Zivid 2 M70 file camera from Sample Data.
const auto fileCamera =
userInput ? fileCameraPath : std::string(ZIVID_SAMPLE_DATA_DIR) + "/FileCameraZivid2M70.zfc";
fileCamera = Environment.GetFolderPath(Environment.SpecialFolder.CommonApplicationData) + "/Zivid/FileCameraZivid2M70.zfc";
default=get_sample_data_path() / "FileCameraZivid2M70.zfc",
auto camera = zivid.createFileCamera(fileCamera);
var camera = zivid.CreateFileCamera(fileCamera);
camera = app.create_file_camera(file_camera)
The acquisition settings should be initialized as shown below, but you are free to alter the processing settings.
const auto settings = Zivid::Settings{ Zivid::Settings::Acquisitions{ Zivid::Settings::Acquisition{} },
Zivid::Settings::Processing::Filters::Smoothing::Gaussian::Enabled::yes,
Zivid::Settings::Processing::Filters::Smoothing::Gaussian::Sigma{ 1.5 },
Zivid::Settings::Processing::Filters::Reflection::Removal::Enabled::yes,
Zivid::Settings::Processing::Filters::Reflection::Removal::Mode::global,
Zivid::Settings::Processing::Color::Balance::Red{ 1 },
Zivid::Settings::Processing::Color::Balance::Green{ 1 },
Zivid::Settings::Processing::Color::Balance::Blue{ 1 } };
var settings = new Zivid.NET.Settings
{
Acquisitions = { new Zivid.NET.Settings.Acquisition { } },
Processing = { Filters = { Smoothing = { Gaussian = { Enabled = true, Sigma = 1.5 } },
Reflection = { Removal = { Enabled = true, Mode = ReflectionFilterModeOption.Global} } },
Color = { Balance = { Red = 1.0, Green = 1.0, Blue = 1.0 } } }
};
settings = zivid.Settings()
settings.acquisitions.append(zivid.Settings.Acquisition())
settings.processing.filters.smoothing.gaussian.enabled = True
settings.processing.filters.smoothing.gaussian.sigma = 1.5
settings.processing.filters.reflection.removal.enabled = True
settings.processing.filters.reflection.removal.mode = "global"
settings.processing.color.balance.red = 1.0
settings.processing.color.balance.green = 1.0
settings.processing.color.balance.blue = 1.0
You can read more about the file camera option in File Camera.
Configure
As with all cameras, there are settings that can be configured.
Presets
The recommendation is to use the Presets available in Zivid Studio and the corresponding .yml files (see Load below). Alternatively, you can use our Capture Assistant, or you can configure the settings manually.
Capture Assistant
It can be difficult to know which settings to use. For this, the Zivid SDK provides the Capture Assistant, which can help you configure the camera settings.
const auto suggestSettingsParameters = Zivid::CaptureAssistant::SuggestSettingsParameters{
Zivid::CaptureAssistant::SuggestSettingsParameters::AmbientLightFrequency::none,
Zivid::CaptureAssistant::SuggestSettingsParameters::MaxCaptureTime{ std::chrono::milliseconds{ 1200 } }
};
std::cout << "Running Capture Assistant with parameters:\n" << suggestSettingsParameters << std::endl;
auto settings = Zivid::CaptureAssistant::suggestSettings(camera, suggestSettingsParameters);
var suggestSettingsParameters = new Zivid.NET.CaptureAssistant.SuggestSettingsParameters
{
AmbientLightFrequency =
Zivid.NET.CaptureAssistant.SuggestSettingsParameters.AmbientLightFrequencyOption.none,
MaxCaptureTime = Duration.FromMilliseconds(1200)
};
Console.WriteLine("Running Capture Assistant with parameters:\n{0}", suggestSettingsParameters);
var settings = Zivid.NET.CaptureAssistant.Assistant.SuggestSettings(camera, suggestSettingsParameters);
suggest_settings_parameters = zivid.capture_assistant.SuggestSettingsParameters(
max_capture_time=datetime.timedelta(milliseconds=1200),
ambient_light_frequency=zivid.capture_assistant.SuggestSettingsParameters.AmbientLightFrequency.none,
)
print(f"Running Capture Assistant with parameters: {suggest_settings_parameters}")
settings = zivid.capture_assistant.suggest_settings(camera, suggest_settings_parameters)
With the Capture Assistant there are only two parameters to configure (see the sketch after this list):
Maximum Capture Time (in milliseconds).
The minimum capture time is 200 ms. This setting allows only a single acquisition.
The algorithm will combine multiple acquisitions if the time budget allows.
The algorithm will attempt to cover as much of the dynamic range in the scene as possible.
A maximum capture time of more than 1 second gives good results in most scenarios.
Ambient light compensation
The Capture Assistant will set the exposure time to integer multiples of the ambient light period.
60 Hz power grids are used in Japan, the Americas, Taiwan, South Korea, and the Philippines.
50 Hz power grids are common in the rest of the world.
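The short Python sketch below illustrates both parameters together. It is not part of the original tutorial: it reuses the SuggestSettingsParameters API shown above, and the 1200 ms budget and 50 Hz grid are illustrative choices, not recommendations.
import datetime

import zivid

app = zivid.Application()
camera = app.connect_camera()

# Ask the Capture Assistant for settings, compensating for a 50 Hz power grid
# and allowing at most 1200 ms of total capture time.
suggest_settings_parameters = zivid.capture_assistant.SuggestSettingsParameters(
    max_capture_time=datetime.timedelta(milliseconds=1200),
    ambient_light_frequency=zivid.capture_assistant.SuggestSettingsParameters.AmbientLightFrequency.hz50,
)
settings = zivid.capture_assistant.suggest_settings(camera, suggest_settings_parameters)

# The suggested settings can be used directly in a capture.
frame = camera.capture(settings)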
Manual Configuration
Another option is to configure the settings manually. For more information about what each setting does, see Camera Settings. The next step after that is Capturing High Quality Point Clouds.
Single Acquisition
We can create settings for a single-acquisition capture.
const auto settings = Zivid::Settings(Zivid::Settings::Acquisitions{ Zivid::Settings::Acquisition{} });
var settings = new Zivid.NET.Settings
{
Acquisitions = { new Zivid.NET.Settings.Acquisition { } }
};
settings = zivid.Settings()
settings.acquisitions.append(zivid.Settings.Acquisition())
Multi Acquisition HDR
We can also create settings to be used in a multi-acquisition HDR capture.
Zivid::Settings settings;
for(const auto aperture : { 11.31, 5.66, 2.83 })
{
std::cout << "Adding acquisition with aperture = " << aperture << std::endl;
const auto acquisitionSettings = Zivid::Settings::Acquisition{
Zivid::Settings::Acquisition::Aperture{ aperture },
};
settings.acquisitions().emplaceBack(acquisitionSettings);
}
var settings = new Zivid.NET.Settings();
foreach (var aperture in new double[] { 9.57, 4.76, 2.59 })
{
Console.WriteLine("Adding acquisition with aperture = " + aperture);
var acquisitionSettings = new Zivid.NET.Settings.Acquisition { Aperture = aperture };
settings.Acquisitions.Add(acquisitionSettings);
}
settings = zivid.Settings(acquisitions=[zivid.Settings.Acquisition(aperture=fnum) for fnum in (11.31, 5.66, 2.83)])
A fully configured set of settings is shown below.
std::cout << "Configuring settings for capture:" << std::endl;
Zivid::Settings settings{
Zivid::Settings::Engine::phase,
Zivid::Settings::Sampling::Color::rgb,
Zivid::Settings::Sampling::Pixel::blueSubsample2x2,
Zivid::Settings::RegionOfInterest::Box::Enabled::yes,
Zivid::Settings::RegionOfInterest::Box::PointO{ 1000, 1000, 1000 },
Zivid::Settings::RegionOfInterest::Box::PointA{ 1000, -1000, 1000 },
Zivid::Settings::RegionOfInterest::Box::PointB{ -1000, 1000, 1000 },
Zivid::Settings::RegionOfInterest::Box::Extents{ -1000, 1000 },
Zivid::Settings::RegionOfInterest::Depth::Enabled::yes,
Zivid::Settings::RegionOfInterest::Depth::Range{ 200, 2000 },
Zivid::Settings::Processing::Filters::Smoothing::Gaussian::Enabled::yes,
Zivid::Settings::Processing::Filters::Smoothing::Gaussian::Sigma{ 1.5 },
Zivid::Settings::Processing::Filters::Noise::Removal::Enabled::yes,
Zivid::Settings::Processing::Filters::Noise::Removal::Threshold{ 7.0 },
Zivid::Settings::Processing::Filters::Noise::Suppression::Enabled::yes,
Zivid::Settings::Processing::Filters::Noise::Repair::Enabled::yes,
Zivid::Settings::Processing::Filters::Outlier::Removal::Enabled::yes,
Zivid::Settings::Processing::Filters::Outlier::Removal::Threshold{ 5.0 },
Zivid::Settings::Processing::Filters::Reflection::Removal::Enabled::yes,
Zivid::Settings::Processing::Filters::Reflection::Removal::Mode::global,
Zivid::Settings::Processing::Filters::Cluster::Removal::Enabled::yes,
Zivid::Settings::Processing::Filters::Cluster::Removal::MaxNeighborDistance{ 10 },
Zivid::Settings::Processing::Filters::Cluster::Removal::MinArea{ 100 },
Zivid::Settings::Processing::Filters::Experimental::ContrastDistortion::Correction::Enabled::yes,
Zivid::Settings::Processing::Filters::Experimental::ContrastDistortion::Correction::Strength{ 0.4 },
Zivid::Settings::Processing::Filters::Experimental::ContrastDistortion::Removal::Enabled::no,
Zivid::Settings::Processing::Filters::Experimental::ContrastDistortion::Removal::Threshold{ 0.5 },
Zivid::Settings::Processing::Filters::Hole::Repair::Enabled::yes,
Zivid::Settings::Processing::Filters::Hole::Repair::HoleSize{ 0.2 },
Zivid::Settings::Processing::Filters::Hole::Repair::Strictness{ 1 },
Zivid::Settings::Processing::Resampling::Mode::upsample2x2,
Zivid::Settings::Processing::Color::Balance::Red{ 1.0 },
Zivid::Settings::Processing::Color::Balance::Green{ 1.0 },
Zivid::Settings::Processing::Color::Balance::Blue{ 1.0 },
Zivid::Settings::Processing::Color::Gamma{ 1.0 },
Zivid::Settings::Processing::Color::Experimental::Mode::automatic
};
std::cout << settings << std::endl;
std::cout << "Configuring base acquisition with settings same for all HDR acquisition:" << std::endl;
const auto baseAcquisition = Zivid::Settings::Acquisition{};
std::cout << baseAcquisition << std::endl;
std::cout << "Configuring acquisition settings different for all HDR acquisitions" << std::endl;
auto exposureValues = getExposureValues(camera);
const std::vector<double> aperture = std::get<0>(exposureValues);
const std::vector<double> gain = std::get<1>(exposureValues);
const std::vector<std::chrono::microseconds> exposureTime = std::get<2>(exposureValues);
const std::vector<double> brightness = std::get<3>(exposureValues);
for(size_t i = 0; i < aperture.size(); ++i)
{
std::cout << "Acquisition " << i + 1 << ":" << std::endl;
std::cout << " Exposure Time: " << exposureTime.at(i).count() << std::endl;
std::cout << " Aperture: " << aperture.at(i) << std::endl;
std::cout << " Gain: " << gain.at(i) << std::endl;
std::cout << " Brightness: " << brightness.at(i) << std::endl;
const auto acquisitionSettings = baseAcquisition.copyWith(
Zivid::Settings::Acquisition::Aperture{ aperture.at(i) },
Zivid::Settings::Acquisition::Gain{ gain.at(i) },
Zivid::Settings::Acquisition::ExposureTime{ exposureTime.at(i) },
Zivid::Settings::Acquisition::Brightness{ brightness.at(i) });
settings.acquisitions().emplaceBack(acquisitionSettings);
}
Console.WriteLine("Configuring settings for capture:");
var settings = new Zivid.NET.Settings()
{
Engine = Zivid.NET.Settings.EngineOption.Phase,
Sampling = { Color = Zivid.NET.Settings.SamplingGroup.ColorOption.Rgb, Pixel = Zivid.NET.Settings.SamplingGroup.PixelOption.BlueSubsample2x2 },
RegionOfInterest = { Box = {
Enabled = true,
PointO = new Zivid.NET.PointXYZ{ x = 1000, y = 1000, z = 1000 },
PointA = new Zivid.NET.PointXYZ{ x = 1000, y = -1000, z = 1000 },
PointB = new Zivid.NET.PointXYZ{ x = -1000, y = 1000, z = 1000 },
Extents = new Zivid.NET.Range<double>(-1000, 1000),
},
Depth =
{
Enabled = true,
Range = new Zivid.NET.Range<double>(200, 2000),
}
},
Processing = { Filters = { Smoothing = { Gaussian = { Enabled = true, Sigma = 1.5 } },
Noise = { Removal = { Enabled = true, Threshold = 7.0 },
Suppression = { Enabled = true },
Repair ={ Enabled = true } },
Outlier = { Removal = { Enabled = true, Threshold = 5.0 } },
Reflection = { Removal = { Enabled = true, Mode = ReflectionFilterModeOption.Global} },
Cluster = { Removal = { Enabled = true, MaxNeighborDistance = 10, MinArea = 100} },
Hole = { Repair = { Enabled = true, HoleSize = 0.2, Strictness = 1 } },
Experimental = { ContrastDistortion = { Correction = { Enabled = true,
Strength = 0.4 },
Removal = { Enabled = true,
Threshold = 0.5 } } } },
Resampling = { Mode = Zivid.NET.Settings.ProcessingGroup.ResamplingGroup.ModeOption.Upsample2x2},
Color = { Balance = { Red = 1.0, Green = 1.0, Blue = 1.0 },
Gamma = 1.0,
Experimental = { Mode = ColorModeOption.Automatic } } }
};
Console.WriteLine(settings);
Console.WriteLine("Configuring base acquisition with settings same for all HDR acquisitions:");
var baseAcquisition = new Zivid.NET.Settings.Acquisition { };
Console.WriteLine(baseAcquisition);
Console.WriteLine("Configuring acquisition settings different for all HDR acquisitions:");
Tuple<double[], Duration[], double[], double[]> exposureValues = GetExposureValues(camera);
double[] aperture = exposureValues.Item1;
Duration[] exposureTime = exposureValues.Item2;
double[] gain = exposureValues.Item3;
double[] brightness = exposureValues.Item4;
for (int i = 0; i < aperture.Length; i++)
{
Console.WriteLine("Acquisition {0}:", i + 1);
Console.WriteLine(" Exposure Time: {0}", exposureTime[i].Microseconds);
Console.WriteLine(" Aperture: {0}", aperture[i]);
Console.WriteLine(" Gain: {0}", gain[i]);
Console.WriteLine(" Brightness: {0}", brightness[i]);
var acquisitionSettings = baseAcquisition.CopyWith(s =>
{
s.Aperture = aperture[i];
s.ExposureTime = exposureTime[i];
s.Gain = gain[i];
s.Brightness = brightness[i];
});
settings.Acquisitions.Add(acquisitionSettings);
}
print("Configuring settings for capture:")
settings = zivid.Settings()
settings.engine = zivid.Settings.Engine.phase
settings.sampling.color = zivid.Settings.Sampling.Color.rgb
settings.sampling.pixel = zivid.Settings.Sampling.Pixel.blueSubsample2x2
settings.region_of_interest.box.enabled = True
settings.region_of_interest.box.point_o = [1000, 1000, 1000]
settings.region_of_interest.box.point_a = [1000, -1000, 1000]
settings.region_of_interest.box.point_b = [-1000, 1000, 1000]
settings.region_of_interest.box.extents = [-1000, 1000]
settings.region_of_interest.depth.enabled = True
settings.region_of_interest.depth.range = [200, 2000]
filters = settings.processing.filters
filters.smoothing.gaussian.enabled = True
filters.smoothing.gaussian.sigma = 1.5
filters.noise.removal.enabled = True
filters.noise.removal.threshold = 7.0
filters.noise.suppression.enabled = True
filters.noise.repair.enabled = True
filters.outlier.removal.enabled = True
filters.outlier.removal.threshold = 5.0
filters.reflection.removal.enabled = True
filters.reflection.removal.mode = zivid.Settings.Processing.Filters.Reflection.Removal.Mode.global_
filters.cluster.removal.enabled = True
filters.cluster.removal.max_neighbor_distance = 10
filters.cluster.removal.min_area = 100
filters.experimental.contrast_distortion.correction.enabled = True
filters.experimental.contrast_distortion.correction.strength = 0.4
filters.experimental.contrast_distortion.removal.enabled = False
filters.experimental.contrast_distortion.removal.threshold = 0.5
filters.hole.repair.enabled = True
filters.hole.repair.hole_size = 0.2
filters.hole.repair.strictness = 1
resampling = settings.processing.resampling
resampling.mode = zivid.Settings.Processing.Resampling.Mode.upsample2x2
color = settings.processing.color
color.balance.red = 1.0
color.balance.blue = 1.0
color.balance.green = 1.0
color.gamma = 1.0
settings.processing.color.experimental.mode = zivid.Settings.Processing.Color.Experimental.Mode.automatic
print(settings)
print("Configuring acquisition settings different for all HDR acquisitions")
exposure_values = _get_exposure_values(camera)
for aperture, gain, exposure_time, brightness in exposure_values:
settings.acquisitions.append(
zivid.Settings.Acquisition(
aperture=aperture,
exposure_time=exposure_time,
brightness=brightness,
gain=gain,
)
)
2D Settings
It is also possible to capture only 2D images, which is faster than a 3D capture. 2D settings are configured as shown below.
const auto settings2D =
Zivid::Settings2D{ Zivid::Settings2D::Acquisitions{ Zivid::Settings2D::Acquisition{
Zivid::Settings2D::Acquisition::ExposureTime{ std::chrono::microseconds{ 30000 } },
Zivid::Settings2D::Acquisition::Aperture{ 11.31 },
Zivid::Settings2D::Acquisition::Brightness{ 1.80 },
Zivid::Settings2D::Acquisition::Gain{ 2.0 } } },
Zivid::Settings2D::Processing::Color::Balance::Red{ 1 },
Zivid::Settings2D::Processing::Color::Balance::Green{ 1 },
Zivid::Settings2D::Processing::Color::Balance::Blue{ 1 } };
var settings2D = new Zivid.NET.Settings2D
{
Acquisitions = { new Zivid.NET.Settings2D.Acquisition {
Aperture = 11.31, ExposureTime = Duration.FromMicroseconds(30000), Gain = 2.0, Brightness = 1.80
} },
Processing = { Color = { Balance = { Red = 1.0, Blue = 1.0, Green = 1.0 } } }
};
settings_2d = zivid.Settings2D()
settings_2d.acquisitions.append(zivid.Settings2D.Acquisition())
settings_2d.acquisitions[0].exposure_time = datetime.timedelta(microseconds=30000)
settings_2d.acquisitions[0].aperture = 11.31
settings_2d.acquisitions[0].brightness = 1.80
settings_2d.acquisitions[0].gain = 2.0
settings_2d.processing.color.balance.red = 1.0
settings_2d.processing.color.balance.green = 1.0
settings_2d.processing.color.balance.blue = 1.0
settings_2d.processing.color.gamma = 1.0
Load
Zivid Studio can store the current settings to a .yml file. These files can be read and applied via the API. You may also find it easier to modify the settings directly in these (human-readable) yml files with a text editor of your choice. Check out Presets for recommended .yml files tuned for your application.
const auto settingsFile = "Settings.yml";
std::cout << "Loading settings from file: " << settingsFile << std::endl;
const auto settingsFromFile = Zivid::Settings(settingsFile);
var settingsFile = "Settings.yml";
Console.WriteLine("Loading settings from file: " + settingsFile);
var settingsFromFile = new Zivid.NET.Settings(settingsFile);
settings_file = "Settings.yml"
print(f"Loading settings from file: {settings_file}")
settings_from_file = zivid.Settings.load(settings_file)
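The loaded settings can then be passed straight to a capture. This one-line Python follow-up is not part of the original snippet; it simply reuses the capture call shown in the Capture section below.
frame = camera.capture(settings_from_file)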
Save
You can also save settings to a .yml file.
const auto settingsFile = "Settings.yml";
std::cout << "Saving settings to file: " << settingsFile << std::endl;
settings.save(settingsFile);
var settingsFile = "Settings.yml";
Console.WriteLine("Saving settings to file: " + settingsFile);
settings.Save(settingsFile);
settings_file = "Settings.yml"
print(f"Saving settings to file: {settings_file}")
settings.save(settings_file)
Caution
Zivid settings files must use the .yml file extension (not .yaml).
Capture
Now we can capture a 3D image. Whether this is a single acquisition or a multi-acquisition (HDR) capture is determined by the number of acquisitions in settings.
const auto frame = camera.capture(settings);
using (var frame = camera.Capture(settings))
with camera.capture(settings) as frame:
Zivid::Frame
contains the point cloud and color image (stored on compute device memory) as well as capture and camera information.
Zivid.NET.Frame
contains the point cloud and color image (stored on compute device memory) as well as capture and camera information.
zivid.Frame
contains the point cloud and color image (stored on compute device memory) as well as capture and camera information.
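To make concrete what a frame holds, here is a small Python sketch that is not part of the original tutorial; it assumes zivid-python's point_cloud() and copy_data() APIs.
# Access the point cloud held by the frame and copy its geometry to a numpy array.
point_cloud = frame.point_cloud()
xyz = point_cloud.copy_data("xyz")  # (height, width, 3) float32 array, in mm
print(f"Point cloud resolution: {point_cloud.width} x {point_cloud.height}")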
Load
Once saved, a frame can be loaded from a ZDF file.
const auto dataFile = std::string(ZIVID_SAMPLE_DATA_DIR) + "/Zivid3D.zdf";
std::cout << "Reading ZDF frame from file: " << dataFile << std::endl;
const auto frame = Zivid::Frame(dataFile);
var dataFile =
Environment.GetFolderPath(Environment.SpecialFolder.CommonApplicationData) + "/Zivid/Zivid3D.zdf";
Console.WriteLine("Reading ZDF frame from file: " + dataFile);
var frame = new Zivid.NET.Frame(dataFile);
data_file = get_sample_data_path() / "Zivid3D.zdf"
print(f"Reading point cloud from file: {data_file}")
frame = zivid.Frame(data_file)
Saving to a ZDF file is covered later in this tutorial.
Capture 2D
If we only want to capture a 2D image, which is faster than a 3D capture, we can do so via the 2D API.
const auto frame2D = camera.capture(settings2D);
using (var frame2D = camera.Capture(settings2D))
with camera.capture(settings_2d) as frame_2d:
Save
We can now save our results.
const auto dataFile = "Frame.zdf";
frame.save(dataFile);
var dataFile = "Frame.zdf";
frame.Save(dataFile);
data_file = "Frame.zdf"
frame.save(data_file)
Tip
You can open and view the Frame.zdf file in Zivid Studio.
Export
The API detects which format to use. See Point Cloud for a list of supported formats. For example, we can export the point cloud to the .ply format.
const auto dataFilePLY = "PointCloud.ply";
frame.save(dataFilePLY);
var dataFilePLY = "PointCloud.ply";
frame.Save(dataFilePLY);
data_file_ply = "PointCloud.ply"
frame.save(data_file_ply)
Save 2D
We can get a 2D color image from a 3D capture.
const auto image = pointCloud.copyImageRGBA();
var image = pointCloud.CopyImageRGBA();
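The snippet above covers C++ and C#. The Python sketch below is not part of the original tutorial; it assumes zivid-python's copy_data("rgba") for getting the color data from the 3D capture, and writing the array to PNG with Pillow is an illustrative choice, not part of the Zivid API.
from PIL import Image  # Pillow, used here only to write the PNG

rgba = frame.point_cloud().copy_data("rgba")  # (height, width, 4) uint8 numpy array
Image.fromarray(rgba, mode="RGBA").save("ImageFrom3D.png")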
A 2D color image is also available from a separate 2D capture.
const auto image = frame2D.imageRGBA();
var image = frame2D.ImageRGBA();
image = frame_2d.image_rgba()
Then, we can save the 2D image.
const auto imageFile = "Image.png";
std::cout << "Saving 2D color image to file: " << imageFile << std::endl;
image.save(imageFile);
var imageFile = "Image.png";
Console.WriteLine("Saving 2D color image to file: {0}", imageFile);
image.Save(imageFile);
image_file = "Image.png"
print(f"Saving 2D color image to file: {image_file}")
image.save(image_file)
Multithreading
Operations on camera objects are thread-safe, but other operations, such as listing cameras and connecting to cameras, should be executed sequentially. You can find more information in Multithreading.
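A minimal Python sketch of this pattern, not taken from the tutorial: the serial numbers are placeholders, cameras are connected one after another, and only the per-camera capture work runs in parallel threads.
import threading

import zivid

app = zivid.Application()

# Connecting must happen sequentially; the serial numbers below are placeholders.
cameras = [app.connect_camera(serial_number=sn) for sn in ("2020C0DE", "2021C0DE")]

def capture_and_save(camera, file_name):
    # Operations on a connected camera object are thread-safe.
    settings = zivid.Settings(acquisitions=[zivid.Settings.Acquisition()])
    frame = camera.capture(settings)
    frame.save(file_name)

threads = [
    threading.Thread(target=capture_and_save, args=(camera, f"Frame_{i}.zdf"))
    for i, camera in enumerate(cameras)
]
for thread in threads:
    thread.start()
for thread in threads:
    thread.join()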
Conclusion
This tutorial showed how to use the Zivid SDK to connect to, configure, capture with, and save from the camera.