Capture Tutorial
Introduction
This tutorial describes how to use the Zivid SDK to capture point clouds and 2D images.
For MATLAB, see Zivid Capture Tutorial for MATLAB.
Tip
If you prefer watching a video, our webinar Making 3D captures easy - A tour of Zivid Studio and Zivid SDK covers the same content as this Capture Tutorial.
Prerequisites
Install Zivid Software.
For Python: install zivid-python.
Initialize
Calling any API in the Zivid SDK requires initializing the Zivid application and keeping it alive while the program runs.
Note
Zivid::Application must be kept alive while operating the Zivid camera. This is essentially the Zivid driver.
Zivid::Application zivid;
var zivid = new Zivid.NET.Application();
app = zivid.Application()
Connect
Now we can connect to the camera.
auto camera = zivid.connectCamera();
var camera = zivid.ConnectCamera();
camera = app.connect_camera()
Specific Camera
When multiple cameras are connected to the same computer, you can select a specific camera in the code by using its serial number.
auto camera = zivid.connectCamera(Zivid::CameraInfo::SerialNumber{ "2020C0DE" });
var camera = zivid.ConnectCamera(new Zivid.NET.CameraInfo.SerialNumber("2020C0DE"));
camera = app.connect_camera(serial_number="2020C0DE")
Note
The serial number of your camera is shown in Zivid Studio.
You can also list all cameras connected to the computer and view their serial numbers:
auto cameras = zivid.cameras();
std::cout << "Found " << cameras.size() << " cameras" << std::endl;
for(auto &camera : cameras)
{
std::cout << "Camera Info: " << camera.info() << std::endl;
}
var cameras = zivid.Cameras;
Console.WriteLine("Number of cameras found: {0}", cameras.Count);
foreach(var camera in cameras)
{
Console.WriteLine("Camera Info: {0}", camera.Info);
}
cameras = app.cameras()
for camera in cameras:
print(f"Camera Info: {camera}")
File Camera
If you want to test the SDK without access to a physical camera, only a minor change is needed to make the samples work.
const auto fileCamera = std::string(ZIVID_SAMPLE_DATA_DIR) + "/FileCameraZividOne.zfc";
auto camera = zivid.createFileCamera(fileCamera);
var fileCamera = Environment.GetFolderPath(Environment.SpecialFolder.CommonApplicationData)
+ "/Zivid/FileCameraZividOne.zfc";
var camera = zivid.CreateFileCamera(fileCamera);
file_camera = get_sample_data_path() / "FileCameraZividOne.zfc"
camera = app.create_file_camera(file_camera)
Note
The point cloud quality you get from FileCameraZividOne.zfc is not representative of the performance of Zivid 3D cameras.
Configure
As with any camera, Zivid cameras have settings that can be adjusted. These can be configured manually or by using the Capture Assistant.
Capture Assistant
It can be difficult to know which settings to use. The Capture Assistant in the Zivid SDK can help you configure the camera settings.
const auto suggestSettingsParameters = Zivid::CaptureAssistant::SuggestSettingsParameters{
Zivid::CaptureAssistant::SuggestSettingsParameters::AmbientLightFrequency::none,
Zivid::CaptureAssistant::SuggestSettingsParameters::MaxCaptureTime{ std::chrono::milliseconds{ 1200 } }
};
std::cout << "Running Capture Assistant with parameters:\n" << suggestSettingsParameters << std::endl;
auto settings = Zivid::CaptureAssistant::suggestSettings(camera, suggestSettingsParameters);
var suggestSettingsParameters = new Zivid.NET.CaptureAssistant.SuggestSettingsParameters {
AmbientLightFrequency =
Zivid.NET.CaptureAssistant.SuggestSettingsParameters.AmbientLightFrequencyOption.none,
MaxCaptureTime = Duration.FromMilliseconds(1200)
};
Console.WriteLine("Running Capture Assistant with parameters:\n{0}", suggestSettingsParameters);
var settings = Zivid.NET.CaptureAssistant.Assistant.SuggestSettings(camera, suggestSettingsParameters);
suggest_settings_parameters = zivid.capture_assistant.SuggestSettingsParameters(
max_capture_time=datetime.timedelta(milliseconds=1200),
ambient_light_frequency=zivid.capture_assistant.SuggestSettingsParameters.AmbientLightFrequency.none,
)
print(f"Running Capture Assistant with parameters: {suggest_settings_parameters}")
settings = zivid.capture_assistant.suggest_settings(camera, suggest_settings_parameters)
Only two parameters need to be configured when using Capture Assistant:
Maximum Capture Time (in milliseconds).
The minimum capture time is 200 ms, which allows only a single acquisition.
The algorithm combines multiple acquisitions if the time budget allows it.
The algorithm tries to cover as much of the dynamic range in the scene as possible.
A maximum capture time of more than 1 second gives good results in most scenarios.
Ambient light compensation (see the sketch after this list).
Capture Assistant sets the exposure time to an integer multiple of the ambient light period.
60 Hz power grids are typically used in Japan, the Americas, Taiwan, South Korea, and the Philippines.
50 Hz power grids are common in most of the rest of the world.
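For example, to compensate for ambient light on a 50 Hz power grid, the frequency can be set to hz50 instead of none. A minimal Python sketch, reusing the camera object from the Connect step; the 800 ms time budget is only an illustrative value:
import datetime
import zivid

# Ask Capture Assistant for settings while compensating for 50 Hz ambient light flicker.
suggest_settings_parameters = zivid.capture_assistant.SuggestSettingsParameters(
    max_capture_time=datetime.timedelta(milliseconds=800),  # illustrative budget
    ambient_light_frequency=zivid.capture_assistant.SuggestSettingsParameters.AmbientLightFrequency.hz50,
)
settings = zivid.capture_assistant.suggest_settings(camera, suggest_settings_parameters)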
Manual configuration
Another option is to configure the settings manually. For more information about what each setting does, see Camera Settings. Note that Zivid Two provides a set of Standard Settings.
Single Acquisition
We can create settings for a single acquisition capture.
const auto settings =
Zivid::Settings{ Zivid::Settings::Experimental::Engine::phase,
Zivid::Settings::Acquisitions{ Zivid::Settings::Acquisition{
Zivid::Settings::Acquisition::Aperture{ 5.66 },
Zivid::Settings::Acquisition::ExposureTime{ std::chrono::microseconds{ 6500 } } } },
Zivid::Settings::Processing::Filters::Outlier::Removal::Enabled::yes,
Zivid::Settings::Processing::Filters::Outlier::Removal::Threshold{ 5.0 } };
var settings = new Zivid.NET.Settings {
Acquisitions = { new Zivid.NET.Settings.Acquisition { Aperture = 5.66,
ExposureTime =
Duration.FromMicroseconds(6500) } },
Processing = { Filters = { Outlier = { Removal = { Enabled = true, Threshold = 5.0 } } } }
};
settings = zivid.Settings()
settings.experimental.engine = "phase"
settings.acquisitions.append(zivid.Settings.Acquisition())
settings.acquisitions[0].aperture = 5.66
settings.acquisitions[0].exposure_time = datetime.timedelta(microseconds=6500)
settings.processing.filters.outlier.removal.enabled = True
settings.processing.filters.outlier.removal.threshold = 5.0
Multi Acquisition HDR
We can also create settings for an HDR capture.
Zivid::Settings settings;
for(const auto aperture : { 11.31, 5.66, 2.83 })
{
std::cout << "Adding acquisition with aperture = " << aperture << std::endl;
const auto acquisitionSettings = Zivid::Settings::Acquisition{
Zivid::Settings::Acquisition::Aperture{ aperture },
};
settings.acquisitions().emplaceBack(acquisitionSettings);
}
var settings = new Zivid.NET.Settings();
foreach(var aperture in new double[] { 9.57, 4.76, 2.59 })
{
Console.WriteLine("Adding acquisition with aperture = " + aperture);
var acquisitionSettings = new Zivid.NET.Settings.Acquisition { Aperture = aperture };
settings.Acquisitions.Add(acquisitionSettings);
}
settings = zivid.Settings(acquisitions=[zivid.Settings.Acquisition(aperture=fnum) for fnum in (11.31, 5.66, 2.83)])
Fully configured settings are shown below.
std::cout << "Configuring processing settings for capture:" << std::endl;
Zivid::Settings settings{
Zivid::Settings::Experimental::Engine::phase,
Zivid::Settings::Processing::Filters::Smoothing::Gaussian::Enabled::yes,
Zivid::Settings::Processing::Filters::Smoothing::Gaussian::Sigma{ 1.5 },
Zivid::Settings::Processing::Filters::Noise::Removal::Enabled::yes,
Zivid::Settings::Processing::Filters::Noise::Removal::Threshold{ 7.0 },
Zivid::Settings::Processing::Filters::Outlier::Removal::Enabled::yes,
Zivid::Settings::Processing::Filters::Outlier::Removal::Threshold{ 5.0 },
Zivid::Settings::Processing::Filters::Reflection::Removal::Enabled::yes,
Zivid::Settings::Processing::Filters::Reflection::Removal::Experimental::Mode::global,
Zivid::Settings::Processing::Filters::Experimental::ContrastDistortion::Correction::Enabled::yes,
Zivid::Settings::Processing::Filters::Experimental::ContrastDistortion::Correction::Strength{ 0.4 },
Zivid::Settings::Processing::Filters::Experimental::ContrastDistortion::Removal::Enabled::no,
Zivid::Settings::Processing::Filters::Experimental::ContrastDistortion::Removal::Threshold{ 0.5 },
Zivid::Settings::Processing::Color::Balance::Red{ 1.0 },
Zivid::Settings::Processing::Color::Balance::Green{ 1.0 },
Zivid::Settings::Processing::Color::Balance::Blue{ 1.0 },
Zivid::Settings::Processing::Color::Gamma{ 1.0 },
Zivid::Settings::Processing::Color::Experimental::Mode::automatic
};
std::cout << settings.processing() << std::endl;
std::cout << "Configuring base acquisition with settings same for all HDR acquisition:" << std::endl;
const auto baseAcquisition = Zivid::Settings::Acquisition{ Zivid::Settings::Acquisition::Brightness{ 1.8 } };
std::cout << baseAcquisition << std::endl;
std::cout << "Configuring acquisition settings different for all HDR acquisitions" << std::endl;
auto exposureValues = getExposureValues(camera);
const std::vector<double> aperture = std::get<0>(exposureValues);
const std::vector<double> gain = std::get<1>(exposureValues);
const std::vector<size_t> exposureTime = std::get<2>(exposureValues);
for(size_t i = 0; i < aperture.size(); ++i)
{
std::cout << "Acquisition " << i + 1 << ":" << std::endl;
std::cout << " Exposure Time: " << exposureTime.at(i) << std::endl;
std::cout << " Aperture: " << aperture.at(i) << std::endl;
std::cout << " Gain: " << gain.at(i) << std::endl;
const auto acquisitionSettings = baseAcquisition.copyWith(
Zivid::Settings::Acquisition::Aperture{ aperture.at(i) },
Zivid::Settings::Acquisition::Gain{ gain.at(i) },
Zivid::Settings::Acquisition::ExposureTime{ std::chrono::microseconds{ exposureTime.at(i) } });
settings.acquisitions().emplaceBack(acquisitionSettings);
}
Console.WriteLine("Configuring processing settings for capture:");
var settings = new Zivid.NET.Settings() {
Experimental = { Engine = Zivid.NET.Settings.ExperimentalGroup.EngineOption.Phase },
Processing = { Filters = { Smoothing = { Gaussian = { Enabled = true, Sigma = 1.5 } },
Noise = { Removal = { Enabled = true, Threshold = 7.0 } },
Outlier = { Removal = { Enabled = true, Threshold = 5.0 } },
Reflection = { Removal = { Enabled = true, Experimental = { Mode = ReflectionFilterModeOption.Global} } },
Experimental = { ContrastDistortion = { Correction = { Enabled = true,
Strength = 0.4 },
Removal = { Enabled = true,
Threshold = 0.5 } } } },
Color = { Balance = { Red = 1.0, Green = 1.0, Blue = 1.0 },
Gamma = 1.0,
Experimental = { Mode = ColorModeOption.Automatic } } }
};
Console.WriteLine(settings.Processing);
Console.WriteLine("Configuring base acquisition with settings same for all HDR acquisitions:");
var baseAcquisition = new Zivid.NET.Settings.Acquisition { Brightness = 1.8 };
Console.WriteLine(baseAcquisition);
Console.WriteLine("Configuring acquisition settings different for all HDR acquisitions:");
Tuple<double[], int[], double[]> exposureValues = GetExposureValues(camera);
double[] aperture = exposureValues.Item1;
int[] exposureTime = exposureValues.Item2;
double[] gain = exposureValues.Item3;
for(int i = 0; i < aperture.Length; i++)
{
Console.WriteLine("Acquisition {0}:", i + 1);
Console.WriteLine(" Exposure Time: {0}", exposureTime[i]);
Console.WriteLine(" Aperture: {0}", aperture[i]);
Console.WriteLine(" Gain: {0}", gain[i]);
var acquisitionSettings = baseAcquisition.CopyWith(s =>
{
s.Aperture = aperture[i];
s.ExposureTime =
Duration.FromMicroseconds(exposureTime[i]);
s.Gain = gain[i];
});
settings.Acquisitions.Add(acquisitionSettings);
}
print("Configuring processing settings for capture:")
settings = zivid.Settings()
settings.experimental.engine = "phase"
filters = settings.processing.filters
filters.smoothing.gaussian.enabled = True
filters.smoothing.gaussian.sigma = 1.5
filters.noise.removal.enabled = True
filters.noise.removal.threshold = 7.0
filters.outlier.removal.enabled = True
filters.outlier.removal.threshold = 5.0
filters.reflection.removal.enabled = True
filters.reflection.removal.experimental.mode = "global"
filters.experimental.contrast_distortion.correction.enabled = True
filters.experimental.contrast_distortion.correction.strength = 0.4
filters.experimental.contrast_distortion.removal.enabled = False
filters.experimental.contrast_distortion.removal.threshold = 0.5
color = settings.processing.color
color.balance.red = 1.0
color.balance.blue = 1.0
color.balance.green = 1.0
color.gamma = 1.0
settings.processing.color.experimental.mode = "automatic"
print(settings.processing)
print("Configuring acquisition settings different for all HDR acquisitions")
exposure_values = _get_exposure_values(camera)
for (aperture, gain, exposure_time) in exposure_values:
settings.acquisitions.append(
zivid.Settings.Acquisition(
aperture=aperture,
exposure_time=datetime.timedelta(microseconds=exposure_time),
brightness=1.8,
gain=gain,
)
)
2D Settings
It is also possible to capture only a 2D image, which is faster than a 3D capture. 2D settings are configured as shown below.
const auto settings2D =
Zivid::Settings2D{ Zivid::Settings2D::Acquisitions{ Zivid::Settings2D::Acquisition{
Zivid::Settings2D::Acquisition::ExposureTime{ std::chrono::microseconds{ 30000 } },
Zivid::Settings2D::Acquisition::Aperture{ 11.31 },
Zivid::Settings2D::Acquisition::Brightness{ 1.80 },
Zivid::Settings2D::Acquisition::Gain{ 2.0 } } },
Zivid::Settings2D::Processing::Color::Balance::Red{ 1 },
Zivid::Settings2D::Processing::Color::Balance::Green{ 1 },
Zivid::Settings2D::Processing::Color::Balance::Blue{ 1 } };
var settings2D = new Zivid.NET.Settings2D {
Acquisitions = { new Zivid.NET.Settings2D.Acquisition {
Aperture = 11.31, ExposureTime = Duration.FromMicroseconds(30000), Gain = 2.0, Brightness = 1.80
} },
Processing = { Color = { Balance = { Red = 1.0, Blue = 1.0, Green = 1.0 } } }
};
settings_2d = zivid.Settings2D()
settings_2d.acquisitions.append(zivid.Settings2D.Acquisition())
settings_2d.acquisitions[0].exposure_time = datetime.timedelta(microseconds=30000)
settings_2d.acquisitions[0].aperture = 11.31
settings_2d.acquisitions[0].brightness = 1.80
settings_2d.acquisitions[0].gain = 2.0
settings_2d.processing.color.balance.red = 1.0
settings_2d.processing.color.balance.green = 1.0
settings_2d.processing.color.balance.blue = 1.0
settings_2d.processing.color.gamma = 1.0
Load
Zivid Studio can store the current settings to .yml files. These can be read and applied via the API. You may find it easier to modify the settings in these (human-readable) YAML files using your preferred editor.
const auto settingsFile = "Settings.yml";
std::cout << "Loading settings from file: " << settingsFile << std::endl;
const auto settingsFromFile = Zivid::Settings(settingsFile);
var settingsFile = "Settings.yml";
Console.WriteLine("Loading settings from file: " + settingsFile);
var settingsFromFile = new Zivid.NET.Settings(settingsFile);
settings_file = "Settings.yml"
print(f"Loading settings from file: {settings_file}")
settings_from_file = zivid.Settings.load(settings_file)
Save
You can also save settings to a .yml file.
const auto settingsFile = "Settings.yml";
std::cout << "Saving settings to file: " << settingsFile << std::endl;
settings.save(settingsFile);
var settingsFile = "Settings.yml";
Console.WriteLine("Saving settings to file: " + settingsFile);
settings.Save(settingsFile);
settings_file = "Settings.yml"
print(f"Saving settings to file: {settings_file}")
settings.save(settings_file)
Caution
Zivid settings files must use the .yml file extension (not .yaml).
Capture
Now we can capture a 3D image. Whether a single acquisition or multiple acquisitions (HDR) are performed is determined by the number of acquisitions in settings.
const auto frame = camera.capture(settings);
using(var frame = camera.Capture(settings))
with camera.capture(settings) as frame:
Zivid::Frame contains the point cloud and color image (stored on compute device memory) as well as capture and camera information.
Zivid.NET.Frame contains the point cloud and color image (stored on compute device memory) as well as capture and camera information.
zivid.Frame contains the point cloud and color image (stored on compute device memory) as well as capture and camera information.
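As a small illustration, the frame contents can be inspected in Python. A sketch assuming zivid-python's info and camera_info accessors, reusing the frame from the capture above:
# Sketch: accessing the contents of a captured frame (zivid-python, accessor names assumed).
point_cloud = frame.point_cloud()           # handle to the point cloud on the compute device
print(f"Frame info: {frame.info}")          # capture information
print(f"Camera info: {frame.camera_info}")  # which camera produced the frame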
Load
Once saved, the frame can be loaded from a ZDF file.
const auto dataFile = std::string(ZIVID_SAMPLE_DATA_DIR) + "/Zivid3D.zdf";
std::cout << "Reading ZDF frame from file: " << dataFile << std::endl;
const auto frame = Zivid::Frame(dataFile);
var dataFile =
Environment.GetFolderPath(Environment.SpecialFolder.CommonApplicationData) + "/Zivid/Zivid3D.zdf";
Console.WriteLine("Reading ZDF frame from file: " + dataFile);
var frame = new Zivid.NET.Frame(dataFile);
data_file = get_sample_data_path() / "Zivid3D.zdf"
print(f"Reading point cloud from file: {data_file}")
frame = zivid.Frame(data_file)
Saving to a ZDF file is covered later in this tutorial.
Capture 2D Image
If we only want to capture a 2D image, which is faster than 3D, we can do so via the 2D API.
const auto frame2D = camera.capture(settings2D);
using(var frame2D = camera.Capture(settings2D))
with camera.capture(settings_2d) as frame_2d:
Caution
If your 2D capture settings use Brightness > 0, Zivid One+ cameras incur a time penalty when switching between capture modes (2D and 3D). See 2D and 3D switching limitation for more information.
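One way to avoid this penalty when 2D and 3D captures are interleaved is to run the 2D acquisition with the projector off, i.e. with Brightness set to 0. A minimal Python sketch; the exposure time and gain are only illustrative values, and with Brightness 0 you typically need a longer exposure or higher gain:
import datetime
import zivid

# Sketch: 2D acquisition with the projector off (Brightness = 0) to avoid the
# 2D/3D mode-switch penalty on Zivid One+ cameras. Reuses the camera object from above.
settings_2d = zivid.Settings2D()
settings_2d.acquisitions.append(zivid.Settings2D.Acquisition())
settings_2d.acquisitions[0].brightness = 0.0
settings_2d.acquisitions[0].exposure_time = datetime.timedelta(microseconds=30000)  # illustrative
settings_2d.acquisitions[0].gain = 2.0  # illustrative
with camera.capture(settings_2d) as frame_2d:
    frame_2d.image_rgba().save("Image2DNoProjector.png")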
Save
We can now save the results.
const auto dataFile = "Frame.zdf";
frame.save(dataFile);
var dataFile = "Frame.zdf";
frame.Save(dataFile);
data_file = "Frame.zdf"
frame.save(data_file)
Tip
You can open and view the Frame.zdf file in Zivid Studio.
导出
API检测要使用的格式。请参阅 点云 了解支持的格式列表。例如,我们可以将点云导出为.ply格式。
const auto dataFilePLY = "PointCloud.ply";
frame.save(dataFilePLY);
var dataFilePLY = "PointCloud.ply";
frame.Save(dataFilePLY);
data_file_ply = "PointCloud.ply"
frame.save(data_file_ply)
Save 2D Image
We can get a 2D color image from a 3D capture.
const auto image = frame.pointCloud().copyImageRGBA();
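In the Python API, one way to get the same color data is to copy it out of the point cloud as an RGBA array and save it with an image library. A minimal sketch, assuming zivid-python's copy_data and the third-party Pillow package (not part of the Zivid SDK):
from PIL import Image  # Pillow, assumed available; not part of the Zivid SDK

# Copy the RGBA color data of the 3D capture out of the point cloud and save it as a PNG.
rgba = frame.point_cloud().copy_data("rgba")  # numpy array, shape (height, width, 4), dtype uint8
Image.fromarray(rgba, mode="RGBA").save("ImageFrom3D.png")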
A 2D color image is also available from a separate 2D capture.
const auto image = frame2D.imageRGBA();
var image = frame2D.ImageRGBA();
image = frame_2d.image_rgba()
Then, we can save the 2D image.
const auto imageFile = "Image.png";
std::cout << "Saving 2D color image to file: " << imageFile << std::endl;
image.save(imageFile);
var imageFile = "Image.png";
Console.WriteLine("Saving 2D color image to file: {0}", imageFile);
image.Save(imageFile);
image_file = "Image.png"
print(f"Saving 2D color image to file: {image_file}")
image.save(image_file)
Conclusion
This tutorial shows how to use the Zivid SDK to connect to, configure, and capture with the camera, and how to save the results to file.