Capture Tutorial

Introduction

This tutorial describes how to capture point clouds and 2D images using the Zivid SDK.

Prerequisites

Initialize

To call any API in the Zivid SDK, the Zivid application must be initialized and kept alive for as long as the program runs.

Note

Zivid::Application must stay alive while you operate the Zivid camera. It is essentially the Zivid driver.

C++:

Zivid::Application zivid;
C#:

var zivid = new Zivid.NET.Application();
Python:

app = zivid.Application()
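
For example, a minimal Python sketch of this lifetime requirement (structure only, not a complete sample) keeps the application object in scope for the duration of all camera work:

import zivid

def main() -> None:
    app = zivid.Application()  # must stay alive while the camera is in use
    camera = app.connect_camera()
    # ... capture and process here ...
    # `app` (and `camera`) go out of scope only after all camera work is done

if __name__ == "__main__":
    main()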

Connect

Now we can connect to the camera.

C++:

auto camera = zivid.connectCamera();
C#:

var camera = zivid.ConnectCamera();
Python:

camera = app.connect_camera()

Specific Camera

Sometimes several cameras are connected to the same computer, but the code needs to work with a specific one. In that case, simply provide the serial number of the camera you want.

C++:

auto camera = zivid.connectCamera(Zivid::CameraInfo::SerialNumber{ "2020C0DE" });

C#:

var camera = zivid.ConnectCamera(new Zivid.NET.CameraInfo.SerialNumber("2020C0DE"));

Python:

camera = app.connect_camera(serial_number="2020C0DE")

Note

The camera's serial number is shown in Zivid Studio.

You can also list all cameras connected to the computer and check their serial numbers.

C++:

auto cameras = zivid.cameras();
std::cout << "Found " << cameras.size() << " cameras" << std::endl;
for(auto &camera : cameras)
{
    std::cout << "Camera Info: " << camera.info() << std::endl;
    std::cout << "Camera State: " << camera.state() << std::endl;
}
C#:

var cameras = zivid.Cameras;
Console.WriteLine("Number of cameras found: {0}", cameras.Count);
foreach (var camera in cameras)
{
    Console.WriteLine("Camera Info: {0}", camera.Info);
    Console.WriteLine("Camera State: {0}", camera.State);
}
Python:

cameras = app.cameras()
for camera in cameras:
    print(f"Camera Info:  {camera.info}")
    print(f"Camera State: {camera.state}")

File Camera

The file camera option lets you test the SDK without access to a physical camera. A file camera for each camera model is available in Sample Data. Each file camera demonstrates a use case from one of the main applications of the corresponding camera model. The example below shows how to create a file camera using the Zivid 2+ MR60 file camera from Sample Data.

C++:

const auto fileCamera =
    userInput ? fileCameraPath : std::string(ZIVID_SAMPLE_DATA_DIR) + "/FileCameraZivid2PlusMR60.zfc";
C#:

fileCamera = Environment.GetFolderPath(Environment.SpecialFolder.CommonApplicationData) + "/Zivid/FileCameraZivid2PlusMR60.zfc";
Python:

default=get_sample_data_path() / "FileCameraZivid2PlusMR60.zfc",

C++:

auto camera = zivid.createFileCamera(fileCamera);
C#:

var camera = zivid.CreateFileCamera(fileCamera);
Python:

camera = app.create_file_camera(file_camera)

The acquisition settings must be initialized as shown below, but the processing settings can be changed freely.

C++:

Zivid::Settings settings{
    Zivid::Settings::Acquisitions{ Zivid::Settings::Acquisition{} },
    Zivid::Settings::Processing::Filters::Smoothing::Gaussian::Enabled::yes,
    Zivid::Settings::Processing::Filters::Smoothing::Gaussian::Sigma{ 1.5 },
    Zivid::Settings::Processing::Filters::Reflection::Removal::Enabled::yes,
    Zivid::Settings::Processing::Filters::Reflection::Removal::Mode::global,
};
Zivid::Settings2D settings2D{ Zivid::Settings2D::Acquisitions{ Zivid::Settings2D::Acquisition{} },
                              Zivid::Settings2D::Processing::Color::Balance::Red{ 1 },
                              Zivid::Settings2D::Processing::Color::Balance::Green{ 1 },
                              Zivid::Settings2D::Processing::Color::Balance::Blue{ 1 } };

settings.color() = Zivid::Settings::Color{ settings2D };
C#:

var settings2D = new Zivid.NET.Settings2D
{
    Acquisitions = { new Zivid.NET.Settings2D.Acquisition { } },
    Processing =
    {
        Color =
        {
            Balance = { Red = 1.0, Green = 1.0, Blue = 1.0 }
        }
    }
};
var settings = new Zivid.NET.Settings
{
    Acquisitions = { new Zivid.NET.Settings.Acquisition { } },
    Processing =
    {
        Filters =
        {
            Smoothing =
            {
                Gaussian = { Enabled = true, Sigma = 1.5 }
            },
            Reflection =
            {
                Removal = { Enabled = true, Mode = ReflectionFilterModeOption.Global}
            }
        }
    }
};
settings.Color = settings2D;
Python:

settings = zivid.Settings()
settings.acquisitions.append(zivid.Settings.Acquisition())
settings.processing.filters.smoothing.gaussian.enabled = True
settings.processing.filters.smoothing.gaussian.sigma = 1.5
settings.processing.filters.reflection.removal.enabled = True
settings.processing.filters.reflection.removal.mode = "global"

settings_2d = zivid.Settings2D()
settings_2d.acquisitions.append(zivid.Settings2D.Acquisition())
settings_2d.processing.color.balance.blue = 1.0
settings_2d.processing.color.balance.green = 1.0
settings_2d.processing.color.balance.red = 1.0

settings.color = settings_2d

You can read more about the file camera option in File Camera.
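
Putting the pieces together, a compact file camera capture in Python could look like the sketch below (the file path is an assumption; point it to wherever Sample Data is installed on your system):

import zivid

app = zivid.Application()
camera = app.create_file_camera("FileCameraZivid2PlusMR60.zfc")  # assumed path to the sample file camera

# Default acquisitions; processing settings may be adjusted freely for a file camera.
settings = zivid.Settings(
    acquisitions=[zivid.Settings.Acquisition()],
    color=zivid.Settings2D(acquisitions=[zivid.Settings2D.Acquisition()]),
)

frame = camera.capture_2d_3d(settings)
frame.save("FileCameraFrame.zdf")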

Configure

As with any camera, there are settings that can be configured.

Presets

The recommendation is to use the Presets available in Zivid Studio, which are also provided as .yml files (see below). Presets are designed to work well for most cases right away, making them a great starting point. If needed, you can easily fine-tune the settings for better results. You can edit the YAML files in any text editor or code the settings manually.

Load

You can export camera settings to .yml files from Zivid Studio. These can be loaded and applied in the API.

C++:

const auto settingsFile = "Settings.yml";
std::cout << "Loading settings from file: " << settingsFile << std::endl;
const auto settingsFromFile = Zivid::Settings(settingsFile);
C#:

var settingsFile = "Settings.yml";
Console.WriteLine("Loading settings from file: " + settingsFile);
var settingsFromFile = new Zivid.NET.Settings(settingsFile);
Python:

settings_file = "Settings.yml"
print(f"Loading settings from file: {settings_file}")
settings_from_file = zivid.Settings.load(settings_file)
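
The loaded settings can then be passed straight to a capture call, for example in Python (assuming a connected camera and that the .yml contains both 3D and color acquisitions):

frame = camera.capture_2d_3d(settings_from_file)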

Save

Settings can be saved in the form of a .yml file.

C++:

const auto settingsFile = "Settings.yml";
std::cout << "Saving settings to file: " << settingsFile << std::endl;
settings.save(settingsFile);
C#:

var settingsFile = "Settings.yml";
Console.WriteLine("Saving settings to file: " + settingsFile);
settings.Save(settingsFile);
Python:

settings_file = "Settings.yml"
print(f"Saving settings to file: {settings_file}")
settings.save(settings_file)

Manual configuration

Another option is to configure the settings manually. For more information about what each setting does, see Camera Settings. The next step after that is Capturing High Quality Point Clouds.

Single 2D and 3D Acquisition - Default settings

We can create settings for a single-acquisition capture.

C++:

const auto settings = Zivid::Settings(
    Zivid::Settings::Acquisitions{ Zivid::Settings::Acquisition{} },
    Zivid::Settings::Color(
        Zivid::Settings2D(Zivid::Settings2D::Acquisitions{ Zivid::Settings2D::Acquisition{} })));
C#:

var settings = new Zivid.NET.Settings
{
    Acquisitions = { new Zivid.NET.Settings.Acquisition { } },
    Color = new Zivid.NET.Settings2D { Acquisitions = { new Zivid.NET.Settings2D.Acquisition { } } }
};
Python:

settings = zivid.Settings(
    acquisitions=[zivid.Settings.Acquisition()],
    color=zivid.Settings2D(acquisitions=[zivid.Settings2D.Acquisition()]),
)

Multi Acquisition HDR

We can also create settings to be used for a multi-acquisition HDR capture.

C++:

Zivid::Settings settings;
for(const auto aperture : { 9.57, 4.76, 2.59 })
{
    std::cout << "Adding acquisition with aperture = " << aperture << std::endl;
    const auto acquisitionSettings = Zivid::Settings::Acquisition{
        Zivid::Settings::Acquisition::Aperture{ aperture },
    };
    settings.acquisitions().emplaceBack(acquisitionSettings);
}
C#:

var settings = new Zivid.NET.Settings();
foreach (var aperture in new double[] { 9.57, 4.76, 2.59 })
{
    Console.WriteLine("Adding acquisition with aperture = " + aperture);
    var acquisitionSettings = new Zivid.NET.Settings.Acquisition { Aperture = aperture };
    settings.Acquisitions.Add(acquisitionSettings);
}
Python:

settings = zivid.Settings(acquisitions=[zivid.Settings.Acquisition(aperture=fnum) for fnum in (11.31, 5.66, 2.83)])

Fully configured settings are demonstrated below.

C++:

std::cout << "Configuring settings for capture:" << std::endl;
Zivid::Settings2D settings2D{
    Zivid::Settings2D::Sampling::Color::rgb,
    Zivid::Settings2D::Sampling::Pixel::all,

    Zivid::Settings2D::Processing::Color::Balance::Blue{ 1.0 },
    Zivid::Settings2D::Processing::Color::Balance::Green{ 1.0 },
    Zivid::Settings2D::Processing::Color::Balance::Red{ 1.0 },
    Zivid::Settings2D::Processing::Color::Gamma{ 1.0 },

    Zivid::Settings2D::Processing::Color::Experimental::Mode::automatic,
};

Zivid::Settings settings{
    Zivid::Settings::Color{ settings2D },

    Zivid::Settings::Engine::phase,

    Zivid::Settings::RegionOfInterest::Box::Enabled::yes,
    Zivid::Settings::RegionOfInterest::Box::PointO{ 1000, 1000, 1000 },
    Zivid::Settings::RegionOfInterest::Box::PointA{ 1000, -1000, 1000 },
    Zivid::Settings::RegionOfInterest::Box::PointB{ -1000, 1000, 1000 },
    Zivid::Settings::RegionOfInterest::Box::Extents{ -1000, 1000 },

    Zivid::Settings::RegionOfInterest::Depth::Enabled::yes,
    Zivid::Settings::RegionOfInterest::Depth::Range{ 200, 2000 },

    Zivid::Settings::Processing::Filters::Cluster::Removal::Enabled::yes,
    Zivid::Settings::Processing::Filters::Cluster::Removal::MaxNeighborDistance{ 10 },
    Zivid::Settings::Processing::Filters::Cluster::Removal::MinArea{ 100 },

    Zivid::Settings::Processing::Filters::Hole::Repair::Enabled::yes,
    Zivid::Settings::Processing::Filters::Hole::Repair::HoleSize{ 0.2 },
    Zivid::Settings::Processing::Filters::Hole::Repair::Strictness{ 1 },

    Zivid::Settings::Processing::Filters::Noise::Removal::Enabled::yes,
    Zivid::Settings::Processing::Filters::Noise::Removal::Threshold{ 7.0 },

    Zivid::Settings::Processing::Filters::Noise::Suppression::Enabled::yes,
    Zivid::Settings::Processing::Filters::Noise::Repair::Enabled::yes,

    Zivid::Settings::Processing::Filters::Outlier::Removal::Enabled::yes,
    Zivid::Settings::Processing::Filters::Outlier::Removal::Threshold{ 5.0 },

    Zivid::Settings::Processing::Filters::Reflection::Removal::Enabled::yes,
    Zivid::Settings::Processing::Filters::Reflection::Removal::Mode::global,

    Zivid::Settings::Processing::Filters::Smoothing::Gaussian::Enabled::yes,
    Zivid::Settings::Processing::Filters::Smoothing::Gaussian::Sigma{ 1.5 },

    Zivid::Settings::Processing::Filters::Experimental::ContrastDistortion::Correction::Enabled::yes,
    Zivid::Settings::Processing::Filters::Experimental::ContrastDistortion::Correction::Strength{ 0.4 },

    Zivid::Settings::Processing::Filters::Experimental::ContrastDistortion::Removal::Enabled::no,
    Zivid::Settings::Processing::Filters::Experimental::ContrastDistortion::Removal::Threshold{ 0.5 },

    Zivid::Settings::Processing::Resampling::Mode::upsample2x2,

    Zivid::Settings::Diagnostics::Enabled::no,
};

setSamplingPixel(settings, camera);
std::cout << settings << std::endl;
std::cout << "Configuring base acquisition with settings same for all HDR acquisition:" << std::endl;
const auto baseAcquisition = Zivid::Settings::Acquisition{};
std::cout << baseAcquisition << std::endl;
const auto baseAcquisition2D = Zivid::Settings2D::Acquisition{};

std::cout << "Configuring acquisition settings different for all HDR acquisitions" << std::endl;
auto exposureValues = getExposureValues(camera);
const std::vector<double> aperture = std::get<0>(exposureValues);
const std::vector<double> gain = std::get<1>(exposureValues);
const std::vector<std::chrono::microseconds> exposureTime = std::get<2>(exposureValues);
const std::vector<double> brightness = std::get<3>(exposureValues);
for(size_t i = 0; i < aperture.size(); ++i)
{
    std::cout << "Acquisition " << i + 1 << ":" << std::endl;
    std::cout << "  Exposure Time: " << exposureTime.at(i).count() << std::endl;
    std::cout << "  Aperture: " << aperture.at(i) << std::endl;
    std::cout << "  Gain: " << gain.at(i) << std::endl;
    std::cout << "  Brightness: " << brightness.at(i) << std::endl;
    const auto acquisitionSettings = baseAcquisition.copyWith(
        Zivid::Settings::Acquisition::Aperture{ aperture.at(i) },
        Zivid::Settings::Acquisition::Gain{ gain.at(i) },
        Zivid::Settings::Acquisition::ExposureTime{ exposureTime.at(i) },
        Zivid::Settings::Acquisition::Brightness{ brightness.at(i) });
    settings.acquisitions().emplaceBack(acquisitionSettings);
}
const auto acquisitionSettings2D = baseAcquisition2D.copyWith(
    Zivid::Settings2D::Acquisition::Aperture{ 2.83 },
    Zivid::Settings2D::Acquisition::ExposureTime{ microseconds{ 10000 } },
    Zivid::Settings2D::Acquisition::Brightness{ 1.8 },
    Zivid::Settings2D::Acquisition::Gain{ 1.0 });
settings.color().value().acquisitions().emplaceBack(acquisitionSettings2D);
C#:

Console.WriteLine("Configuring settings for capture:");
var settings2D = new Zivid.NET.Settings2D()
{
    Sampling =
    {
        Color = Zivid.NET.Settings2D.SamplingGroup.ColorOption.Rgb,
        Pixel = Zivid.NET.Settings2D.SamplingGroup.PixelOption.All,
    },
    Processing =
    {
        Color =
        {
            Balance =
            {
                Blue = 1.0,
                Green = 1.0,
                Red = 1.0,
            },
            Gamma = 1.0,
            Experimental = { Mode = Zivid.NET.Settings2D.ProcessingGroup.ColorGroup.ExperimentalGroup.ModeOption.Automatic },
        },
    },
};
var settings = new Zivid.NET.Settings()
{
    Engine = Zivid.NET.Settings.EngineOption.Phase,

    RegionOfInterest =
    {
        Box = {
            Enabled = true,
            PointO = new Zivid.NET.PointXYZ{ x = 1000, y = 1000, z = 1000 },
            PointA = new Zivid.NET.PointXYZ{ x = 1000, y = -1000, z = 1000 },
            PointB = new Zivid.NET.PointXYZ{ x = -1000, y = 1000, z = 1000 },
            Extents = new Zivid.NET.Range<double>(-1000, 1000),
        },
        Depth =
        {
            Enabled = true,
            Range = new Zivid.NET.Range<double>(200, 2000),
        },
    },
    Processing =
    {
        Filters =
        {
            Cluster =
            {
                Removal = { Enabled = true, MaxNeighborDistance = 10, MinArea = 100}
            },
            Hole =
            {
                Repair = { Enabled = true, HoleSize = 0.2, Strictness = 1 },
            },
            Noise =
            {
                Removal = { Enabled = true, Threshold = 7.0 },
                Suppression = { Enabled = true },
                Repair = { Enabled = true },
            },
            Outlier =
            {
                Removal = { Enabled = true, Threshold = 5.0 },
            },
            Reflection =
            {
                Removal = { Enabled = true, Mode = ReflectionFilterModeOption.Global },
            },
            Smoothing =
            {
                Gaussian = { Enabled = true, Sigma = 1.5 },
            },
            Experimental =
            {
                ContrastDistortion =
                {
                    Correction = { Enabled = true, Strength = 0.4 },
                    Removal = { Enabled = true, Threshold = 0.5 },
                },
            },
        },
        Resampling = { Mode = Zivid.NET.Settings.ProcessingGroup.ResamplingGroup.ModeOption.Upsample2x2 },
    },
    Diagnostics = { Enabled = false },
};

settings.Color = settings2D;

SetSamplingPixel(ref settings, camera);
Console.WriteLine(settings);
Console.WriteLine("Configuring base acquisition with settings same for all HDR acquisitions:");
var baseAcquisition = new Zivid.NET.Settings.Acquisition { };
Console.WriteLine(baseAcquisition);
var baseAcquisition2D = new Zivid.NET.Settings2D.Acquisition { };

Console.WriteLine("Configuring acquisition settings different for all HDR acquisitions:");
Tuple<double[], Duration[], double[], double[]> exposureValues = GetExposureValues(camera);
double[] aperture = exposureValues.Item1;
Duration[] exposureTime = exposureValues.Item2;
double[] gain = exposureValues.Item3;
double[] brightness = exposureValues.Item4;
for (int i = 0; i < aperture.Length; i++)
{
    Console.WriteLine("Acquisition {0}:", i + 1);
    Console.WriteLine("  Exposure Time: {0}", exposureTime[i].Microseconds);
    Console.WriteLine("  Aperture: {0}", aperture[i]);
    Console.WriteLine("  Gain: {0}", gain[i]);
    Console.WriteLine("  Brightness: {0}", brightness[i]);
    var acquisitionSettings = baseAcquisition.CopyWith(s =>
    {
        s.Aperture = aperture[i];
        s.ExposureTime = exposureTime[i];
        s.Gain = gain[i];
        s.Brightness = brightness[i];
    });
    settings.Acquisitions.Add(acquisitionSettings);
}
var acquisitionSettings2D = baseAcquisition2D.CopyWith(s =>
{
    s.Aperture = 2.83;
    s.ExposureTime = Duration.FromMicroseconds(10000);
    s.Gain = 1.0;
    s.Brightness = 1.8;
});
settings.Color.Acquisitions.Add(acquisitionSettings2D);
Python:

print("Configuring settings for capture:")
settings_2d = zivid.Settings2D()

settings_2d.sampling.color = zivid.Settings2D.Sampling.Color.rgb
settings_2d.sampling.pixel = zivid.Settings2D.Sampling.Pixel.all

settings_2d.processing.color.balance.red = 1.0
settings_2d.processing.color.balance.blue = 1.0
settings_2d.processing.color.balance.green = 1.0
settings_2d.processing.color.gamma = 1.0

settings_2d.processing.color.experimental.mode = zivid.Settings2D.Processing.Color.Experimental.Mode.automatic

settings = zivid.Settings()
settings.engine = zivid.Settings.Engine.phase

settings.region_of_interest.box.enabled = True
settings.region_of_interest.box.point_o = [1000, 1000, 1000]
settings.region_of_interest.box.point_a = [1000, -1000, 1000]
settings.region_of_interest.box.point_b = [-1000, 1000, 1000]
settings.region_of_interest.box.extents = [-1000, 1000]

settings.region_of_interest.depth.enabled = True
settings.region_of_interest.depth.range = [200, 2000]

settings.processing.filters.cluster.removal.enabled = True
settings.processing.filters.cluster.removal.max_neighbor_distance = 10
settings.processing.filters.cluster.removal.min_area = 100

settings.processing.filters.hole.repair.enabled = True
settings.processing.filters.hole.repair.hole_size = 0.2
settings.processing.filters.hole.repair.strictness = 1

settings.processing.filters.noise.removal.enabled = True
settings.processing.filters.noise.removal.threshold = 7.0

settings.processing.filters.noise.suppression.enabled = True
settings.processing.filters.noise.repair.enabled = True

settings.processing.filters.outlier.removal.enabled = True
settings.processing.filters.outlier.removal.threshold = 5.0

settings.processing.filters.reflection.removal.enabled = True
settings.processing.filters.reflection.removal.mode = (
    zivid.Settings.Processing.Filters.Reflection.Removal.Mode.global_
)

settings.processing.filters.smoothing.gaussian.enabled = True
settings.processing.filters.smoothing.gaussian.sigma = 1.5

settings.processing.filters.experimental.contrast_distortion.correction.enabled = True
settings.processing.filters.experimental.contrast_distortion.correction.strength = 0.4

settings.processing.filters.experimental.contrast_distortion.removal.enabled = False
settings.processing.filters.experimental.contrast_distortion.removal.threshold = 0.5

settings.processing.resampling.mode = zivid.Settings.Processing.Resampling.Mode.upsample2x2

settings.diagnostics.enabled = False

settings.color = settings_2d

_set_sampling_pixel(settings, camera)
print(settings)
print("Configuring acquisition settings different for all HDR acquisitions")
exposure_values = _get_exposure_values(camera)
for aperture, gain, exposure_time, brightness in exposure_values:
    settings.acquisitions.append(
        zivid.Settings.Acquisition(
            aperture=aperture,
            exposure_time=exposure_time,
            brightness=brightness,
            gain=gain,
        )
    )

settings_2d.acquisitions.append(
    zivid.Settings2D.Acquisition(
        aperture=2.83,
        exposure_time=timedelta(microseconds=10000),
        brightness=1.8,
        gain=1.0,
    )
)

Capture 2D3D

Now we can capture a 2D and 3D image (a point cloud with color). Whether a single acquisition or multiple acquisitions (HDR) are used is determined by the number of acquisitions in settings.

C++:

const auto frame = camera.capture2D3D(settings);
C#:

using (var frame = camera.Capture2D3D(settings))
Python:

with camera.capture_2d_3d(settings) as frame:

The Zivid::Frame (C++), Zivid.NET.Frame (C#), and zivid.Frame (Python) contain the point cloud, the color image, the capture information, and the camera information (all of which are stored in the compute device memory).
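
To bring the data into CPU memory for further processing, the point cloud and the color image can be copied out of the frame, for instance as NumPy arrays in Python (a sketch of typical usage):

point_cloud = frame.point_cloud()
xyz = point_cloud.copy_data("xyz")                # HxWx3 float32 array of points (mm)
rgba = frame.frame_2d().image_rgba().copy_data()  # HxWx4 uint8 color image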

Capture 3D

If we only want to capture 3D, that is, the point cloud without color, we can do so via the capture3D API.

C++:

const auto frame3D = camera.capture3D(settings);
C#:

using (var frame3D = camera.Capture3D(settings))
Python:

with camera.capture_3d(settings) as frame_3d:

Capture 2D

If we only want to capture a 2D image, which is faster than 3D, we can do so via the capture2D API.

C++:

const auto frame2D = camera.capture2D(settings);
C#:

using (var frame2D = camera.Capture2D(settings))
Python:

with camera.capture_2d(settings) as frame_2d:

Save

We can now save the results.

C++:

const auto dataFile = "Frame.zdf";
frame.save(dataFile);
C#:

var dataFile = "Frame.zdf";
frame.Save(dataFile);
Python:

data_file = "Frame.zdf"
frame.save(data_file)

The Frame.zdf file can be opened and viewed in Zivid Studio.

Export

In the next code example, the point cloud is exported to the .ply format. For other exporting options, see Point Cloud for a list of supported formats.

C++:

const auto dataFilePLY = "PointCloud.ply";
frame.save(dataFilePLY);
C#:

var dataFilePLY = "PointCloud.ply";
frame.Save(dataFilePLY);
Python:

data_file_ply = "PointCloud.ply"
frame.save(data_file_ply)

Load

A saved frame can be loaded from a ZDF file.

C++:

const auto dataFile = std::string(ZIVID_SAMPLE_DATA_DIR) + "/Zivid3D.zdf";
std::cout << "Reading ZDF frame from file: " << dataFile << std::endl;
const auto frame = Zivid::Frame(dataFile);
C#:

var dataFile =
    Environment.GetFolderPath(Environment.SpecialFolder.CommonApplicationData) + "/Zivid/Zivid3D.zdf";
Console.WriteLine("Reading ZDF frame from file: " + dataFile);
var frame = new Zivid.NET.Frame(dataFile);
Python:

data_file = get_sample_data_path() / "Zivid3D.zdf"
print(f"Reading point cloud from file: {data_file}")
frame = zivid.Frame(data_file)

Save 2D

We can get the 2D color image from the Frame2D object, which is part of the Frame obtained from capture2D3D().

C++:

const auto image2D = frame.frame2D().value().imageBGRA();

C#:

var image2D = frame.Frame2D.ImageBGRA();

Python:

image_2d = frame.frame_2d().image_bgra()

We can also get the 2D color image directly from the point cloud. This image has the same resolution as the point cloud.

C++:

const auto pointCloud = frame.pointCloud();
const auto image2DInPointCloudResolution = pointCloud.copyImageRGBA();

C#:

var pointCloud = frame.PointCloud;
var image2DInPointCloudResolution = pointCloud.CopyImageRGBA();

Python:

point_cloud = frame.point_cloud()
image_2d_in_point_cloud_resolution = point_cloud.copy_image("bgra")

2D captures also produce 2D color images in linear RGB and sRGB color space.

C++:

const auto imageRGBA = frame.frame2D().value().imageRGBA();
C#:

var imageRGBA = frame.Frame2D.ImageRGBA();
Python:

image_rgba = frame.frame_2d().image_rgba()

C++:

const auto imageSRGB = frame2D.imageSRGB();
C#:

var imageSRGB = frame2D.ImageSRGB();
Python:

image_srgb = frame_2d.image_srgb()

Then, we can save the 2D image in linear RGB or sRGB color space.

C++:

const auto imageFile = "ImageRGB.png";
std::cout << "Saving 2D color image (linear RGB color space) to file: " << imageFile << std::endl;
imageRGBA.save(imageFile);
C#:

var imageFile = "ImageRGB.png";
Console.WriteLine("Saving 2D color image (linear RGB color space) to file: " + imageFile);
imageRGBA.Save(imageFile);
Python:

image_file = "ImageRGBA.png"
print(f"Saving 2D color image (linear RGB color space) to file: {image_file}")
image_rgba.save(image_file)

C++:

const auto imageFile = "ImageSRGB.png";
std::cout << "Saving 2D color image (sRGB color space) to file: " << imageFile << std::endl;
imageSRGB.save(imageFile);
C#:

var imageFile = "ImageSRGB.png";
Console.WriteLine("Saving 2D color image (sRGB color space) to file: " + imageFile);
imageSRGB.Save(imageFile);
Python:

image_file = "ImageSRGB.png"
print(f"Saving 2D color image (sRGB color space) to file: {image_file}")
image_srgb.save(image_file)

Multithreading

Operations on a camera object are thread-safe, but other operations, such as listing cameras and connecting to cameras, must be executed sequentially. Learn more in Multithreading.
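
As an illustrative sketch only (not part of this tutorial's samples), a Python program respecting these rules could connect to all cameras sequentially and then capture from each camera in its own thread:

import threading
import zivid

app = zivid.Application()

# Listing and connecting must be done sequentially.
cameras = app.cameras()
for camera in cameras:
    camera.connect()

settings = zivid.Settings(
    acquisitions=[zivid.Settings.Acquisition()],
    color=zivid.Settings2D(acquisitions=[zivid.Settings2D.Acquisition()]),
)

def capture_and_save(camera: zivid.Camera, file_name: str) -> None:
    # Operations on an individual camera object are thread-safe.
    frame = camera.capture_2d_3d(settings)
    frame.save(file_name)

threads = [
    threading.Thread(target=capture_and_save, args=(camera, f"Frame_{i}.zdf"))
    for i, camera in enumerate(cameras)
]
for thread in threads:
    thread.start()
for thread in threads:
    thread.join()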

Conclusion

This tutorial showed how to use the Zivid SDK to connect to a Zivid camera, configure settings, capture, and save the result.