
orbbecsdk's Introduction

Orbbec SDK


The Orbbec 3D camera software development kit fully supports UVC, enabling driver-free plug-and-play. It provides simple, easy-to-use low-level and high-level APIs that developers can apply flexibly across different scenarios.

Additionally, the SDK is compatible with Orbbec's original OpenNI-protocol devices through built-in adaptation code, so developers can migrate to the Orbbec SDK and support both new and old products with a single code base.

If you are a user in China, it is recommended to use the Gitee mirror (Gitee repo).

What is included in the repository

  • library : Orbbec SDK core library files and C/C++ header files.
  • examples : C/C++ samples project source code.
  • doc : API reference documentation and sample documentation.
  • driver : Windows device drivers for OpenNI protocol devices (Dabai, Dabai DCW, Dabai DW, Astra Mini Pro, Astra Pro Plus, A1 Pro, Gemini E, Gemini E Lite, Gemini). Modules that use the standard UVC protocol do not require a driver.
  • scripts : Linux udev rules for resolving permission issues and Windows timestamp registration scripts for resolving timestamp and metadata issues.

License structure

The current software license structure is as follows:

(Diagram: SDK license structure)

Platform support

| Operating system | Requirement | Description |
| --- | --- | --- |
| Windows | Windows 10 April 2018 release (version 1803, OS build 17134), x64 or higher | VS project generation depends on the installed Visual Studio and CMake versions; VS2015/VS2017/VS2019 are supported |
| Linux | Ubuntu 16.04/18.04/20.04 (x64) | GCC 7.5 supported |
| Arm32 | Linux Ubuntu 16.04/18.04/20.04 | GCC 7.5 supported |
| Arm64 | Linux Ubuntu 18.04/20.04 | GCC 7.5 supported |
| MacOS | M-series chips: 11.0 and above; Intel x86 chips: 10.15 and above | Supported hardware: Gemini 2, Gemini 2 L, Astra 2, Gemini 2 XL, Femto Mega, G300 series |
  • Note: supported Arm platforms: Jetson Nano (arm64), AGX Orin (arm64), Orin NX (arm64), Orin Nano (arm64), A311D (arm64), Raspberry Pi 4 (arm64), Raspberry Pi 3 (arm32), RK3399 (arm64); other Arm systems may require cross-compilation.
  • Windows 11, Ubuntu 22.04, and other Linux platforms may also work, but have not been fully tested.
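For the Arm systems that need cross-compilation, a minimal CMake toolchain file can be used. This is a sketch only; the `aarch64-linux-gnu-*` compiler triplet is an assumption and depends on your distribution's cross toolchain:

```cmake
# toolchain-aarch64.cmake -- hypothetical cross-compile toolchain sketch
set(CMAKE_SYSTEM_NAME Linux)
set(CMAKE_SYSTEM_PROCESSOR aarch64)
set(CMAKE_C_COMPILER aarch64-linux-gnu-gcc)
set(CMAKE_CXX_COMPILER aarch64-linux-gnu-g++)
# search programs on the host, but headers/libraries only in the target sysroot
set(CMAKE_FIND_ROOT_PATH_MODE_PROGRAM NEVER)
set(CMAKE_FIND_ROOT_PATH_MODE_LIBRARY ONLY)
set(CMAKE_FIND_ROOT_PATH_MODE_INCLUDE ONLY)
```

It would then be passed to the configure step with `cmake -DCMAKE_TOOLCHAIN_FILE=toolchain-aarch64.cmake ..`.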

Product support

| Product | Firmware version |
| --- | --- |
| Gemini 335 | 1.2.20 |
| Gemini 335L | 1.2.20 |
| Femto Bolt | 1.0.6/1.0.9 |
| Femto Mega | 1.1.7/1.2.7 |
| Gemini 2 XL | Obox: V1.2.5, VL: 1.4.54 |
| Astra 2 | 2.8.20 |
| Gemini 2 L | 1.4.32 |
| Gemini 2 | 1.4.60/1.4.76 |
| Astra+ | 1.0.22/1.0.21/1.0.20/1.0.19 |
| Femto | 1.6.7 |
| Femto W | 1.1.8 |
| DaBai | 2436 |
| DaBai DCW | 2460 |
| DaBai DW | 2606 |
| Astra Mini Pro | 1007 |
| Gemini E | 3460 |
| Gemini E Lite | 3606 |
| Gemini | 3.0.18 |
| Astra Mini S Pro | 1.0.05 |

OrbbecViewer

OrbbecViewer is a tool based on the Orbbec SDK that can be used to view the data streams from an Orbbec camera and to control the camera.

Supported platforms: Windows x64, Linux x64 & ARM64, MacOS M series chip & Intel x86 chip

Download link: Releases

OrbbecViewer User Manual: OrbbecViewer User Manual

Getting started

Get source code

git clone https://github.com/orbbec/OrbbecSDK.git

Environment setup

  • Linux:

    Install udev rules file

    cd OrbbecSDK/misc/scripts
    sudo chmod +x ./install_udev_rules.sh
    sudo ./install_udev_rules.sh
    sudo udevadm control --reload && sudo udevadm trigger
  • Windows:

    Timestamp registration: follow this: obsensor_metadata_win10

  • For more information, please refer to: Environment Configuration

Examples

The sample code is located in the ./examples directory and can be built using CMake.

Build

cd OrbbecSDK && mkdir build && cd build && cmake .. && cmake --build . --config Release

Run example

Connect your Orbbec camera to your PC, then run:

cd OrbbecSDK/build/bin # build output dir
./OBMultiStream  # OBMultiStream.exe on Windows

Note: on MacOS, sudo privileges are required.

# MacOS
cd OrbbecSDK/build/bin # build output dir
sudo ./OBMultiStream

The following image shows the result of running MultiStream on a Gemini 2 device. Results on other devices may differ.

Multistream

Note: on Linux/Arm platforms, this sample must be built with OpenCV 4.2 or later; otherwise, the streams cannot be rendered.

Use Orbbec SDK in your CMake project

Find and link Orbbec SDK in your CMakeLists.txt file like this:

cmake_minimum_required(VERSION 3.1.15)
project(OrbbecSDKTest)

add_executable(${PROJECT_NAME} main.cpp)

# find Orbbec SDK
set(OrbbecSDK_DIR "/your/path/to/OrbbecSDK")
find_package(OrbbecSDK REQUIRED)

# link Orbbec SDK
target_link_libraries(${PROJECT_NAME} OrbbecSDK::OrbbecSDK)

Documents

Related links

orbbecsdk's People

Contributors

3ddaiwei, cool-wuzh, hzcyf, jian-dong, nan-orbbec3d-us, obwh, zhonghong322


orbbecsdk's Issues

Femto Bolt: Crash after "got nvram data failed.retrying..."

Hi, with a Femto Bolt device I am getting an issue that it sometimes fails to open. The issue can be reproduced with OrbbecViewer as follows:

  • Open OrbbecViewer with the device connected
  • Start the color and depth streams
  • Close the OrbbecViewer window
  • Repeat these three steps until the issue occurs

Roughly every 3rd time, OrbbecViewer will fail to open, with the following log output:

2024-03-23 13:14:22.156 INFO  [11368] [loggerInit@18] **********************************************
2024-03-23 13:14:22.156 INFO  [11368] [loggerInit@19]  OrbbecViewer launched! Welcome!! 
2024-03-23 13:14:22.156 INFO  [11368] [loggerInit@20]  	- Version: V1.9.5
2024-03-23 13:14:22.156 INFO  [11368] [loggerInit@21]  	- Author: 
2024-03-23 13:14:22.156 INFO  [11368] [loggerInit@22]  	- E-Mail: 
2024-03-23 13:14:22.156 INFO  [11368] [loggerInit@23]  	- Company: orbbec
2024-03-23 13:14:22.156 INFO  [11368] [loggerInit@24]  	- Website: http://www.orbbec.com.cn/
2024-03-23 13:14:22.156 INFO  [11368] [loggerInit@25] **********************************************
[03/23 13:14:22.285349][info][11368][Context.cpp:69] Context created with config: /home/thomas/Downloads/OrbbecViewer_v1.9.5_202403050503_linux_x64_release/OrbbecSDKConfig_v1.0.xml
[03/23 13:14:22.285362][info][11368][Context.cpp:74] Context work_dir=/home/thomas/Downloads/OrbbecViewer_v1.9.5_202403050503_linux_x64_release
[03/23 13:14:22.285368][info][11368][Context.cpp:77] 	- SDK version: 1.9.5
[03/23 13:14:22.285373][info][11368][Context.cpp:78] 	- SDK stage version: main
[03/23 13:14:22.285381][info][11368][Context.cpp:82] get config EnumerateNetDevice:false
[03/23 13:14:22.285390][info][11368][LinuxPal.cpp:38] createObPal: create LinuxPal!
[03/23 13:14:22.383854][info][11368][LinuxPal.cpp:112] Create PollingDeviceWatcher!
[03/23 13:14:22.383884][info][11368][DeviceManager.cpp:15] Current found device(s): (1)
[03/23 13:14:22.383890][info][11368][DeviceManager.cpp:24] 	- Name: Femto Bolt, PID: 0x066b, SN/ID: CL8FC3100E2, Connection: USB3.1
[03/23 13:14:22.383916][info][11368][DeviceManager.cpp:310] Enable net device enumeration: true
[03/23 13:14:22.384039][info][11368][GVCPClient.cpp:223] bind 0.0.0.0:0
[03/23 13:14:23.385369][info][11368][DeviceManager.cpp:15] Current device(s) list: (1)
[03/23 13:14:23.385388][info][11368][DeviceManager.cpp:24] 	- Name: Femto Bolt, PID: 0x066b, SN/ID: CL8FC3100E2, Connection: USB3.1
[03/23 13:14:23.385451][info][11368][FemtoBoltUvcDevice.cpp:23] FemtoBoltUvcDevice init ...
[03/23 13:14:23.385505][info][11368][FemtoBoltUvcDevice.cpp:120] Create command start!
[03/23 13:14:23.386569][info][11368][MSDEConverterDevice.cpp:721] Succeed to load depth engine plugin
[03/23 13:14:23.592557][info][11368][FemtoBoltUvcDevice.cpp:271] Create command done!
[03/23 13:14:23.592576][info][11368][FemtoBoltUvcDevice.cpp:431] init sensor map start!
[03/23 13:14:23.592589][info][11368][FemtoBoltUvcDevice.cpp:458] init sensor map done!
[03/23 13:14:23.593259][info][11368][AbstractDevice.cpp:124] 	- Firmware version: 1.1.0
[03/23 13:14:23.593308][info][11368][FemtoBoltUvcDevice.cpp:275] Init depth process param start!
[03/23 13:14:23.934336][info][11368][MSDEConverterDevice.cpp:781] got nvram data failed.retrying...

[The "got nvram data failed.retrying..." line repeats some 10 thousand times ...]

[03/23 13:14:28.428688][info][11368][MSDEConverterDevice.cpp:781] got nvram data failed.retrying...
[03/23 13:14:28.428746][info][11368][MSDEConverterDevice.cpp:772] got nvram data succeed.
[03/23 13:14:28.428767][info][11368][ObUvcDevice.cpp:112] endpoint:130
[03/23 13:14:28.428828][info][11368][ObUvcDevice.cpp:118] libusb_clear_halt done, endpoint:130
munmap_chunk(): invalid pointer

This issue also happens when allocating an ob::Device in my application that uses the Orbbec SDK.

I am on Linux, with the Orbbec SDK V1.9.5, and firmware version 1.1.0 for the Femto Bolt.
Edit: In case it matters, the camera is connected to the PC with the original USB cable.
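Until the root cause of the failing NVRAM read is fixed, one user-side mitigation is to wrap device opening in a bounded retry loop. Below is a generic sketch; `retryWithBackoff` and the way you create your `ob::Device` are placeholders, and the only assumption about the SDK is that device creation throws on failure:

```cpp
#include <chrono>
#include <functional>
#include <stdexcept>
#include <thread>

// Call `action` up to `maxAttempts` times, sleeping between attempts.
// Returns the result of the first successful call, or rethrows the last error.
template <typename T>
T retryWithBackoff(const std::function<T()> &action, int maxAttempts,
                   std::chrono::milliseconds delay) {
    for (int attempt = 1;; ++attempt) {
        try {
            return action();
        } catch (const std::exception &) {
            if (attempt >= maxAttempts) throw;  // give up after the last attempt
            std::this_thread::sleep_for(delay);
        }
    }
}
```

Usage would look something like `retryWithBackoff<std::shared_ptr<ob::Device>>([&] { return deviceList->getDevice(0); }, 3, std::chrono::milliseconds(500));`, keeping each attempt's failure contained instead of crashing.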

Femto Mega 3840 x 2160 resolution on ethernet.

We are trying to use the Femto Mega over the Ethernet interface at 3840x2160 color resolution. Over the USB-C interface I can get the full 3840x2160 resolution using the MJPG format. Over Ethernet, however, that format is not available and the camera forces the H264 format, which the Orbbec SDK doesn't seem to support.

Is there any plan to make this work? We would really like to use the Ethernet interface instead of USB to save bandwidth.

Hardware trigger model for Femto Mega

Hello, I'm trying to set up a multi-view system and get hardware-synchronized streams.
The idea is to have one device in OB_MULTI_DEVICE_SYNC_MODE_PRIMARY and the rest in OB_MULTI_DEVICE_SYNC_MODE_HARDWARE_TRIGGERING mode, but I get an error when trying to set up the latter.

What would be the correct way to go about it?

Frame index abnormal warnings on Femto Mega with ethernet

According to the warnings, there are dropped frames (example: #206 [WARN] Frame index abnormal, prevFrameIndex=89557, curFrameIndex=89559) when used over Ethernet. As far as I can see, no bottleneck is present: OrbbecViewer uses ~50% of a single CPU core, and the bandwidth is ~27 MB/s after a gigabit handshake, so it is far from saturated. There are no Ethernet messages in dmesg, so I assume things work as they should at the hardware level.

Is this a false error report? Or should I play around with priorities/formats/etc.?
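The per-frame gaps from those warnings can be accumulated into an overall drop rate with a small helper. This is only a sketch; it assumes you can read a monotonically increasing frame index from each received frame (as the warning messages themselves do):

```cpp
#include <cstdint>

// Tracks dropped frames from a monotonically increasing frame index.
struct FrameDropCounter {
    bool     hasPrev  = false;
    uint64_t prev     = 0;
    uint64_t received = 0;
    uint64_t dropped  = 0;

    void onFrame(uint64_t index) {
        ++received;
        if (hasPrev && index > prev + 1) {
            dropped += index - prev - 1;  // e.g. 89557 -> 89559 drops 1 frame
        }
        prev    = index;
        hasPrev = true;
    }

    // Fraction of expected frames that never arrived.
    double dropRate() const {
        const uint64_t total = received + dropped;
        return total ? static_cast<double>(dropped) / total : 0.0;
    }
};
```

Logging `dropRate()` once per second would show whether the warnings amount to occasional single-frame losses or a sustained problem.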

Femto Bolt undocumented 3840x2160@30fps mode?

According to the Femto Bolt product page and data sheet, the color camera supports "Up to 3840x2160@25fps".

However, I noticed that when querying all supported video profiles using the SDK, I also see 3840x2160@30fps modes in addition to 25fps.

I've tested the 3840x2160@30fps BGRA format mode, and it appears to work correctly. I see a time delta between frames of ~33ms, and the resolution is indeed 2160p.

This is a welcome surprise, but is it too good to be true? Is there any reason not to use this 30fps mode? Is something happening in software to "upscale" the 25fps stream to 30fps, or is it a true 30fps?
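One way to tell a genuine 30 fps stream from a repeated/upsampled 25 fps one is to look at the frame-to-frame timestamp deltas: a true 30 fps stream clusters tightly around ~33.3 ms, while duplicated frames show up as near-zero deltas mixed with larger gaps. A sketch of such a check (the timestamps are assumed to come from the frames, e.g. `frame->timeStampUs()`, and the tolerance value is an arbitrary choice):

```cpp
#include <cmath>
#include <cstdint>
#include <vector>

// Returns true if every consecutive timestamp delta (microseconds) stays
// within `toleranceUs` of the period implied by `expectedFps`.
bool isSteadyFrameRate(const std::vector<uint64_t> &timestampsUs,
                       double expectedFps, double toleranceUs) {
    if (timestampsUs.size() < 2) return false;
    const double expectedDeltaUs = 1e6 / expectedFps;  // ~33333 us at 30 fps
    for (size_t i = 1; i < timestampsUs.size(); ++i) {
        const double delta =
            static_cast<double>(timestampsUs[i] - timestampsUs[i - 1]);
        if (std::fabs(delta - expectedDeltaUs) > toleranceUs) return false;
    }
    return true;
}
```

Collecting a few seconds of color-frame timestamps and running them through this would distinguish a steady 33 ms cadence from a 25 fps stream padded with duplicates.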

Femto Mega color to depth transformation matrix has numerical issues

Ubuntu 20.04 LTS, OrbbecSDK 1.9.4, Femto Mega Firmware 1.2.8

Here is minimal example (mainly just matrix util stuff which you can ignore):

#include <array>
#include <iostream>
#include "utils.hpp"
#include "libobsensor/ObSensor.hpp"
#include "libobsensor/hpp/Error.hpp"

constexpr double EPSILON = 1e-4;

/** ------------- Matrix utils, feel free to replace with Eigen --------------- */
using Matrix3d = std::array<std::array<double, 3>, 3>;

constexpr Matrix3d IDENTITY_3x3 = {{
    { 1, 0, 0 },
    { 0, 1, 0 },
    { 0, 0, 1 }
}};

Matrix3d getRotationMatrix(const OBTransform &extrinsic) {
    auto &R = extrinsic.rot;
    return {{
        { R[0], R[1], R[2] },
        { R[3], R[4], R[5] },
        { R[6], R[7], R[8] },
    }};
}

Matrix3d matrixMultiply(const Matrix3d &a, const Matrix3d &b) {
    Matrix3d r;
    for (size_t i = 0; i < 3; ++i) {
        for (size_t j = 0; j < 3; ++j) {
            r.at(i).at(j) = 0;
            for (size_t k = 0; k < 3; ++k) {
                r.at(i).at(j) += a.at(i).at(k) * b.at(k).at(j);
            }
        }
    }
    return r;
}

Matrix3d matrixMinus(const Matrix3d &a, const Matrix3d &b) {
    Matrix3d r;
    for (size_t i = 0; i < 3; ++i) {
        for (size_t j = 0; j < 3; ++j) {
            r.at(i).at(j) = a.at(i).at(j) - b.at(i).at(j);
        }
    }
    return r;
}

Matrix3d matrixTranspose(const Matrix3d &m) {
    Matrix3d r;
    for (size_t i = 0; i < 3; ++i) {
        for (size_t j = 0; j < 3; ++j) {
            r.at(i).at(j) = m.at(j).at(i);
        }
    }
    return r;
}

// Sum of squared entries: the squared Frobenius norm (Eigen's matrix.squaredNorm())
double matrixNorm(const Matrix3d &m) {
    double r = 0;
    for (size_t i = 0; i < 3; ++i) {
        for (size_t j = 0; j < 3; ++j) {
            r += m.at(i).at(j) * m.at(i).at(j);
        }
    }
    return r;
}

/** ------------- Matrix utils, feel free to replace with Eigen --------------- */

void checkRotationMatrix(const Matrix3d &R) {
    // Check that R * R^T = I
    const Matrix3d Rt = matrixTranspose(R);
    const Matrix3d RRt = matrixMultiply(R, Rt);
    const Matrix3d zero = matrixMinus(IDENTITY_3x3, RRt);
    const double norm = matrixNorm(zero);
    std::cout << "norm(I - R*R^T) = " << norm << std::endl;

    if (norm > EPSILON) {
        std::cerr << "Invalid rotation matrix: I - R * R^T = " << norm << " > " << EPSILON << std::endl;
    }
}

int main(int argc, char **argv) {
    ob::Context::setLoggerSeverity(OB_LOG_SEVERITY_OFF);

     // Create a pipeline with default device
    ob::Pipeline pipe;

    // Configure which streams to enable or disable for the Pipeline by creating a Config
    std::shared_ptr<ob::Config> config = std::make_shared<ob::Config>();

    std::shared_ptr<ob::VideoStreamProfile> colorProfile = nullptr;
    try {
        auto profiles = pipe.getStreamProfileList(OB_SENSOR_COLOR);
        try {
            colorProfile = profiles->getVideoStreamProfile(1280, OB_HEIGHT_ANY, OB_FORMAT_RGB, 30);
        }
        catch(ob::Error &e) {
            colorProfile = std::const_pointer_cast<ob::StreamProfile>(profiles->getProfile(OB_PROFILE_DEFAULT))->as<ob::VideoStreamProfile>();
        }
        config->enableStream(colorProfile);
    }
    catch(ob::Error &e) {
        exit(EXIT_FAILURE);
    }

    std::shared_ptr<ob::VideoStreamProfile> depthProfile = nullptr;
    try {
        auto profiles = pipe.getStreamProfileList(OB_SENSOR_DEPTH);
        try {
            depthProfile = profiles->getVideoStreamProfile(OB_WIDTH_ANY, OB_HEIGHT_ANY);
        }
        catch(ob::Error &e) {
            depthProfile = std::const_pointer_cast<ob::StreamProfile>(profiles->getProfile(OB_PROFILE_DEFAULT))->as<ob::VideoStreamProfile>();
        }
        config->enableStream(depthProfile);
    }
    catch(ob::Error &e) {
        exit(EXIT_FAILURE);
    }
    pipe.switchConfig(config);

    const OBCameraParam &calibration = pipe.getCameraParam();
    checkRotationMatrix(getRotationMatrix(calibration.transform));

    // Same issue here...
    auto param = pipe.getCalibrationParam(config);
    OBTransform colorToDepth = param.extrinsics[OB_SENSOR_COLOR][OB_SENSOR_DEPTH];
    checkRotationMatrix(getRotationMatrix(colorToDepth));

    return 0;
}

With Femto Mega I get

norm(I - R*R^T) = 0.000110361
Invalid rotation matrix: I - R * R^T = 0.000110361 > 0.0001

Astra 2 gives:

norm(I - R*R^T) = 4.50559e-16

The Femto Mega rotation matrix causes numerical issues for us.
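Note that `matrixNorm` in the snippet above returns the sum of squares without the final square root, so the printed values are squared Frobenius norms; with the square root applied, the Femto Mega deviation is about 1e-2, which is genuinely large for a rotation matrix. A self-contained restatement of the orthogonality check with the square root included (the tolerance in the test is an arbitrary choice):

```cpp
#include <array>
#include <cmath>

using Mat3 = std::array<std::array<double, 3>, 3>;

// Frobenius norm of (I - R * R^T); zero for an exact rotation matrix.
double orthogonalityError(const Mat3 &R) {
    double sumSq = 0.0;
    for (int i = 0; i < 3; ++i) {
        for (int j = 0; j < 3; ++j) {
            double rrT = 0.0;  // entry (i, j) of R * R^T
            for (int k = 0; k < 3; ++k) rrT += R[i][k] * R[j][k];
            const double identity = (i == j) ? 1.0 : 0.0;
            sumSq += (identity - rrT) * (identity - rrT);
        }
    }
    return std::sqrt(sumSq);
}
```

By this measure the Astra 2 extrinsics above are orthogonal to machine precision, while the Femto Mega ones are off by roughly a hundredth, consistent with the reported numerical issues.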

Femto Bolt: timestamps always zero until using Orbbec Viewer application.

I just got started using the Orbbec SDK with the Femto Bolt.
With brand new sensors, I've noticed that timestamps are always reported as zero in my program (and also in SDK examples that report timestamps, such as the HotPlugin example).

However, when testing the devices using the OrbbecViewer application, I see that timestamps are reported, and then after using a device that has previously been opened using the OrbbecViewer with my own application, timestamps are now working. I have reproduced this behavior on two separate sensors, and the third sensor I am intentionally leaving in the broken state, not using it with the Orbbec Viewer so that I can hopefully fix the issue within my own application.

I'm hoping the Orbbec team can reproduce this issue, and provide guidance for how to initialize the sensor properly such that timestamps are not reported as zero. If the source code for the OrbbecViewer were available I could probably figure this out myself too, but I understand if there are reasons not to open up that codebase.

hang on exit - 1.8.1 sdk

Experiencing this on Linux and Mac.

If I exit my app while the camera ( Femto Mega ) is running I get a hang in one of the non main threads at pthread_kill and the following message printed out:
libc++abi: terminating due to uncaught exception of type std::__1::system_error: mutex lock failed: Invalid argument

The main thread while not crashed is stuck here:

    try {
        if (mPipe) {
            mPipe->stop();  // <------- stuck here
            mPipe.reset();
            pointCloud.reset();
        }
    } catch (ob::Error &e) {
        std::cerr << "function:" << e.getName() << "\nargs:" << e.getArgs() << "\nmessage:" << e.getMessage() << "\ntype:" << e.getExceptionType() << std::endl;
    }

Is the OrbbecSDK listening for signals and terminating threads in some weird way?
As this only seems to happen on my app's exit.

If I delete my camera object while my app is running I get zero issues, it is only if I try and close and delete my camera on exit that this issue seems to occur.

Maybe this is still unresolved from #12 ?

The only useful log info seems to be:

[*** LOG ERROR #0001 ***] [2023-11-29 11:30:35] [OrbbecSDK] {mutex lock failed: Invalid argument}
libc++abi: terminating due to uncaught exception of type std::__1::system_error: mutex lock failed: Invalid argument

Femto Bolt sensors frequently appearing connected as USB2.1 while using USB 3.1 ports

On Windows, after rebooting sensors, sometimes (quite often in fact) they will show up with USB2.1 connections when they become reconnected. This requires physically disconnecting and reconnecting the sensors, power cycling, or sometimes even rebooting the host PC in order to resolve the issue.

The issue occurs with both v1.0.6 and v1.0.9 firmware

is it normal that Orbbec Bolt reports depth intrinsic fx the same value regardless of mode?

I wrote the following code:

    auto paramlist = dev->getCalibrationCameraParamList();
    for (int i=0;i<paramlist->count();i++) {
        OBCameraParam param = paramlist->getCameraParam(i);
        std::cout << param.rgbDistortion.k1 << " " <<param.rgbDistortion.k2 << " " <<param.rgbDistortion.k3 << " " <<param.rgbDistortion.k4 << " " <<param.rgbDistortion.k5 << " "<<param.rgbDistortion.k6 << " " << std::endl;
        std::cout << param.rgbIntrinsic.fx << " " <<param.rgbIntrinsic.fy << " " <<param.rgbIntrinsic.cx << " " <<param.rgbIntrinsic.cy << " " <<param.rgbIntrinsic.width << " " <<param.rgbIntrinsic.height << " " <<  std::endl;
        std::cout << param.depthDistortion.k1 << " " <<param.depthDistortion.k2 << " " <<param.depthDistortion.k3 << " " <<param.depthDistortion.k4 << " " <<param.depthDistortion.k5 << " "<<param.depthDistortion.k6 << " " <<std::endl;
        std::cout << param.depthIntrinsic.fx << " " <<param.depthIntrinsic.fy << " " <<param.depthIntrinsic.cx << " " <<param.depthIntrinsic.cy << " " <<param.depthIntrinsic.width << " " <<param.depthIntrinsic.height << " " <<  std::endl;
    }

and the output is the following:

0.0784412 -0.110434 0.0465312 0 0 0
373.929 373.583 317.359 186.263 640 360
8.68957 4.46817 0.172922 9.01699 7.37854 1.05598
505.284 505.233 530.666 505.513 1024 1024
0.0784412 -0.110434 0.0465312 0 0 0
498.572 498.111 316.479 248.351 640 480
8.68957 4.46817 0.172922 9.01699 7.37854 1.05598
505.284 505.233 530.666 505.513 1024 1024
0.0784412 -0.110434 0.0465312 0 0 0
373.929 373.583 317.359 186.263 640 360
8.68957 4.46817 0.172922 9.01699 7.37854 1.05598
505.284 505.233 338.666 325.513 640 576
0.0784412 -0.110434 0.0465312 0 0 0
498.572 498.111 316.479 248.351 640 480
8.68957 4.46817 0.172922 9.01699 7.37854 1.05598
505.284 505.233 338.666 325.513 640 576

I noticed that the depth fx and fy are exactly the same for the 1024x1024 and 640x576 modes, which is odd.
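Identical fx/fy with shifted principal points is what you would expect if the smaller depth mode is a crop of the same sensor readout at the same focal length, which would make the matching focal lengths unsurprising. As a quick arithmetic check on the reported numbers (an observation about the values, not a statement of Orbbec's documented calibration model): the horizontal shift, 530.666 − 338.666 = 192 px, matches a centered 1024→640 crop exactly, while the vertical shift of 180 px differs from the 224 px a centered 1024→576 crop would give, suggesting an off-center vertical crop. A small helper makes this check explicit:

```cpp
#include <cmath>

struct Intrinsics {
    double fx, fy, cx, cy;
    int    width, height;
};

// True if `small` looks like a centered crop of `large`: same focal length,
// principal point shifted by half the resolution difference.
bool isCenteredCrop(const Intrinsics &large, const Intrinsics &small,
                    double tolPx = 0.5) {
    return std::fabs(large.fx - small.fx) < tolPx &&
           std::fabs(large.fy - small.fy) < tolPx &&
           std::fabs((large.cx - small.cx) -
                     (large.width - small.width) / 2.0) < tolPx &&
           std::fabs((large.cy - small.cy) -
                     (large.height - small.height) / 2.0) < tolPx;
}
```

Run against the values above, this returns false purely because of the vertical offset, which narrows the question to why the 640x576 crop is not vertically centered.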

macOS - uvc_open already opened

Hi,

I just received a Femto Mega and was trying out some of the samples from the SDK.

For the DepthViewer.cpp the camera is seen but it seems to hang on getVideoStreamProfile.

[2023-09-19 19:33:37.436493][info][37890418][Context.cpp:66] Context created with config: default config!
[2023-09-19 19:33:37.447971][info][37890418][DeviceManager.cpp:562] Found 1 device(s):
[2023-09-19 19:33:37.447983][info][37890418][DeviceManager.cpp:564]   - Name: Femto Mega, PID: 0x0669, SN/ID: CL2K83P001W
[2023-09-19 19:33:37.448009][info][37890418][MacPal.cpp:103] Create PollingDeviceWatcher!
[2023-09-19 19:33:37.448037][info][37890418][Pipeline.cpp:15] Try to create pipeline with default device.
[2023-09-19 19:33:52.262518][warning][37890418][ObUvcDevice.cpp:51] uvc_open  path=2-2-1.2 already opened
[2023-09-19 19:33:52.262867][warning][37890418][ObUvcDevice.cpp:54] uvc_open  path=2-2-1.2 failed,return res:$-3
[2023-09-19 19:33:52.264594][info][37891082][DeviceManager.cpp:117] task finish.
[2023-09-19 19:33:52.264734][info][37890418][Context.cpp:82] Context destroyed
function:Pipeline
args:nullptr
message:uvc_open  path=2-2-1.2 failed,return res-3
type:0
Program ended with exit code: 1

This is on a M1 MacBook Pro with MacOs 13.4.1
The error suggests the device is already open, but it has not been opened anywhere else.

Happens no matter what port I use and even if I power cycle the camera.

Thanks!
Theo

OrbbecViewer camera settings UI occludes firmware update field/button when multiple sensors are connected to the host computer

When using Femto Bolt sensors with Depthkit Studio, it is necessary to have multiple sensors simultaneously connected to one host PC. I ran into an issue while trying to update the firmware on ten Femto Bolts where the firmware update elements in the OrbbecViewer interface are inaccessible, requiring the user to physically disconnect all but one sensor at a time, slowing down the update process considerably. To reproduce:

  • Connect multiple Femto Bolts to the same host PC
  • Open OrbbecViewer
  • Place one of the sensors in Recovery Mode to update firmware
  • Observe that while OrbbecViewer does detect that sensor in Recovery Mode, a different sensor becomes active in the Camera Settings interface, which occludes the firmware path field and update buttons (see attached screenshot)
  • The firmware path field and update buttons are not accessible until all sensors except for the one sensor which is in Recovery Mode are physically disconnected from the PC.

This requires access to the physical USB connection of each sensor, which makes firmware updating more challenging and slower, compared to the Azure Kinect Firmware Tool which allows all of the sensors connected to the PC to be updated in automatic succession with one command-line command.

1.7.x SDK / libs for macOS

I have been working on an ofxOrbbec addon for https://github.com/openframeworks/openFrameworks
It is working now on mac / windows and linux.

But currently I am having to include the 1.6 dylibs for macOS alongside the 1.7.x libs for Windows and Linux.

There aren't any issues using the 1.6 libs with the include from 1.7 but I do get the hang on exit issue from #12

Is it possible to release macOS libs alongside Linux / Window for new releases?

Going to order an Astra 2 so I can test on mac properly but this issue is the only thing stopping us using Orbbec SDK for our Mac projects.

Hardware sync device getting into fault state

Hello, I'm trying to integrate hardware sync into my app. I reproduced the code under examples/MultiDeviceSync as closely as my app's architecture allows, and it's pretty close.
After I start the devices, the secondary device gets into a fault state (orange blinking), and as far as I can see I get no depth or IR frames, though I do get color.
Logging doesn't show any errors or warnings.
The key difference from the examples is that I have one object per device, and they don't share an ob::Context; each object creates its own. Apart from that, I followed the examples pretty closely.
Any clue about that?

Femto Mega cameras and IMU have inconsistent device timestamps

Hey, ob::ColorFrame::timeStampUs() and ob::DepthFrame::timestampUs() return timestamps like 1623497648.304, whereas AccelFrame/GyroFrame::timestampUs() start from 0 when the device is started (i.e. they use different clocks and are inconsistent). In addition, the timestamps appear to have 1 millisecond resolution (all timestamps seem to be rounded to the nearest millisecond). I also tested the Astra 2, and it did not have this issue.

Some additional questions:

  1. Does the image device timestamp correspond to the start/middle/end of image exposure? We would like to get the middle point of exposure.
  2. Is there a way to get the ob::ColorFrame exposure time? (Not so critical if the timestamp is the middle of the exposure.)
  3. IMU samples are batched (i.e. ~10 gyro or accel samples arrive at once). Ideally we would disable this behaviour (for instance, OAK-D devices have an option to set the IMU batch size). This also happens on the Astra 2.

Here is an example of the IMU & RGB-D data I have serialized in case it is helpful:

"sensor":{"type":"gyroscope","values":[0.002128450432792306,-0.015963377431035042,0.003192675532773137]},"time":10.658999999999999}
{"sensor":{"type":"gyroscope","values":[0.006385351065546274,-0.018091827630996704,0.002128450432792306]},"time":10.66}
{"sensor":{"type":"gyroscope","values":[0.002128450432792306,-0.015963377431035042,0.002128450432792306]},"time":10.661}
{"sensor":{"type":"gyroscope","values":[0.004256900865584612,-0.01915605366230011,0.001064225216396153]},"time":10.661999999999999}
{"sensor":{"type":"gyroscope","values":[0.002128450432792306,-0.015963377431035042,0.004256900865584612]},"time":10.663}
{"sensor":{"type":"gyroscope","values":[0.004256900865584612,-0.015963377431035042,0.003192675532773137]},"time":10.664}
{"sensor":{"type":"gyroscope","values":[0.006385351065546274,-0.017027603462338448,0.002128450432792306]},"time":10.665}
{"sensor":{"type":"gyroscope","values":[0.004256900865584612,-0.018091827630996704,0.003192675532773137]},"time":10.666}
{"sensor":{"type":"gyroscope","values":[0.005321125965565443,-0.015963377431035042,0.005321125965565443]},"time":10.667}
{"sensor":{"type":"gyroscope","values":[0.004256900865584612,-0.01915605366230011,0.005321125965565443]},"time":10.668}
{"sensor":{"type":"accelerometer","values":[0.2679687440395355,0.00957031175494194,10.048828125]},"time":10.658999999999999}
{"sensor":{"type":"accelerometer","values":[0.21054686605930328,-0.00478515587747097,10.020116806030273]},"time":10.66}
{"sensor":{"type":"accelerometer","values":[0.24404296278953552,-0.03828124701976776,9.98183536529541]},"time":10.661}
{"sensor":{"type":"accelerometer","values":[0.24404296278953552,-0.00478515587747097,10.0009765625]},"time":10.661999999999999}
{"sensor":{"type":"accelerometer","values":[0.22490233182907104,0.0,10.015331268310547]},"time":10.663}
{"sensor":{"type":"accelerometer","values":[0.22490233182907104,0.00957031175494194,10.0009765625]},"time":10.664}
{"sensor":{"type":"accelerometer","values":[0.24882811307907104,0.014355468563735485,10.010546684265137]},"time":10.665}
{"sensor":{"type":"accelerometer","values":[0.24404296278953552,0.00478515587747097,9.98183536529541]},"time":10.666}
{"sensor":{"type":"accelerometer","values":[0.24882811307907104,0.0,9.996191024780273]},"time":10.667}
{"sensor":{"type":"accelerometer","values":[0.2679687440395355,0.0,10.00576114654541]},"time":10.668}
{"frames":[{"cameraInd":0,"colorFormat":"rgb","height":720,"type":"depth","width":1280},{"aligned":false,"cameraInd":1,"colorFormat":"gray16","depthScale":0.001,"features":[],"hasDepth":true,"height":576,"number":1,"type":"depth","width":640}],"number":1,"time":1623497648.304}

OrbbecSDK and librealsense conflicting

I've discovered that when using the OrbbecSDK and librealsense in the same project, I encounter some unhandled exceptions in the following cases:

  • If I start my application without any sensors connected. Starting the application with sensors connected does not yield an exception.
  • When connecting or disconnecting an Orbbec sensor (in my case the Femto Bolt)
  • At application close, when attempting to clean up the global RealSense context.

Here's the call stack of the crash (screenshot attached). It is identical in all cases.

After upgrading librealsense to the current development branch, it turns out the exception on shutdown is now handled, but all other cases remain unhandled.

net enumeration doesn't work with femto mega

I'm trying to use the ob_enable_net_device_enumeration command with my Femto Mega camera, but I don't get any response from the camera. I can see the GVCP packets being broadcast, and I can connect to my camera with the IP address, but it won't show up in the device list. This happens with the Orbbec viewer app or my own code.

I don't know if this is because my camera is still 1.1.5 firmware, or if the feature isn't expected to work. I tried to update my firmware using the viewer app, but I can't find the newer version.

Not working on Jetson Orin Nano + Orbbec Gemini 2

Hi
I have tried running the OrbbecSDK on a Jetson Orin Nano with a Gemini 2 camera, but it doesn't show any data from the camera.
The camera connection is successful, but there is no data from the color frames or video frames. It works on a Jetson Nano without issues.
The stdout files are attached.
log.zip

Examples build INSTALL target fails on 1.8.3 (Windows)

Build started at 7:49 PM...
1>------ Build started: Project: INSTALL, Configuration: Release x64 ------
1>1>
1>-- Install configuration: "Release"
1>CMake Error at c/Sample-HelloOrbbec/cmake_install.cmake:39 (file):
1>  file cannot create directory: C:/Program Files/OrbbecSDK-Samples/bin.
1>  Maybe need administrative privileges.
1>Call Stack (most recent call first):
1>  c/cmake_install.cmake:37 (include)
1>  cmake_install.cmake:37 (include)
1>
1>
1>C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppCommon.targets(166,5): error MSB3073: The command "setlocal
1>C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppCommon.targets(166,5): error MSB3073: "C:\Program Files\CMake\bin\cmake.exe" -DBUILD_TYPE=Release -P cmake_install.cmake
1>C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppCommon.targets(166,5): error MSB3073: if %errorlevel% neq 0 goto :cmEnd
1>C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppCommon.targets(166,5): error MSB3073: :cmEnd
1>C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppCommon.targets(166,5): error MSB3073: endlocal & call :cmErrorLevel %errorlevel% & goto :cmDone
1>C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppCommon.targets(166,5): error MSB3073: :cmErrorLevel
1>C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppCommon.targets(166,5): error MSB3073: exit /b %1
1>C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppCommon.targets(166,5): error MSB3073: :cmDone
1>C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppCommon.targets(166,5): error MSB3073: if %errorlevel% neq 0 goto :VCEnd
1>C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppCommon.targets(166,5): error MSB3073: :VCEnd" exited with code 1.
1>Done building project "INSTALL.vcxproj" -- FAILED.
========== Build: 0 succeeded, 1 failed, 34 up-to-date, 0 skipped ==========
========== Build completed at 7:49 PM and took 00.249 seconds ==========

The INSTALL target fails with the above output.
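The "Maybe need administrative privileges" message suggests the install step is failing because `C:/Program Files` requires elevation. A common workaround (generic CMake practice, not Orbbec-specific) is either to run Visual Studio as administrator, or to regenerate the project with a user-writable `CMAKE_INSTALL_PREFIX` (the path below is an example; pick any directory you can write to):

```shell
# Re-run CMake with an install prefix the current user can write to,
# then rebuild the INSTALL target.
cmake -DCMAKE_INSTALL_PREFIX="C:/Users/%USERNAME%/OrbbecSDK-Samples" ..
cmake --build . --config Release --target INSTALL
```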

Write buffer overflow in OpenNIPacketProcessor

Hello.
We are using a Gemini E camera for depth detection.
However, we sometimes fail to receive image buffers.
The OrbbecSDK log in this situation is as follows.

[1970-01-01 09:12:30.871355][warning][2910][OpenNIDeviceInfo.cpp:179] New openni device matched.
[1970-01-01 09:12:30.871416][info][2910][DeviceManager.cpp:562] Found 2 device(s):
[1970-01-01 09:12:30.871423][info][2910][DeviceManager.cpp:564]   - Name: Gemini E, PID: 0x065C, SN/ID: [PRIVATE]
[1970-01-01 09:12:30.871477][info][2910][DeviceManager.cpp:564]   - Name: Gemini E, PID: 0x065C, SN/ID: [PRIVATE]
[1970-01-01 09:12:30.871570][info][2910][LinuxPal.cpp:109] Create PollingDeviceWatcher!
[1970-01-01 09:12:30.873247][info][2910][OpenNIHostProtocol.cpp:528] Hardware versions: FW=5.8.23 (14), HW=0, Chip=7, Sensor=0, SYS=12
[1970-01-01 09:12:30.874439][error][2910][OpenNIHostProtocol.cpp:545] Get usb core type failed!
[1970-01-01 09:12:30.882812][info][2910][OpenNISensorFirmware.cpp:1020] Sensor serial number:[PRIVATE]
[1970-01-01 09:12:30.885563][info][2910][OpenNISensorFirmware.cpp:1048] Firmware version RD3480
[1970-01-01 09:12:30.886888][info][2910][OpenNISensorFirmware.cpp:1054] Device frequency 31.25
[1970-01-01 09:12:31.297756][info][2910][OpenNISensorFirmware.cpp:145] OpenNI2 camera support Watchdog function.
[1970-01-01 09:12:31.298236][info][2910][OpenNIDevice.cpp:29] OpenNI device created! PID: 0x065c, SN: [PRIVATE]
[1970-01-01 09:12:31.298258][info][2910][DeviceManager.cpp:481] Device created successfully! Name: Gemini E, PID: 0x065c, SN/ID: [PRIVATE]
[1970-01-01 09:12:31.299143][info][2910][OpenNIDevice.cpp:358] Depth sensor has been created!
[1970-01-01 09:12:31.327106][info][2910][OpenNIDevice.cpp:417] Color sensor has been created!
[1970-01-01 09:12:31.330164][info][2910][OpenNIDevice.cpp:464] IR sensor has been created!
[1970-01-01 09:12:31.334694][info][2910][Pipeline.cpp:42] Pipeline created with device: {name: Gemini E, sn: [PRIVATE]}, @0x55A2AF7A00
[1970-01-01 09:12:31.338662][warning][2910][AbstractDevice.cpp:840] Get supported D2C profile is null!
[1970-01-01 09:12:31.338706][info][2910][Pipeline.cpp:236] Try to start streams!
[1970-01-01 09:12:31.338780][info][2910][VideoSensor.cpp:554] start OB_SENSOR_COLOR stream with profile: {type: OB_STREAM_COLOR, format: OB_FORMAT_RGB, width: 640, height: 480, fps: 15}
[1970-01-01 09:12:31.844951][info][2910][Pipeline.cpp:249] Start streams done!
[1970-01-01 09:12:31.844965][info][2910][Pipeline.cpp:232] Pipeline start done!
[1970-01-01 09:12:31.965057][info][2910][OpenNIHostProtocol.cpp:528] Hardware versions: FW=5.8.23 (14), HW=0, Chip=7, Sensor=0, SYS=12
[1970-01-01 09:12:31.974574][error][2910][OpenNIHostProtocol.cpp:545] Get usb core type failed!
[1970-01-01 09:12:32.009740][info][2910][OpenNISensorFirmware.cpp:1020] Sensor serial number:[PRIVATE]
[1970-01-01 09:12:32.014360][info][2910][OpenNISensorFirmware.cpp:1048] Firmware version RD3480
[1970-01-01 09:12:32.017080][info][2910][OpenNISensorFirmware.cpp:1054] Device frequency 31.25
[1970-01-01 09:12:32.401661][info][2910][OpenNISensorFirmware.cpp:145] OpenNI2 camera support Watchdog function.
[1970-01-01 09:12:32.401797][info][2910][OpenNIDevice.cpp:29] OpenNI device created! PID: 0x065c, SN: [PRIVATE]
[1970-01-01 09:12:32.401807][info][2910][DeviceManager.cpp:481] Device created successfully! Name: Gemini E, PID: 0x065c, SN/ID: [PRIVATE]
[1970-01-01 09:12:32.402078][info][2910][OpenNIDevice.cpp:358] Depth sensor has been created!
[1970-01-01 09:12:32.420133][info][2910][OpenNIDevice.cpp:417] Color sensor has been created!
[1970-01-01 09:12:32.427585][info][2910][OpenNIDevice.cpp:464] IR sensor has been created!
[1970-01-01 09:12:32.454293][info][2910][Pipeline.cpp:42] Pipeline created with device: {name: Gemini E, sn: [PRIVATE]}, @0x55A2FB2E90
[1970-01-01 09:12:32.463331][warning][2910][AbstractDevice.cpp:840] Get supported D2C profile is null!
[1970-01-01 09:12:32.463354][info][2910][Pipeline.cpp:236] Try to start streams!
[1970-01-01 09:12:32.463382][info][2910][VideoSensor.cpp:554] start OB_SENSOR_COLOR stream with profile: {type: OB_STREAM_COLOR, format: OB_FORMAT_RGB, width: 640, height: 480, fps: 15}
[1970-01-01 09:12:32.963579][info][2910][Pipeline.cpp:249] Start streams done!
[1970-01-01 09:12:32.963614][info][2910][Pipeline.cpp:232] Pipeline start done!
[1970-01-01 09:12:36.865647][warning][3452][Pipeline.cpp:289] Pipeline source frameset queue fulled, drop the oldest frame!
[1970-01-01 09:12:37.885774][error][3801][OpenNIPacketProcessor.cpp:18] Write buffer overflow!
[1970-01-01 09:12:37.906508][warning][3791][FrameProcessingBlock.cpp:89] Source frameset queue fulled, drop the oldest frame! N11libobsensor15FormatConverterE@0x55a2fa6090
[1970-01-01 09:12:37.962194][warning][3419][FrameProcessingBlock.cpp:89] Source frameset queue fulled, drop the oldest frame! N11libobsensor15FormatConverterE@0x55a2aeaa70
[1970-01-01 09:12:38.789634][error][3426][OpenNIPacketProcessor.cpp:18] Write buffer overflow!
[1970-01-01 09:12:39.159428][error][3801][OpenNIPacketProcessor.cpp:18] Write buffer overflow!
[1970-01-01 09:12:39.917928][warning][3526][Pipeline.cpp:289] Pipeline source frameset queue fulled, drop the oldest frame! [**16 logs in 3052ms**]
[1970-01-01 09:12:40.424876][error][3801][OpenNIPacketProcessor.cpp:18] Write buffer overflow!
[1970-01-01 09:12:40.925246][warning][4311][FrameProcessingBlock.cpp:89] Source frameset queue fulled, drop the oldest frame! N11libobsensor15FormatConverterE@0x55a2fa6090 [**8 logs in 3018ms, last: 09:12:38.205768**]
[1970-01-01 09:12:40.962774][warning][3875][Pipeline.cpp:289] Pipeline source frameset queue fulled, drop the oldest frame!
[1970-01-01 09:12:41.026993][warning][4314][FrameProcessingBlock.cpp:89] Source frameset queue fulled, drop the oldest frame! N11libobsensor15FormatConverterE@0x55a2aeaa70 [**21 logs in 3064ms, last: 09:12:40.853725**]
[1970-01-01 09:12:41.855024][error][3801][OpenNIPacketProcessor.cpp:18] Write buffer overflow!
[1970-01-01 09:12:41.857410][error][3426][OpenNIPacketProcessor.cpp:18] Write buffer overflow!
[1970-01-01 09:12:42.005588][error][3801][OpenNIPacketProcessor.cpp:18] Write buffer overflow!
[1970-01-01 09:12:43.587833][warning][3854][FrameProcessingBlock.cpp:89] Source frameset queue fulled, drop the oldest frame! N11libobsensor15FrameSoftFilterE@0x55a2d88c90
[1970-01-01 09:12:44.087362][warning][4485][Pipeline.cpp:289] Pipeline source frameset queue fulled, drop the oldest frame! [**11 logs in 3124ms, last: 09:12:43.342218**]
[1970-01-01 09:12:44.417056][warning][3451][FrameProcessingBlock.cpp:89] Source frameset queue fulled, drop the oldest frame! N11libobsensor30Disparity2DepthConverterHalleyE@0x55a2ae4af0
[1970-01-01 09:12:45.932297][warning][3526][Pipeline.cpp:289] Pipeline source frameset queue fulled, drop the oldest frame! [**52 logs in 6014ms**]
[1970-01-01 09:12:46.948059][warning][4656][FrameProcessingBlock.cpp:89] Source frameset queue fulled, drop the oldest frame! N11libobsensor15FrameSoftFilterE@0x55a2d88c90 [**19 logs in 3360ms, last: 09:12:46.392565**]
[1970-01-01 09:12:46.975627][warning][3791][FrameProcessingBlock.cpp:89] Source frameset queue fulled, drop the oldest frame! N11libobsensor15FormatConverterE@0x55a2fa6090 [**30 logs in 6050ms**]
[1970-01-01 09:12:47.052407][warning][3419][FrameProcessingBlock.cpp:89] Source frameset queue fulled, drop the oldest frame! N11libobsensor15FormatConverterE@0x55a2aeaa70
[1970-01-01 09:12:47.438268][error][3801][OpenNIPacketProcessor.cpp:18] Write buffer overflow!
[1970-01-01 09:12:47.522890][warning][4829][FrameProcessingBlock.cpp:89] Source frameset queue fulled, drop the oldest frame! N11libobsensor30Disparity2DepthConverterHalleyE@0x55a2ae4af0 [**1 logs in 3105ms, last: 09:12:44.520587**]
[1970-01-01 09:12:48.223398][error][3801][OpenNIPacketProcessor.cpp:18] Write buffer overflow!

We would like to know the meaning of "Write buffer overflow!" in these logs. If possible, we would also like to see the source code of OpenNIPacketProcessor.cpp.

Thanks a lot.

※ Environment

  • Gemini-E firmware : RD3480
  • OrbbecSDK version : v1.6.3

RGB Camera calibration missing when not using depth camera

Hey, sorry to bother you again. I noticed a bug where the RGB camera calibration is not available if I don't also start the depth camera. Tested on Femto Mega and Astra 2. I am using the latest 1.8.x branch, Ubuntu 20.04 LTS.

To reproduce, add these lines to your ColorViewer.cpp example (distortion coeffs were also missing):

const OBCameraParam &calibration = pipe.getCameraParam();
std::cout << "fx: " << calibration.rgbIntrinsic.fx << std::endl;
std::cout << "fy: " << calibration.rgbIntrinsic.fy << std::endl;
std::cout << "cx: " << calibration.rgbIntrinsic.cx << std::endl;
std::cout << "cy: " << calibration.rgbIntrinsic.cy << std::endl;
std::cout << "width: " << calibration.rgbIntrinsic.width << std::endl;
std::cout << "height:" << calibration.rgbIntrinsic.height << std::endl;

OrbbecViewer cannot find Dabai DCW on Ubuntu

Platform: ubuntu 22.04
Device: Dabai DCW

When running Orbbec Viewer, it cannot find my device, even though lsusb correctly lists both the color camera and the depth sensor. Below is the terminal output and log info from OrbbecSDK.log.txt.

(screenshots of the terminal output and log attached)

How to know Gemini E parameters?

I want to know the intrinsic parameters of a Gemini E on Ubuntu 22.04.
I tried two ways to get them.

First, I tried to use the API, as in the code below.

int main(int, char **) {
    ob::Pipeline pipe;
    auto device          = pipe.getDevice();
    auto cameraParamList = device->getCalibrationCameraParamList();
    for(int i = 0; i < cameraParamList->count(); i++) {
        OBCameraParam obParam = cameraParamList->getCameraParam(i);
        std::cout << "fx:" << obParam.depthIntrinsic.fx << " fy:" << obParam.depthIntrinsic.fy
                  << " cx:" << obParam.depthIntrinsic.cx << " cy:" << obParam.depthIntrinsic.cy
                  << " width:" << obParam.depthIntrinsic.width << " height:" << obParam.depthIntrinsic.height << std::endl;
    }
}

I got no output, and I noticed that cameraParamList->count() is 0.

Second, I ran "OBCommonUsages" from the SDK and made it print the parameters.
It reports all parameters as 0, as below.

Input command: p
depthIntrinsic fx:0, fy0, cx:0, cy:0 ,width:0, height:0
rgbIntrinsic fx:0, fy0, cx:0, cy:0, width:0, height:0
depthDistortion k1:0, k2:0, k3:0, k4:0, k5:0, k6:0, p1:0, p2:0
rgbDistortion k1:0, k2:0, k3:0, k4:0, k5:0, k6:0, p1:0, p2:0
transform-rot: [0, 0, 0, 0, 0, 0, 0, 0, 0]
transform-trans: [ 0, 0, 0]

How can I get the intrinsic parameters of my Gemini E?
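Not confirmed for the Gemini E specifically, but elsewhere in this tracker the parameters are read from a started pipeline via `pipe.getCameraParam()` rather than from `getCalibrationCameraParamList()`. A sketch worth trying (an assumption that the parameters are only populated once a stream is running, as in the ColorViewer-style snippets):

```cpp
#include <iostream>
#include <memory>
#include <libobsensor/ObSensor.hpp>

int main() {
    ob::Pipeline pipe;

    // Enable a depth stream with its default profile, then start the pipeline.
    std::shared_ptr<ob::Config> config = std::make_shared<ob::Config>();
    config->enableStream(pipe.getStreamProfileList(OB_SENSOR_DEPTH)->getProfile(0));
    pipe.start(config);  // parameters may only be populated once streams are running

    // Query the camera parameters from the running pipeline.
    OBCameraParam param = pipe.getCameraParam();
    std::cout << "depth fx: " << param.depthIntrinsic.fx
              << " fy: " << param.depthIntrinsic.fy
              << " cx: " << param.depthIntrinsic.cx
              << " cy: " << param.depthIntrinsic.cy << std::endl;

    pipe.stop();
    return 0;
}
```

This requires a connected device, so it is only a starting point, not a confirmed fix for the zeroed parameters.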

Orbbec SDK hanging on std::exit

Having a lot of fun with the SDK with the Femto Mega.
One issue I can't seem to resolve: when my app closes, it hangs no matter how I close it.

Is there a rule about which objects need to be destroyed first?
Or which objects you have to hold onto (as shared_ptrs) and which you can use just for opening the device?

I should note that my app works great until I try to close it, or if I manually call std::exit(0).
I have tried a lot of different combos, but can't get anything working that isn't a single-main() style app.

Below is my app ( simplified ), it also uses GLFW for windowing and setup and update are all on the main thread.

//--------------------------------------------------------------
void ofApp::setup(){

    std::string ip = "192.168.50.79";

    ctx = make_shared<ob::Context>(); 
    auto device = ctx->createNetDevice(ip.c_str(), 8090); //note: also tried holding this as a shared_ptr in ofApp.h 

    // pass in device to create pipeline
    mPipe = std::make_shared<ob::Pipeline>(device);

    // Create Config for configuring Pipeline work
    std::shared_ptr<ob::Config> config = std::make_shared<ob::Config>();

    // Get the depth camera configuration list
    auto depthProfileList = mPipe->getStreamProfileList(OB_SENSOR_DEPTH);
    // use default configuration
    mDepthProfile = depthProfileList->getProfile(0);
    
    // enable depth stream
    config->enableStream(mDepthProfile);
    config->setAlignMode(ALIGN_DISABLE);

    // Pass in the configuration and start the pipeline
    mPipe->start(config);
    
    auto cameraParam = mPipe->getCameraParam();
    pointCloud.setCameraParam(cameraParam);

}

//--------------------------------------------------------------
void ofApp::update(){

    if( mPipe ){
        auto frameSet = mPipe->waitForFrames(100);
        if(frameSet) {
            auto depthFrame = frameSet->getFrame(OB_FRAME_DEPTH);
            if(depthFrame) {
                auto pix = processFrame(depthFrame);
                outputTex.loadData(pix);

                pointCloud.setCreatePointFormat(OB_FORMAT_POINT);
                std::shared_ptr<ob::Frame> frame = pointCloud.process(frameSet); 
                mPointCloudMesh = pointCloudToMesh(frame);
            }
        }
    }

    if( ofGetKeyPressed('c') ){
         if( mPipe ){
            mPipe->stop();
            mPipe.reset();
            std::exit(0); //<------ HANGS. If commented out stream stops without issue. 
        }
    }
}
#include "ofMain.h"

#include "libobsensor/ObSensor.hpp"
#include "libobsensor/hpp/Error.hpp"
#include <opencv2/opencv.hpp>

class ofApp : public ofBaseApp{

	public:
		void setup();
		void update();
		void draw();
		void exit() override;

		ofPixels processFrame(shared_ptr<ob::Frame> frame);

		void keyPressed(int key);
		void keyReleased(int key);
		void mouseMoved(int x, int y );
		void mouseDragged(int x, int y, int button);
		void mousePressed(int x, int y, int button);
		void mouseReleased(int x, int y, int button);
		void mouseEntered(int x, int y);
		void mouseExited(int x, int y);
		void windowResized(int w, int h);
		void dragEvent(ofDragInfo dragInfo);
		void gotMessage(ofMessage msg);

		std::shared_ptr<ob::Context> ctx;
		//std::shared_ptr<ob::Device> mDevice;
		std::shared_ptr <ob::Pipeline> mPipe;
		std::shared_ptr<ob::StreamProfile> mDepthProfile; 
   		ob::PointCloudFilter pointCloud;

		ofMesh mPointCloudMesh;
		ofTexture outputTex;
		ofEasyCam mCam;
};

Femto Mega, multi device HW sync, auto timestamp correction of subordinate devices is not happening.

I noticed that timestamps drift apart in a multi-device configuration with sync cables: one primary and 1-8 subordinates. I had to detect myself when they drift beyond the point where they can no longer be corrected to match the primary + trigger delay without risking jumping to the timestamp of the next or previous frame.

The Azure Kinect automatically adjusts the timestamps of subordinates to match the timestamps of the primary, with high accuracy (< 60 microseconds).

Can I expect this to be fixed in the upcoming firmware for Femto Mega, or is this something I have to handle myself?

Femto Bolt timestamp reset occasionally not working with no errors

I just ran into an issue when using 3x Femto Bolt cameras where two of them stopped responding to requests to reset the timestamps.

Power cycling the sensors resolved the issue, but it was quite confusing: the SDK reported no errors, yet the timestamps were never reset.

This resulted in the entire system being unable to be synchronized, as the timestamps were so far apart.

Femto Bolt is not running at exactly 30.0 FPS

Hello,
I tried 4 different Femto Bolts and they have different FPS. We’re using the latest firmware available: 1.0.9.

FPS_CAM1: 30.14
FPS_CAM2: 30.028
FPS_CAM3: 30.063
FPS_CAM4: 30.099
We have measured these values via the ‘Orbbec Viewer’ software, looking at FrameIndex and Device Timestamp at two different moments, T1 and T2:
T1: 30 seconds after the Color and Depth channels were opened.
T2: 240 seconds after the Color and Depth channels were opened.
FPS = (FrameIndex@T2 - FrameIndex@T1) / (DeviceTS@T2 - DeviceTS@T1)

I also compared the values of DeviceTimestamp against different stopwatches for a long time and it is very precise; the problem is that I receive a different number of frames than expected.

We want to replace Microsoft K4Azure with Femto Bolt in all our manufacturing lines but we can’t if the camera doesn’t have a precise FPS of 30.00, because we synchronize it with other sensors and after 1 minute it goes out of sync.

Is there a way to get a precise FPS, at least between 29.999 and 30.001?

Thanks

Femto Mega timestamps synchronization when working as POE devices

I have a setup of 2 Femto Mega devices and I use a syncbox to synchronize them. When I am using the USB interface, the hardware timestamps seem to be updated, despite the fact that the hardware timestamps don't seem to match:
usb_hw_sync_confirm

but when I am using the PoE interface: a) hardware timestamps are not synced to the host PC, and b) system timestamps are also off
poe_sync

(I am moving a retro-reflective object in the scene at high speed, this is what you see in the pictures)

You can find the code here, which is a modified version of the MultiDeviceSync example to support PoE

Point cloud flipped in Y with 1.6.x -> 1.7.5 SDK

Sorry to open another issue; this might just be an intentional change (as I only see this in the SDK, not the Orbbec Viewer).

But with identical code that displays the point cloud, switching from 1.6.x to 1.7.5 I see the point cloud inverted in Y.

Screenshot from app using 1.6.x SDK:
Screenshot from 2023-09-27 08-43-32

Screenshot from the same app using 1.7.5. SDK:
Screenshot from 2023-09-27 08-44-10

( note: the point cloud is in white, the gray image is the depth image just as reference )

Feel free to close if this was an intentional change.
Just wanted to post it in case it was a regression.

H264 / H265 decoding example?

I've been trying to decode the H264 color video stream from the Femto Mega using libavcodec and despite following multiple examples for H264 -> RGB, I'm not getting clean frames coming in.

Is it possible to share an example that does the decoding or even post a code snippet of how you are handling the H264 decoding in the Orbbec Viewer?

It would be immensely helpful.
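Not an official Orbbec example, but the usual libavcodec H.264 → RGB path looks roughly like the sketch below (assuming each color frame delivered by the SDK is a complete Annex-B access unit; `frameData`/`frameSize` are hypothetical stand-ins for the SDK frame's data pointer and size):

```cpp
extern "C" {
#include <libavcodec/avcodec.h>
#include <libswscale/swscale.h>
}
#include <cstdint>
#include <stdexcept>
#include <vector>

// Hypothetical helper, not an SDK or Viewer API: decode one H.264 access unit
// to packed RGB24. Setup (once, elsewhere):
//   const AVCodec *codec = avcodec_find_decoder(AV_CODEC_ID_H264);
//   AVCodecContext *ctx  = avcodec_alloc_context3(codec);
//   avcodec_open2(ctx, codec, nullptr);
std::vector<uint8_t> decodeH264ToRgb(AVCodecContext *ctx, SwsContext *&sws,
                                     const uint8_t *frameData, int frameSize) {
    AVPacket *pkt = av_packet_alloc();
    pkt->data     = const_cast<uint8_t *>(frameData);
    pkt->size     = frameSize;
    int err       = avcodec_send_packet(ctx, pkt);
    av_packet_free(&pkt);
    if(err < 0)
        throw std::runtime_error("avcodec_send_packet failed");

    AVFrame             *frm = av_frame_alloc();
    std::vector<uint8_t> rgb;
    if(avcodec_receive_frame(ctx, frm) == 0) {  // decoder may need more input first
        sws = sws_getCachedContext(sws, frm->width, frm->height, (AVPixelFormat)frm->format,
                                   frm->width, frm->height, AV_PIX_FMT_RGB24, SWS_BILINEAR,
                                   nullptr, nullptr, nullptr);
        rgb.resize(size_t(frm->width) * frm->height * 3);
        uint8_t *dst[1]       = { rgb.data() };
        int      dstStride[1] = { frm->width * 3 };
        sws_scale(sws, frm->data, frm->linesize, 0, frm->height, dst, dstStride);
    }
    av_frame_free(&frm);
    return rgb;  // empty until the decoder has produced its first frame
}
```

If frames are still not clean, a likely culprit is partial access units: in that case the data should first go through `av_parser_parse2` before being sent to the decoder.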

Undistorted point cloud from wide FoV

I’ve been working on software that performs blob detection using the point clouds from Kinect, and I’m updating it to work with the latest Orbbec cameras (Femto Bolt and Femto Mega).

I started to get the point cloud from the wide field of view (512 x 512) and when I do, the point cloud is distorted and not properly aligned with real world coordinates.
(A in the image)

To remedy that, Orbbec has a mode to align the depth points with the color camera view (D2C). However, the points that are not within the camera view are discarded, rendering the use of the wide FoV texture useless because a big part of the data is lost.
(B in the image)

I was wondering if there's a way to undistort the point cloud based on world coordinates, like the Azure Kinect SDK had (C in the image), and not based on the RGB camera view? Should I just use the K4A wrapper instead of the Orbbec SDK?

distort

Thank you!

Release Firmware in parallel with SDK releases / where is 1.7.4 compatible firmware?

Currently the only Linux SDK download link on the main Orbbec site is for 1.7.4.

The Femto Mega 1.1.5 firmware does not work with this SDK version (neither the Orbbec Viewer nor the NetDevice.cpp example works with a 1.1.5 firmware camera over a network connection).

I had to clone this repo ( @ 1.6.x ) and use the prebuilt 1.6 Orbbec Viewer to get the NetDevice / Network connection working.
See the discussion here: https://3dclub.orbbec3d.com/t/femto-mega-network/3772/8

In general it would be super helpful if you could archive SDK versions and firmware versions either here or on the main site.
Once 1.7.4 was released, the 1.6 links disappeared from the main site.

Also do we have an ETA on firmware for Femto Mega that is compatible with 1.7.4 SDK?

Thanks so much!

depthengine.so name collision with Kinect Azure

Is there a way to use Orbbec and Azure Kinect at the same time? The libdepthengine.so.2.0 file is opened by both SDKs, and the file is not the same: Orbbec malfunctions if I use the k4a version of libdepthengine.so.2.0, so it seems I have to choose which one I want to use and use only the corresponding shared object file.

Is there a best practice for using both devices in the same program at the same time?

Femto Mega duplicate IMU timestamps

I noticed a bug where Femto Mega returns duplicate timestamps for accel & gyro even if I use 200Hz mode. Ubuntu 20.04, SDK 1.8.3, firmware 1.2.7.

I also tried on Astra2, but did not get duplicate timestamps even in 1KHz mode. Also, it would be great if you could fix the issue where IMU timestamps have 1ms accuracy; this prevents us from using the 2KHz IMU mode.

Here is a modified version of your example ImuReader.cpp:

#include <iostream>
#include <mutex>
#include <libobsensor/ObSensor.hpp>
#include "utils.hpp"
#include <map>

#define ESC 27
std::mutex printerMutex;
int        main(int argc, char **argv) try {
    // Print the SDK version number, the SDK version number is divided into major version number, minor version number and revision number
    std::cout << "SDK version: " << ob::Version::getMajor() << "." << ob::Version::getMinor() << "." << ob::Version::getPatch() << std::endl;

    // Create a Context.
    ob::Context ctx;

    // Query the list of connected devices
    auto devList = ctx.queryDeviceList();

    if(devList->deviceCount() == 0) {
        std::cerr << "Device not found!" << std::endl;
        return -1;
    }

    // Create a device, 0 represents the index of the first device
    auto                        dev         = devList->getDevice(0);
    std::shared_ptr<ob::Sensor> gyroSensor  = nullptr;
    std::map<uint64_t, uint64_t> gyroSamples; // <timestamp, index>
    try {
        // Get Gyroscope Sensor
        gyroSensor = dev->getSensorList()->getSensor(OB_SENSOR_GYRO);
        if(gyroSensor) {
            // Get configuration list
            auto profiles = gyroSensor->getStreamProfileList();
            // Select the first profile to open stream
            auto profile = profiles->getGyroStreamProfile(OBGyroFullScaleRange::OB_GYRO_FS_1000dps, OBGyroSampleRate::OB_SAMPLE_RATE_200_HZ);

            gyroSensor->start(profile, [&](std::shared_ptr<ob::Frame> frame) {
                std::unique_lock<std::mutex> lk(printerMutex);
                auto                         timeStamp = frame->timeStampUs();
                auto                         index     = frame->index();
                auto                         gyroFrame = frame->as<ob::GyroFrame>();

                if(gyroFrame) {
                    auto it = gyroSamples.find(timeStamp);
                    if (it != gyroSamples.end()) {
                        std::cout << "Duplicate gyro sample! index0=" << std::to_string(it->second) << ", index1=" << std::to_string(index) << ", timestamp=" << std::to_string(timeStamp) << std::endl;
                    } else {
                        gyroSamples.insert(std::make_pair(timeStamp, index));
                    }
                }
            });
        }
        else {
            std::cout << "get gyro Sensor failed ! " << std::endl;
        }
    }
    catch(ob::Error &e) {
        std::cerr << "current device is not support imu!" << std::endl;
        exit(EXIT_FAILURE);
    }

    std::cout << "Press ESC to exit! " << std::endl;

    while(true) {
        // Get the value of the pressed key, if it is the ESC key, exit the program
        int key = getch();
        if(key == ESC)
            break;
    }

    // turn off the flow
    if(gyroSensor) {
        gyroSensor->stop();
    }
    return 0;
}
catch(ob::Error &e) {
    std::cerr << "function:" << e.getName() << "\nargs:" << e.getArgs() << "\nmessage:" << e.getMessage() << "\ntype:" << e.getExceptionType() << std::endl;
    exit(EXIT_FAILURE);
}

This gives me warning like:
Duplicate gyro sample! index0=163, index1=164, timestamp=25591000

Multiple camera extrinsics calibration

Hi, we recently purchased 4 Femto Bolts; we calibrated their extrinsics and merged the point clouds from each camera. We successfully merged the point clouds for the calibration board. However, the cup on the table is quite off between different camera views. It seems that the absolute depth value is not accurate. Could you help with this problem?

(screenshot attached)

Femto Bolt always using passive IR

We have users running the 1.8.1 SDK with both the Kinect wrapper and the native SDK. With the Femto Mega cameras, the IR and depth cameras work properly and automatically start up in 'positive' IR mode with the red light on the front. However, users with the Femto Bolt camera always seem to start in passive mode and we haven't currently implemented config options to turn this on manually. They can use the positive mode in the OrbbecViewer app.

Is this an expected behaviour or has this been fixed in the 1.8.3 SDK? We have been following the InfraredViewer example that doesn't make any mention of the passive/positive mode.

Orbbec Viewer Crash if another Camera is in Network

The Orbbec Viewer finds other cameras in the network and then the program crashes. This happens when an Allied Vision Technologies Manta camera is on the network. Is there any workaround? It doesn't work even if I deactivate the EnumerateNetDevice setting.

Exception:

[03/18 11:53:31.341881][debug][10708][Context.cpp:31] Context creating. work_dir=C:\send\OrbbecViewer_v1.9.5_202403051246_win_x64_release
[03/18 11:53:31.341947][debug][10708][Context.cpp:50] Config file version=1.1
[03/18 11:53:31.341974][debug][10708][FrameBufferManager.cpp:23] Max global frame buffer size updated! size=2048.000MB
[03/18 11:53:31.341985][info][10708][Context.cpp:69] Context created with config: C:\send\OrbbecViewer_v1.9.5_202403051246_win_x64_release\OrbbecSDKConfig_v1.0.xml
[03/18 11:53:31.342030][info][10708][Context.cpp:74] Context work_dir=C:\send\OrbbecViewer_v1.9.5_202403051246_win_x64_release
[03/18 11:53:31.342229][info][10708][Context.cpp:77] - SDK version: 1.9.5
[03/18 11:53:31.342256][info][10708][Context.cpp:78] - SDK stage version: main
[03/18 11:53:31.342287][info][10708][Context.cpp:82] get config EnumerateNetDevice:false
[03/18 11:53:31.342311][debug][10708][DeviceManager.cpp:30] DeviceManager init ...
[03/18 11:53:31.342318][info][10708][MfPal.cpp:102] createObPal: create WinPal!
[03/18 11:53:31.342342][debug][10708][MfPal.cpp:107] WmfPal init ...
[03/18 11:53:31.358040][debug][10708][MfPal.cpp:114] WmfPal created!
[03/18 11:53:31.358131][debug][10708][DeviceManager.cpp:34] Enable USB Device Enumerator ...
[03/18 11:53:31.368784][debug][10708][EnumeratorLibusb.cpp:323] queryDevicesInfo done!
[03/18 11:53:31.368866][debug][10708][MfPal.cpp:166] Create WinEventDeviceWatcher!
[03/18 11:53:31.368897][debug][10708][UsbDeviceEnumerator.cpp:76] No matched usb device found!
[03/18 11:53:31.368918][info][10708][DeviceManager.cpp:15] Current found device(s): (0)
[03/18 11:53:31.369006][debug][10708][DeviceManager.cpp:55] DeviceManager construct done!
[03/18 11:53:31.369067][info][10708][DeviceManager.cpp:310] Enable net device enumeration: true
[03/18 11:53:31.377376][info][10708][GVCPClient.cpp:223] bind 192.168.3.81:0
[03/18 11:53:31.377494][info][10708][GVCPClient.cpp:223] bind 192.168.5.80:0
[03/18 11:53:31.377545][info][10708][GVCPClient.cpp:223] bind 192.168.3.218:0
[03/18 11:53:31.398353][info][2032][GVCPClient.cpp:310] 0, 3, 248, 1
[03/18 11:53:31.398462][info][2032][GVCPClient.cpp:352] 2,1, [MAC-Address],7, 5,0, 192.168.5.197, 255.255.255.0, 0.0.0.0, Allied Vision Technologies, Manta_G-046C (E0020005), 00.01.44.31238 , Manta_G-046C|E0020005|, 50-0503344713, 9672e6ee-9f79
[03/18 11:53:32.419788][info][10708][GVCPClient.cpp:76] - mac:[MAC-Address], ip:192.168.5.197, sn:50-0503344713, pid:0x0000
[03/18 11:53:34.413202][info][5040][GVCPClient.cpp:352] 2,1, :[MAC-Address],7, 5,0, 192.168.5.196, 255.255.255.0, 0.0.0.0, Allied Vision Technologies, Manta_G-046C (E0020005), 00.01.44.31238 , Manta_G-046C|E0020005|, 50-0503345719, d136b4db-ed5e [5 logs in 3014ms, last: 11:53:31.416067]
[03/18 11:53:34.413203][info][5008][GVCPClient.cpp:310] 0, 3, 248, 1 [5 logs in 3014ms, last: 11:53:31.416063]
[03/18 11:53:34.428283][warning][10708][ObException.cpp:5] VendorTCPClient: Connect to server failed! addr=192.168.5.197, port=8090, err=socket is not ready & timeout
[03/18 11:54:49.813995][debug][2216][Context.cpp:31] Context creating. work_dir=C:\send\OrbbecViewer_v1.9.5_202403051246_win_x64_release
[03/18 11:54:49.814160][debug][2216][Context.cpp:50] Config file version=1.1
[03/18 11:54:49.814193][debug][2216][FrameBufferManager.cpp:23] Max global frame buffer size updated! size=2048.000MB
[03/18 11:54:49.814204][info][2216][Context.cpp:69] Context created with config: C:\send\OrbbecViewer_v1.9.5_202403051246_win_x64_release\OrbbecSDKConfig_v1.0.xml
[03/18 11:54:49.814290][info][2216][Context.cpp:74] Context work_dir=C:\send\OrbbecViewer_v1.9.5_202403051246_win_x64_release
[03/18 11:54:49.814791][info][2216][Context.cpp:77] - SDK version: 1.9.5
[03/18 11:54:49.814872][info][2216][Context.cpp:78] - SDK stage version: main
[03/18 11:54:49.814973][info][2216][Context.cpp:82] get config EnumerateNetDevice:false
[03/18 11:54:49.815156][debug][2216][DeviceManager.cpp:30] DeviceManager init ...
[03/18 11:54:49.815171][info][2216][MfPal.cpp:102] createObPal: create WinPal!
[03/18 11:54:49.815246][debug][2216][MfPal.cpp:107] WmfPal init ...
[03/18 11:54:49.830331][debug][2216][MfPal.cpp:114] WmfPal created!
[03/18 11:54:49.830365][debug][2216][DeviceManager.cpp:34] Enable USB Device Enumerator ...
[03/18 11:54:49.839922][debug][2216][EnumeratorLibusb.cpp:323] queryDevicesInfo done!
[03/18 11:54:49.839978][debug][2216][MfPal.cpp:166] Create WinEventDeviceWatcher!
[03/18 11:54:49.840003][debug][2216][UsbDeviceEnumerator.cpp:76] No matched usb device found!
[03/18 11:54:49.840020][info][2216][DeviceManager.cpp:15] Current found device(s): (0)
[03/18 11:54:49.840062][debug][2216][DeviceManager.cpp:55] DeviceManager construct done!
[03/18 11:54:49.840084][info][2216][DeviceManager.cpp:310] Enable net device enumeration: true
[03/18 11:54:49.846335][info][2216][GVCPClient.cpp:223] bind 192.168.3.81:0
[03/18 11:54:49.846461][info][2216][GVCPClient.cpp:223] bind 192.168.5.80:0
[03/18 11:54:49.846515][info][2216][GVCPClient.cpp:223] bind 192.168.3.218:0
[03/18 11:54:49.867365][info][3012][GVCPClient.cpp:310] 0, 3, 248, 1
[03/18 11:54:49.867533][info][3012][GVCPClient.cpp:352] 2,1, [MAC-Address],7, 5,0, 192.168.5.197, 255.255.255.0, 0.0.0.0, Allied Vision Technologies, Manta_G-046C (E0020005), 00.01.44.31238 , Manta_G-046C|E0020005|
[03/18 11:54:50.897306][info][2216][GVCPClient.cpp:76] - mac:[Mac-Address], ip:192.168.5.197, sn:[sn], pid:0x0000
[03/18 11:54:52.868582][info][9908][GVCPClient.cpp:310] 0, 3, 248, 1 [5 logs in 3001ms, last: 11:54:49.886711]
[03/18 11:54:52.868576][info][2964][GVCPClient.cpp:352] 2,1, [Mac-Address],7, 5,0, 192.168.5.196, 255.255.255.0, 0.0.0.0, Allied Vision Technologies, Manta_G-046C (E0020005), 00.01.44.31238 , Manta_G-046C|E0020005|, [5 logs in 3000ms, last: 11:54:49.886715]
[03/18 11:54:52.898593][warning][2216][ObException.cpp:5] VendorTCPClient: Connect to server failed! addr=192.168.5.197, port=8090, err=socket is not ready & timeout

Device over network documentation/support?

The femto devices are marketed as having ethernet support as well as uvc, but I'm struggling to find any information on it;

  • Which platforms does the sdk support for network access?
  • What format is the data flying over the network?
  • Is macOS only unsupported historically, or also going forward? And does that limitation apply specifically to UVC?
  • Is the network data platform agnostic?

I'm trying to establish whether any of your devices can replace my current setup of a Kinect plus a Jetson streaming depth data over the network to a platform-agnostic endpoint, but your documentation/tutorials/readme are light on the gritty details :)

After upgrading the firmware of several Femto Bolt sensors, two of them reported having the same serial number

I performed a firmware upgrade (v1.0.6 -> v1.0.9) of three Orbbec Femto Bolt sensors, with all 3 connected to the system at the same time. I ran into the issue #42 while doing this, but I was still able to push the button, even though it was visually obscured.

After finishing the upgrade on all three sensors, I noticed that two of the sensors appeared differently in the device dropdown. One was identified by its S/N, while the other two were identified by USB IDs. Unfortunately I do not have a screenshot of this.

I tried disconnecting them from the host computer's USB ports and reconnecting them, but this did not resolve the issue.

I finally tried power cycling the sensors (disconnecting both USB and power), and upon reconnecting them the issue was resolved.

color_view

I ran the colorview in your examples and it works properly. Then I copied the code from colorview.cpp into my workspace and configured the CMakeLists. It also compiled without any errors, but at runtime I get:

[01/02 22:08:50.492530] [info] [10809] [OpenNIDevice.cpp:424] Color sensor has been created!

[01/02 22:08:50.597351] [warning] [10809] [Pipeline.cpp:327] Wait for frame timeout, you can try to increase the wait time! Current timeout=100

[01/02 22:08:53.601348] [warning] [10809] [Pipeline.cpp:327] Wait for frame timeout, you can try to increase the wait time! Current timeout=100 [**30 logs in 3003ms**]

This error is reported over and over. Do you know why? I am using an arm32v7l Raspberry Pi device.

The main issue is that your colorview example runs normally, but when I copy the code into my own workspace, compile it, and run the resulting binary, I get the errors above.
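Since the warning itself says the 100 ms wait is too short, one thing worth trying (a minimal sketch, assuming the standard Orbbec C++ pipeline API; slower boards such as a Raspberry Pi can take longer to deliver the first frames) is to raise the timeout passed to `waitForFrames`:

```cpp
#include <libobsensor/ObSensor.hpp>
#include <iostream>

int main() {
    ob::Pipeline pipe;
    pipe.start();  // start streaming with the default config

    // The example waits 100 ms per frame; wait up to 1000 ms instead.
    auto frameSet = pipe.waitForFrames(1000);
    if(frameSet == nullptr) {
        std::cerr << "Still timed out after 1000 ms" << std::endl;
    }
    else {
        std::cout << "Got a frame set" << std::endl;
    }

    pipe.stop();
    return 0;
}
```

If the timeouts persist even with a generous wait, the difference is more likely in the build configuration (e.g. the CMakeLists not linking the same SDK libraries or not copying the SDK's config/extension files next to your binary) than in the code itself.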

RGB image exposure time and gain

Hey,

It would be very useful for us to have access to image exposure time and gain values. Currently, we can try to query the exposure time and gain from the device like this:

device->getIntProperty(OB_PROP_COLOR_EXPOSURE_INT)
device->getIntProperty(OB_PROP_COLOR_GAIN_INT)

and these work when we are not using auto exposure. (One issue is that the exposure time unit changes between devices; Astra2 uses microseconds, Femto Mega uses 0.1ms).

However, with AE enabled, these values are not updated at all, so we cannot get even rough values for exposure time and gain. Ideally, we would like to get per-image values; i.e., could you consider adding these outputs to OrbbecSDK when you implement this support in the Orbbec K4A Wrapper? k4a_image_get_exposure_usec, k4a_image_get_iso_speed
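For reference, the manual-exposure case described above can be sketched like this (assuming the standard Orbbec C++ device API; the property IDs are the ones quoted in the report):

```cpp
#include <libobsensor/ObSensor.hpp>
#include <iostream>

int main() {
    ob::Context ctx;
    auto devList = ctx.queryDeviceList();
    if(devList->deviceCount() == 0) {
        std::cerr << "No device found" << std::endl;
        return 1;
    }
    auto device = devList->getDevice(0);

    // Disable auto exposure first; with AE enabled the reported
    // values are stale, which is exactly the issue described above.
    device->setBoolProperty(OB_PROP_COLOR_AUTO_EXPOSURE_BOOL, false);

    int exposure = device->getIntProperty(OB_PROP_COLOR_EXPOSURE_INT);
    int gain     = device->getIntProperty(OB_PROP_COLOR_GAIN_INT);

    // NOTE: the exposure unit is device-dependent
    // (Astra 2: microseconds, Femto Mega: 0.1 ms).
    std::cout << "exposure=" << exposure << " gain=" << gain << std::endl;
    return 0;
}
```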

Obtain extrinsic parameters

I'm using Gemini 2 for capturing RGB-D images with Orbbec SDK, but I want to obtain the camera pose directly from the extrinsic parameters without using COLMAP.

How to achieve that?
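One way to read the extrinsics directly from the SDK (a sketch, assuming the C++ pipeline API and its `OBCameraParam` struct, whose `transform` member holds the depth-to-color rotation and translation) is:

```cpp
#include <libobsensor/ObSensor.hpp>
#include <iostream>

int main() {
    ob::Pipeline pipe;
    pipe.start();  // calibration matches the currently streaming profiles

    // 'transform' is the depth-to-color extrinsic: a 3x3 row-major
    // rotation matrix plus a translation vector (in millimeters).
    OBCameraParam param = pipe.getCameraParam();
    const OBD2CTransform &ext = param.transform;

    for(int r = 0; r < 3; r++) {
        for(int c = 0; c < 3; c++) {
            std::cout << ext.rot[r * 3 + c] << " ";
        }
        std::cout << std::endl;
    }
    std::cout << "t = [" << ext.trans[0] << ", " << ext.trans[1] << ", "
              << ext.trans[2] << "]" << std::endl;

    pipe.stop();
    return 0;
}
```

This gives the fixed rigid transform between the two sensors on the device; it is not a world-space camera pose, so if you need poses per captured view you would still have to estimate them from the data (which is what COLMAP does).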

Under which license does libOrbbecSDK.so use libudev?

Hi,

I am trying to figure out the license notices that would be required when releasing a Linux application that redistributes, among other libraries, libOrbbecSDK.so. Going through the libraries, I noticed that libOrbbecSDK.so seems to link against libudev.so.1, if I interpret this correctly:

$ readelf -d libOrbbecSDK.so

Dynamic section at offset 0x8fcb78 contains 35 entries:
  Tag        Type                         Name/Value
 0x0000000000000001 (NEEDED)             Shared library: [libpthread.so.0]
 0x0000000000000001 (NEEDED)             Shared library: [librt.so.1]
 0x0000000000000001 (NEEDED)             Shared library: [libdl.so.2]
 0x0000000000000001 (NEEDED)             Shared library: [libudev.so.1]
 0x0000000000000001 (NEEDED)             Shared library: [libstdc++.so.6]
 0x0000000000000001 (NEEDED)             Shared library: [libm.so.6]
 0x0000000000000001 (NEEDED)             Shared library: [libgcc_s.so.1]
 0x0000000000000001 (NEEDED)             Shared library: [libc.so.6]
 0x0000000000000001 (NEEDED)             Shared library: [ld-linux-x86-64.so.2]
 [...]

libudev seems to be a part of systemd. Looking at the systemd licenses Readme, it says:

some udev sources under src/udev/ are licensed under GPL-2.0-or-later, so the udev binaries as a whole are also distributed under GPL-2.0-or-later.

Looking at the libudev1 package in Ubuntu 22.04, its copyright file also says that some GPL-only source files are used by that package.

Thus, it seems to me that libudev.so.1 is a GPL-licensed library. As far as I can tell, any program that uses a GPL-licensed library has to be under the GPL as a whole, even if linking to it dynamically (question 1, question 2).

If that were true, then I think that the SDK, depth engine, and programs using the SDK would all have to be licensed under the GPL with their full source code available, which I don't think is the case, so I am clearly missing something here. Would it thus be possible to clarify under which license libudev is used by the SDK? Thank you in advance.

Gain value not applied if set before starting color stream

Ubuntu 20.04 LTS
SDK 1.9.3 (Also 1.8.3)
Femto Mega, FW 1.2.7, EDIT: also tested with 1.2.8
Astra 2, FW 2.8.20

We try to record videos with AE disabled and set a fixed exposure time and gain, but there is a bug where device->setIntProperty(OB_PROP_COLOR_GAIN_INT, value) is not applied if it is set before starting the color camera for the first time. All other properties we've tested (exposure time, brightness, white balance) do work.

This is also visible in the OrbbecViewer:
https://github.com/orbbec/OrbbecSDK/assets/46484036/03a5544e-3847-4265-95a5-eff906943ca1

As a workaround, we can start the color stream once and then turn it off, and then gain will be applied correctly in the next recording (but only if using a different gain value).
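The workaround above can be sketched as follows (assuming the standard Orbbec C++ pipeline API; the prime start/stop cycle and the constraint that the gain value must differ from the previous one are taken from the report):

```cpp
#include <libobsensor/ObSensor.hpp>

int main() {
    ob::Pipeline pipe;
    auto device = pipe.getDevice();

    // Workaround: prime the color stream once so that a gain value set
    // afterwards is actually applied by the firmware.
    pipe.start();
    pipe.stop();

    device->setBoolProperty(OB_PROP_COLOR_AUTO_EXPOSURE_BOOL, false);
    // Per the report, the gain now takes effect, but only if the value
    // differs from the one previously in use.
    device->setIntProperty(OB_PROP_COLOR_GAIN_INT, 100);

    pipe.start();  // actual recording with the fixed gain
    // ... record ...
    pipe.stop();
    return 0;
}
```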
