
Comments (24)

pjrambo commented on July 1, 2024

The Realsense D435i config has been uploaded. In the realsense_d435i folder there is a file rs_camera.launch, which is the realsense-ros wrapper's launch file. It may be helpful for those using the Realsense D435i.

from vins-fusion.

pjrambo commented on July 1, 2024

According to my test, the D435i is good enough for testing. I used roughly calibrated extrinsic parameters and the result is fine: the RMSE is 0.12 m over a 77.5 m run. I also used it to hover a quadrotor, and it hovers stably.
For the intrinsic parameters, I directly used the values from the camera_info of /camera/infra1/image_rect_raw. For the extrinsic parameters, I used VINS-Mono to estimate them with "estimate_extrinsic" set to 2.
I think you can get a better result if you carefully calibrate the intrinsic and extrinsic parameters yourself.
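
For reference, VINS-Fusion reads per-camera intrinsics in the camodocal yaml format, so the camera_info values get transcribed into left.yaml / right.yaml. A minimal sketch with placeholder numbers rather than real calibration values (the rectified infra streams report zero distortion; fx/fy/cx/cy come from K[0], K[4], K[2], K[5] of camera_info):

%YAML:1.0
model_type: PINHOLE
camera_name: camera
image_width: 640
image_height: 480
distortion_parameters:
   k1: 0.0
   k2: 0.0
   p1: 0.0
   p2: 0.0
projection_parameters:
   fx: 385.0   # placeholder: copy K[0] from camera_info
   fy: 385.0   # placeholder: copy K[4]
   cx: 320.0   # placeholder: copy K[2]
   cy: 240.0   # placeholder: copy K[5]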

Other things to note:
If you are using the Intel Realsense D435i, you need to set the realsense launch parameters like below:
"unite_imu_method" = "linear_interpolation"
"gyro_fps" = "200"
"accel_fps" = "250"
"enable_sync" = "true"
"enable_imu" = "true"

A well-calibrated result may be uploaded later.

from vins-fusion.

SteveMacenski commented on July 1, 2024

They're not; the RGB is rolling shutter, see the manual.
[screenshot of the relevant page of the datasheet]

The RGB camera isn't synchronized to the IMU, but the IR cameras are.

from vins-fusion.

SteveMacenski commented on July 1, 2024

I haven't yet, but it's on my agenda of things to do.

from vins-fusion.

marufino commented on July 1, 2024

I used the calibration files (left.yaml and right.yaml) from that repo and the yaml from this GitHub repo.
I had to modify the body_T_cam matrices in my case, since the imu_optical_frame and the infra_optical_frame appear to be aligned.

realsense_stereo_imu_config.yaml

%YAML:1.0

#common parameters
#support: 1 imu 1 cam; 1 imu 2 cam; 2 cam;
imu: 1
num_of_cam: 2

imu_topic: "/camera/imu"
image0_topic: "/camera/infra1/image_rect_raw"
image1_topic: "/camera/infra2/image_rect_raw"
output_path: "/home/dji/output/"

cam0_calib: "left.yaml"
cam1_calib: "right.yaml"
image_width: 640
image_height: 480
   

# Extrinsic parameter between IMU and Camera.
estimate_extrinsic: 1   # 0  Have accurate extrinsic parameters. We will trust the following imu^R_cam, imu^T_cam; don't change it.
                        # 1  Have an initial guess about extrinsic parameters. We will optimize around your initial guess.

body_T_cam0: !!opencv-matrix
   rows: 4
   cols: 4
   dt: d
   data: [  1,0,0,0.00552,
            0,1,0,-0.0051,
            0,0,1,-0.01174,
            0., 0., 0., 1. ]

body_T_cam1: !!opencv-matrix
   rows: 4
   cols: 4
   dt: d
   data: [ 1,0,0,-0.04464144,
           0,1,0,-0.0051,
           0,0,1,-0.01174,
           0., 0., 0., 1. ]

#Multiple thread support
multiple_thread: 1

#feature tracker parameters
max_cnt: 150            # max feature number in feature tracking
min_dist: 30            # min distance between two features 
freq: 10                # frequency (Hz) of publishing the tracking result. At least 10 Hz for good estimation. If set to 0, the frequency will be the same as the raw image
F_threshold: 1.0        # ransac threshold (pixel)
show_track: 1           # publish tracking image as topic
flow_back: 1            # perform forward and backward optical flow to improve feature tracking accuracy

#optimization parameters
max_solver_time: 0.04  # max solver iteration time (s), to guarantee real time
max_num_iterations: 8   # max solver iterations, to guarantee real time
keyframe_parallax: 10.0 # keyframe selection threshold (pixel)

#imu parameters       The more accurate parameters you provide, the better performance
acc_n: 0.1          # accelerometer measurement noise standard deviation. #0.2   0.04
gyr_n: 0.01         # gyroscope measurement noise standard deviation.     #0.05  0.004
acc_w: 0.001         # accelerometer bias random walk noise standard deviation.  #0.002
gyr_w: 0.0001       # gyroscope bias random walk noise standard deviation.     #4.0e-5
g_norm: 9.805         # gravity magnitude

#unsynchronization parameters
estimate_td: 1                      # online estimate time offset between camera and imu
td: -0.072                             # initial value of time offset. unit: s. read image clock + td = real image clock (IMU clock)

#loop closure parameters
load_previous_pose_graph: 0        # load and reuse previous pose graph; load from 'pose_graph_save_path'
pose_graph_save_path: "/home/dji/output/pose_graph/" # save and load path
save_image: 0                   # save images in the pose graph for visualization purposes; disable by setting 0
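
One way to sanity-check the body_T_cam translations above, assuming you only want the factory values rather than a full Kalibr run, is to dump the calibration stored on the device with the librealsense tool below and compare the motion-module-to-infra extrinsics it prints (output layout differs between librealsense versions):

# prints intrinsics and extrinsics stored on the device, including the motion module
rs-enumerate-devices -c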

from vins-fusion.

SkyPigZhu commented on July 1, 2024

I've tested the D435i, but there are some problems; VINS can't work correctly.

from vins-fusion.

marcelinomalmeidan commented on July 1, 2024

I have tried the D435i with VINS-Mono and it didn't work, which suggests that the IMU is not well synchronized with the camera.
I will be following this topic in the hope that someone will make this work in the future! ;-)
BTW, Intel will soon release an RGB-D + IMU camera that runs SLAM onboard: https://realsense.intel.com/tracking-camera/
It would be great to see how the T265 compares with VINS-Fusion!

from vins-fusion.

SteveMacenski commented on July 1, 2024

@marcelinomalmeidan did you try using the RGB or the IR sensors? The RGB on the D435i is rolling shutter, so I wouldn't expect that to work well, and the RGB imager isn't time-synchronized with the IMU according to Intel folks.

from vins-fusion.

marcelinomalmeidan commented on July 1, 2024

@SteveMacenski If I understand the specifications here correctly, both the IR and RGB are global shutter. I might be wrong; it is not clear from the description.
I have only tested with the RGB camera, though...
I used Kalibr to calibrate the camera w.r.t. the IMU, and the output was that the camera had a delay w.r.t. the IMU of about 0.11 seconds (yeah, that much!). So this confirms that the RGB is not in sync with the IMU.
BTW, Intel hasn't even released an official ROS wrapper for capturing IMU data (at least that wasn't possible a couple of weeks ago); we have to use their development branch for that.
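
For anyone who wants to reproduce that time-offset estimate, a rough sketch of the Kalibr call (file names here are placeholders; camchain.yaml comes from a prior kalibr_calibrate_cameras run and imu.yaml holds the IMU noise densities):

# estimates camera-IMU extrinsics and the time shift (reported as t_imu = t_cam + shift)
kalibr_calibrate_imu_camera \
    --target aprilgrid.yaml \
    --cam camchain.yaml \
    --imu imu.yaml \
    --bag calib.bag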

from vins-fusion.

marcelinomalmeidan commented on July 1, 2024

Oh, thanks for pointing all of that out! I moved away from VINS because of these issues; I might get back to it sometime in the future. Have you tried VINS with the D435i?

from vins-fusion.

Changliu52 commented on July 1, 2024

from vins-fusion.

pjrambo commented on July 1, 2024

When I did the flying test I did not turn off the IR projector. But I think it can influence the tracking result when it is on, especially in a narrow environment where most of the features are too close to the projector.

from vins-fusion.

BarzelS commented on July 1, 2024

The Realsense D435i config has been uploaded. In the realsense_d435i folder there is a file rs_camera.launch, which is the realsense-ros wrapper's launch file. It may be helpful for those using the Realsense D435i.

I used all the files you uploaded to the folder and I'm getting bad results; the second I move the camera off the table, huge drift starts:
[screenshot showing the drift]

Maybe I need to perform the calibration myself?
How do I do it?

Thanks

from vins-fusion.

jingshaojing commented on July 1, 2024

(Quoting pjrambo's reply above about the D435i launch parameters.)

Should I change "/realsense/realsense2_camera/launch/rs_camera.launch" to "src/VINS-Fusion/config/realsense_d435i/rs_camera.launch"? When I changed it like this, I got the error:

roslaunch realsense2_camera rs_camera.launch
... logging to /home/buaa/.ros/log/30b8c662-d88d-11ea-b265-ccf9e45d6754/roslaunch-lenovo-4048.log
Checking log directory for disk usage. This may take a while.
Press Ctrl-C to interrupt
Done checking log file disk usage. Usage is <1GB.

RLException: unused args [infra2_width, infra1_fps, infra2_fps, infra1_width, enable_imu, infra2_height, infra1_height] for include of [/home/buaa/catkinlib_ws/src/realsense/realsense2_camera/launch/includes/nodelet.launch.xml]
The traceback for the exception was written to the log file

from vins-fusion.

Joeyyuenjoyslife commented on July 1, 2024

For the intrinsic parameters, I directly used the values from the camera_info of /camera/infra1/image_rect_raw. For the extrinsic parameters, I used VINS-Mono to estimate them with "estimate_extrinsic" set to 2.

Could you share which versions of librealsense and the realsense-ros wrapper, and also which firmware, were used in your test? Thanks a lot.

from vins-fusion.

marufino commented on July 1, 2024

@jingshaojing to use the rs_camera.launch file included with VINS alongside the latest realsense drivers, change infra{N}_width to infra_width, infra{N}_height to infra_height, infra{N}_fps to infra_fps and remove enable_imu
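
A hedged example of what that looks like when overriding the arguments on the command line (names follow newer realsense2_camera releases and the values are only illustrative; check your wrapper's rs_camera.launch for the exact set):

roslaunch realsense2_camera rs_camera.launch \
    infra_width:=640 infra_height:=480 infra_fps:=30 \
    enable_gyro:=true enable_accel:=true \
    unite_imu_method:=linear_interpolation enable_sync:=true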

from vins-fusion.

BarzelS commented on July 1, 2024

@jingshaojing to use the rs_camera.launch file included with VINS alongside the latest realsense drivers, change infra{N}_width to infra_width, infra{N}_height to infra_height, infra{N}_fps to infra_fps and remove enable_imu

  1. Are you getting any stable results using the D435i?
  2. By removing "enable_imu" you are not using the IMU for the fusion, right?
  3. Could you share which versions of librealsense and the realsense-ros wrapper, and also which firmware, were used in your test? Thanks a lot.

from vins-fusion.

marufino commented on July 1, 2024
  1. Still sorting through the setup, but no luck so far.
  2. The new realsense-ros wrapper seems to use the unite_imu_method arg to enable and disable the IMU, so even without the enable_imu parameter I'm getting IMU output.
  3. I was using an old commit from the development branch of https://github.com/IntelRealSense/realsense-ros; with the latest commit I had to explicitly set enable_gyro and enable_accel in the VINS rs_camera.launch. I think I'm using librealsense 1.3.12.

from vins-fusion.

marufino commented on July 1, 2024

@DronistB I got it working pretty well by following the instructions here: https://github.com/engcang/vins-application/tree/Intel-D435i
A big thing seems to be turning off auto-exposure and the IR emitter
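
If you prefer scripting that over clicking through rqt_reconfigure, a hedged sketch using dynamic_reconfigure (the /camera/stereo_module node name and the emitter_enabled / enable_auto_exposure parameters are what recent realsense2_camera versions appear to expose; verify the names with rqt_reconfigure on your setup):

# turn off the IR emitter (0 = off) and disable auto-exposure on the stereo module
rosrun dynamic_reconfigure dynparam set /camera/stereo_module emitter_enabled 0
rosrun dynamic_reconfigure dynparam set /camera/stereo_module enable_auto_exposure 0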

from vins-fusion.

BarzelS commented on July 1, 2024

@DronistB I got it working pretty well by following the instructions here: https://github.com/engcang/vins-application/tree/Intel-D435i
A big thing seems to be turning off auto-exposure and the IR emitter

Interesting... I've tried it as well but got bad results. Did you perform some calibration using the Kalibr tool or something like that?
Did you use the yaml file in this GitHub repo?

from vins-fusion.

IshitaTakeshi commented on July 1, 2024

I think it is better to calibrate the sensor itself first.
I followed this calibration guide and the stability has been significantly improved.

https://dev.intelrealsense.com/docs/intel-realsensetm-d400-series-calibration-tools-user-guide

from vins-fusion.

robin-shaun commented on July 1, 2024

(Quoting marufino's realsense_stereo_imu_config.yaml from above.)

The modified body_T_cam is good.

from vins-fusion.

robin-shaun commented on July 1, 2024

realsense_d435i.zip
This is my config file and it works for my camera.

from vins-fusion.

snakehaihai commented on July 1, 2024

The D435i alone has an issue when the IMU is not calibrated or simply not good enough.
I tested the D435i with stereo only and the result does not drift;
with stereo + IMU, the result jumps.

So I ended up using the DJI A3/N3 400 Hz IMU readings with the D435i stereo, and the result is stable.
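
If you go that route, the main changes in the VINS yaml are pointing imu_topic at the flight controller and re-deriving the camera extrinsics with respect to that IMU. A minimal sketch (the /dji_sdk/imu topic name is hypothetical; use whatever your FCU driver actually publishes):

imu: 1
num_of_cam: 2
imu_topic: "/dji_sdk/imu"   # hypothetical external IMU topic; replace with your FCU driver's topic
# body_T_cam0 / body_T_cam1 must now be the transforms from the external IMU body
# frame to each infra camera, and acc_n / gyr_n should match that IMU's noise model.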

from vins-fusion.
