
KAIST VIO dataset (RA-L'21 w/ ICRA Option)

Official page of "Run Your Visual-Inertial Odometry on NVIDIA Jetson: Benchmark Tests on a Micro Aerial Vehicle", accepted to RA-L with the ICRA'21 presentation option.

video | arXiv | journal


This dataset is for testing the robustness of various VO/VIO methods; it was acquired on board a UAV.

You can download the whole dataset from the KAIST VIO dataset page.



Contents

  1. Trajectories
  2. Downloads
  3. Dataset format
  4. Setup
  5. Citation

Trajectories


  • Four different trajectories:
    • circle
    • infinity
    • square
    • pure_rotation
  • Each trajectory has three sequence types:
    • normal speed
    • fast speed
    • rotation
  • The pure_rotation trajectory has only the normal speed and fast speed types

Downloads

You can download individual ROS bag files from the links below, or the whole dataset from the KAIST VIO dataset page.

Trajectory   Type       ROS bag download
circle       normal     link
circle       fast       link
circle       rotation   link
infinity     normal     link
infinity     fast       link
infinity     rotation   link
square       normal     link
square       fast       link
square       rotation   link
rotation     normal     link
rotation     fast       link



Dataset format


  • Each set of data is recorded as a ROS bag file.
  • Each data sequence contains the following:
    • stereo infrared images (with the IR emitter turned off)
    • mono RGB images
    • IMU data (3-axis accelerometer, 3-axis gyroscope)
    • 6-DOF ground truth
  • ROS topics (see the bag-reading sketch after this list):
    • camera (30 Hz): "/camera/infra1(2)/image_rect_raw/compressed", "/camera/color/image_raw/compressed"
    • IMU (100 Hz): "/mavros/imu/data"
    • ground truth (50 Hz): "/pose_transformed"
  • In the config directory:
    • trans-mat.yaml: translation matrix between the ground-truth origin and the VI sensor unit.
      (The offset has already been applied to the bag data; this YAML file holds the estimated offset values for reference only. To benchmark your VO/VIO method more accurately, you can apply your own alignment with other tools, e.g. origin alignment or Umeyama alignment from evo; see the alignment sketch after this list.)
    • imu-params.yaml: estimated noise parameters of the Pixhawk 4 Mini IMU
    • cam-imu.yaml: camera intrinsics and camera-IMU extrinsics in kalibr format
  • Publishing the ground truth as a trajectory ROS topic:
    • The ground truth is recorded as 'geometry_msgs/PoseStamped'.
    • For visualization (e.g. in RViz) you may want 'nav_msgs/Path' rather than individual poses.
    • For the 'geometry_msgs/PoseStamped' -> 'nav_msgs/Path' conversion you can refer to the tf_to_trajectory package; a minimal node sketch follows this list.
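
For reference, here is a minimal sketch of reading these topics with the rosbag Python API (ROS 1). The bag file name "circle_normal.bag" is a hypothetical local name; the topic names are the ones listed above, and the image topics carry sensor_msgs/CompressedImage messages, so the payload is decoded with OpenCV.

    # Minimal sketch: iterate over one of the dataset bags (ROS 1, Python).
    # "circle_normal.bag" is a hypothetical local file name.
    import cv2
    import numpy as np
    import rosbag

    topics = ["/camera/infra1/image_rect_raw/compressed",
              "/mavros/imu/data",
              "/pose_transformed"]

    with rosbag.Bag("circle_normal.bag") as bag:
        for topic, msg, t in bag.read_messages(topics=topics):
            if topic.startswith("/camera"):
                # sensor_msgs/CompressedImage: decode the compressed payload.
                img = cv2.imdecode(np.frombuffer(msg.data, np.uint8),
                                   cv2.IMREAD_GRAYSCALE)
            elif topic == "/mavros/imu/data":
                # sensor_msgs/Imu: gyroscope and accelerometer readings.
                gyro = msg.angular_velocity
                acc = msg.linear_acceleration
            else:
                # geometry_msgs/PoseStamped: 6-DOF ground truth at 50 Hz.
                p = msg.pose.position
                print(t.to_sec(), p.x, p.y, p.z)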

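As a companion to the alignment note above, here is a hedged sketch that computes ATE RMSE with Umeyama alignment using evo's Python API. The file names "gt.tum" and "est.tum" are hypothetical; it assumes both trajectories have already been exported to TUM format.

    # Sketch: ATE RMSE with Umeyama alignment via evo's Python API.
    # "gt.tum" and "est.tum" are hypothetical TUM-format trajectory files.
    from evo.core import metrics, sync
    from evo.tools import file_interface

    traj_ref = file_interface.read_tum_trajectory_file("gt.tum")
    traj_est = file_interface.read_tum_trajectory_file("est.tum")

    # Associate poses by timestamp, then align the estimate to the
    # reference with the Umeyama method (rotation + translation, no scale).
    traj_ref, traj_est = sync.associate_trajectories(traj_ref, traj_est)
    traj_est.align(traj_ref, correct_scale=False)

    ape = metrics.APE(metrics.PoseRelation.translation_part)
    ape.process_data((traj_ref, traj_est))
    print("ATE RMSE [m]:", ape.get_statistic(metrics.StatisticsType.rmse))

Note that the choice of alignment option changes the reported RMSE, which matters when comparing against published numbers.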

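Finally, here is a minimal ROS 1 node sketch for the PoseStamped-to-Path conversion mentioned above. It is an illustrative re-implementation in the spirit of tf_to_trajectory, not that package's actual code; the output topic name "/gt_path" is an assumption.

    #!/usr/bin/env python
    # Sketch: republish geometry_msgs/PoseStamped ground truth as
    # nav_msgs/Path so RViz can draw it as a trajectory.
    import rospy
    from geometry_msgs.msg import PoseStamped
    from nav_msgs.msg import Path

    path = Path()

    def pose_callback(msg):
        # Accumulate incoming poses and republish the growing path.
        path.header = msg.header
        path.poses.append(msg)
        pub.publish(path)

    if __name__ == "__main__":
        rospy.init_node("pose_to_path")
        pub = rospy.Publisher("/gt_path", Path, queue_size=10)  # assumed name
        rospy.Subscriber("/pose_transformed", PoseStamped, pose_callback)
        rospy.spin()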

Setup

- Hardware


Fig. 1: Lab environment. Fig. 2: UAV platform.
  • VI sensor unit
    • camera: Intel RealSense D435i (640x480 for the infra1, infra2, and RGB images)
    • IMU: Pixhawk 4 Mini
    • the VI sensor unit was calibrated using kalibr

  • Ground truth
    • an OptiTrack PrimeX 13 motion capture system with six cameras was used
    • it provides 6-DOF motion information

- Software (VO/VIO algorithms): how to set up each publicly available algorithm on the Jetson board

VO/VIO         Setup link
VINS-Mono      link
ROVIO          link
VINS-Fusion    link
Stereo-MSCKF   link
Kimera         link

Citation

If you use this dataset in an academic context, please cite the following publication:

@article{jeon2021run,
  title={Run your visual-inertial odometry on NVIDIA Jetson: Benchmark tests on a micro aerial vehicle},
  author={Jeon, Jinwoo and Jung, Sungwook and Lee, Eungchang and Choi, Duckyu and Myung, Hyun},
  journal={IEEE Robotics and Automation Letters},
  volume={6},
  number={3},
  pages={5332--5339},
  year={2021},
  publisher={IEEE}
}

License

This dataset is released under the Creative Commons license (CC BY-NC-SA 3.0), which is free for non-commercial use (including research).


Issues

about evaluation using evo

I want to evaluate results output by VINS on this dataset. I use the evo tool for this, but I am confused about the alignment between the ground truth and the VINS output trajectory.
Here are my evaluation results using different alignment options:

[Screenshot: evo RMSE results with different alignment options]

Note that "circle_vins.bag" contains the trajectory output by VINS and the ground truth from the dataset.
I got different RMSE results, and I want to know which one corresponds to the results presented in the paper. To be specific, these are my questions:

  1. Has the ground truth in the dataset been aligned?
  2. Which of the above results corresponds to the results presented in the paper?

about the topic

When I run VINS-Fusion, an error occurs as shown in the picture:
[screenshot of the error]

When we run the "kaistviodataset" dataset, our "realsense_stereo_imu_config.yaml" is shown in the picture:
[screenshot of the config file]

Question about hardware sync

Hi. I am also trying to use the Pixhawk 4 Mini IMU and a RealSense D435i to run VIO, but I want to know whether the D435i can be synchronized by the Pixhawk IMU. I have seen that the PX4 user guide has a chapter on "Camera-IMU sync" (https://docs.px4.io/main/en/peripherals/camera.html); it seems that the Pixhawk 4 Mini can trigger camera capture using a main PWM pin.
I have also searched for material on RealSense D400-series hardware sync, and found that the D435 can be triggered by an external signal through a header pin.
I want to know whether you configured the devices like this, or tried another method to achieve hardware sync.
Without hardware sync, will the VIO algorithm be less accurate and robust?
I hope for your answer, thanks! :)

How to calibrate Pixhawk IMU and Realsense Camera?

First of all, thank you for sharing your nice work.

We are also testing VIO algorithms for autonomous flight using the RealSense D435i.

The rosbag data you have shared publishes the Pixhawk IMU to feed the VIO algorithm, so we have two questions about that:

  1. Is there any reason to use the Pixhawk IMU instead of the D435i's built-in IMU?

  2. We also tried to calibrate the extrinsic transformation between the Pixhawk IMU and the D435i's infrared camera using kalibr, but the calibration result is not as good as when we used the D435i's IMU. Could you give us some tips for getting good calibration results?

In the figures below, the left one shows calibration using the D435i's IMU and the right one using the Pixhawk's IMU.
(In the figures on the right, both the bias estimates and the reprojection errors are out of bounds...)

[Screenshots: kalibr calibration reports for the two IMU setups]

Thank you.

Regards.

about the image rate

I have read the related paper your team published with IEEE. Thank you for your great work!
I wonder whether you got real-time frames when running VINS-Fusion on the NX board; I could not find related details in the paper.
I have run VINS-Fusion on an NX board with a ZED 2, but there was a severe time delay when I set the ZED 2's output image rate to 15 Hz. When I lowered the rate to 8 Hz, VINS-Fusion could get real-time images from the ZED 2.
8 Hz is too low. Have you encountered a similar problem, and if so, how did you solve it?
Thank you!

Method to log the IMU data

Excuse me, I wonder how to get IMU data from the Pixhawk at a high frequency (such as 100 Hz or more). Do you use MAVROS to connect the PX4 to the onboard computer?

About Pixhawk 4 mini Calibration

Hi, I'm also trying to use a Pixhawk 4 Mini + D435i for VIO. I want to know whether you calibrated the Pixhawk 4 Mini's sensors in a ground station such as QGC or MP before recording the datasets. Thank you!

What is orientation.w in the dataset?

Hi,
I am trying to write an EKF using your dataset. I have a few questions about it:

  1. Are the orientation values in the IMU messages obtained after filtering, e.g. by an extended Kalman filter?
  2. The orientation values in the IMU messages and in the motion-capture messages are not synchronized. Is there a way to synchronize them for use in the state estimation process?
  3. What is orientation.w in both the IMU and motion-capture messages of the dataset?

Thanks,
Himanshu
