
rappids's Introduction

Rectangular Pyramid Partitioning using Integrated Depth Sensors (RAPPIDS)

This repository contains source code implementing an algorithm for quickly finding local collision-free trajectories given a single depth image from an onboard camera. The algorithm leverages a new pyramid-based spatial partitioning method that enables rapid collision detection between candidate trajectories and the environment. Due to its efficiency, the algorithm can be run at high rates on computationally constrained hardware, evaluating thousands of candidate trajectories in milliseconds.
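
To give a rough sense of how such a planner is driven, the sketch below generates candidate trajectories toward goal points deprojected from randomly sampled image pixels and checks them against the pyramid partition until a collision-free one is found. The trajectory class, CommonMath::Vec3, and DeprojectPixelToPoint mirror code that appears elsewhere in this repository; the DepthImagePlanner type name, the IsCollisionFree() call, and the sampling ranges are hypothetical placeholders rather than the repository's actual interface.

    // Minimal sketch (not the repository's actual API): repeatedly sample a goal
    // point from the depth image and test candidate trajectories for collisions.
    #include <random>

    using RapidQuadrocopterTrajectoryGenerator::RapidTrajectoryGenerator;

    bool FindCollisionFreeCandidate(DepthImagePlanner& planner,           // hypothetical planner type
                                    RapidTrajectoryGenerator& candidate,  // constructed with the current
                                                                          // state and gravity vector
                                    std::mt19937& gen, int maxCandidates) {
      std::uniform_real_distribution<double> pixelX(0, 640), pixelY(0, 480);
      std::uniform_real_distribution<double> depth(1.0, 5.0), duration(1.0, 3.0);
      for (int i = 0; i < maxCandidates; ++i) {
        // Deproject a randomly sampled pixel and depth into a 3D goal point (camera frame).
        CommonMath::Vec3 goal;
        planner.DeprojectPixelToPoint(pixelX(gen), pixelY(gen), depth(gen), goal);

        // Generate a candidate trajectory that comes to rest at the goal point.
        candidate.Reset();
        candidate.SetGoalPosition(goal);
        candidate.SetGoalVelocity(CommonMath::Vec3(0, 0, 0));
        candidate.SetGoalAcceleration(CommonMath::Vec3(0, 0, 0));
        candidate.Generate(duration(gen));

        // Test the candidate against the pyramid partition of free space (hypothetical call).
        if (planner.IsCollisionFree(candidate)) {
          return true;  // `candidate` now holds a collision-free motion primitive
        }
      }
      return false;  // no collision-free candidate found within the sampling budget
    }

In practice, candidates would typically also be ranked by a cost function and the search bounded by a fixed computation-time budget, as in the Benchmarker program described below.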

The algorithm is described in a paper submitted to IEEE Robotics and Automation Letters (RA-L) with the International Conference on Intelligent Robots and Systems 2020 (IROS) option. A preprint version of the paper is available here.

Contact: Nathan Bucki ([email protected]) High Performance Robotics Lab, Dept. of Mechanical Engineering, UC Berkeley

Getting Started

First clone the repository and enter the created folder:

git clone https://github.com/nlbucki/RAPPIDS.git
cd RAPPIDS

Create a build folder and compile:

mkdir build
cd build
cmake ..
make

A program is provided that demonstrates the performance of the algorithm and gives an example of how it can be used to generate collision-free motion primitives. The Benchmarker program performs the Monte Carlo simulations described in Section IV of the associated paper. The three tests reported in the paper can be run from the RAPPIDS folder with the following commands:

./build/test/Benchmarker --test_type 0 -n 10000 --maxNumPyramidForConservativenessTest 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15
./build/test/Benchmarker --test_type 1 -n 10000 
./build/test/Benchmarker --test_type 2 -n 10000 --w 640 --h 480 --f 386 --cx 320 --cy 240 --numCompTimesForTCTest 20

Note that the -n option can be set to a smaller number to perform fewer Monte Carlo trials and thus run the tests faster (at the cost of accuracy). The settings above are those used to generate the results reported in the paper.

Each test generates a .json file in the data folder containing the test results. We provide two Python scripts to visualize the results of the conservativeness test (test 0) and the overall planner performance test (test 2). They can be run with the following commands:

cd scripts
python plotAvgTrajGenNum.py
python plotConservativeness.py

Documentation

HTML documentation generated with Doxygen is included with the repository; after cloning, it can be viewed by opening Documentation.html in the doc/ folder.
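
For example, from the repository root on a typical Linux desktop the documentation can be opened with:

xdg-open doc/Documentation.html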

Licensing

This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.

You should have received a copy of the GNU General Public License along with this program. If not, see http://www.gnu.org/licenses/.


rappids's Issues

Questions about sampling trajectory to controller

Hi! Thank you for your great work. I notice that in the trajectory generation part, you set the initial position, velocity, and acceleration to 0:


    int GetNextCandidateTrajectory(
        RapidQuadrocopterTrajectoryGenerator::RapidTrajectoryGenerator& nextTraj) {
      // Deproject a randomly sampled pixel location and depth into a 3D goal
      // point expressed in the camera frame.
      CommonMath::Vec3 posf;
      _planner->DeprojectPixelToPoint(_pixelX(_gen), _pixelY(_gen),
                                      _depth(_gen), posf);
      // Reset the candidate and set its goal state: end at the deprojected
      // point with zero velocity and zero acceleration (i.e. at rest).
      nextTraj.Reset();
      nextTraj.SetGoalPosition(posf);
      nextTraj.SetGoalVelocity(CommonMath::Vec3(0, 0, 0));
      nextTraj.SetGoalAcceleration(CommonMath::Vec3(0, 0, 0));
      // Generate the trajectory with a randomly sampled duration.
      nextTraj.Generate(_time(_gen));
      return 0;
    }

To my understanding, the sampled reference point from the trajectory is sent to the controller in the world frame. So I have three questions:

  1. Is the initial reference point sampled from the trajectory the same as the drone's current state (p, v, a), or is it (0, 0, 0)?
  2. If the first sampled point of the trajectory is the drone's current state and the drone is replanning frequently, we notice that the drone tends to diverge. Could you advise how to solve this?

We worked around the issue above by initializing each new trajectory with the currently sampled point, as shown below:

    // Initialize the new trajectory at the last sampled state (position, velocity,
    // acceleration), all expressed in the camera frame, along with gravity.
    RapidTrajectoryGenerator rappid_traj_temp = RapidTrajectoryGenerator(
        Vec3(last_sample_p_in_camera_(0), last_sample_p_in_camera_(1), last_sample_p_in_camera_(2)),
        Vec3(last_sample_v_in_camera_(0), last_sample_v_in_camera_(1), last_sample_v_in_camera_(2)),
        Vec3(last_sample_a_in_camera_(0), last_sample_a_in_camera_(1), last_sample_a_in_camera_(2)),
        gravity_in_camera_);

However, I noticed that when replanning happens, if the first sampled point of the new trajectory is the same as the current sampled point (as in the code above), the closed-form condition in RAPPIDS (the trajectory traj(t) = c5 t^5 + c4 t^4 + c3 t^3 + c2 t^2 + c1 t + c0 intersects a plane, and knowing one intersection point gives a closed form for the other intersection points) is no longer met. How can this problem be solved?

thank you again!
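
For reference, the closed-form condition mentioned in this issue amounts to standard polynomial root factoring. The notation below (plane normal n, offset d, known intersection time t_0, generic coefficients) is assumed for illustration and does not follow the paper's symbols:

    % Signed distance of the quintic trajectory to a plane with unit normal n and offset d:
    p(t) = \mathbf{n} \cdot \mathrm{traj}(t) - d
         = a_5 t^5 + a_4 t^4 + a_3 t^3 + a_2 t^2 + a_1 t + a_0
    % If one intersection time t_0 is already known, i.e. p(t_0) = 0, it can be factored out:
    p(t) = (t - t_0)\left(b_4 t^4 + b_3 t^3 + b_2 t^2 + b_1 t + b_0\right)
    % The remaining intersection times are roots of the quartic factor, which has
    % closed-form solutions (e.g. Ferrari's method); without a known root, the full
    % quintic has no general closed form and must be solved numerically instead.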
