
lidar_snow_sim's Introduction

Created by Martin Hahner at the Computer Vision Lab of ETH Zurich.


🌨 LiDAR Snowfall Simulation
for Robust 3D Object Detection

by Martin Hahner, Christos Sakaridis, Mario Bijelic, Felix Heide, Fisher Yu, Dengxin Dai, and Luc van Gool

📣 Oral at CVPR 2022.
Please visit our paper website for more details.

Overview

.
├── calib                     # contains the LiDAR sensor calibration file used in STF
│   └── ...
├── lib                       # contains external libraries as submodules
│   └── ...
├── splits                    # contains the splits we used for our experiments
│   └── ...
├── tools                     # contains our snowfall and wet ground simulation code
│   ├── snowfall
│   │   ├── geometry.py
│   │   ├── precompute.py
│   │   ├── sampling.py
│   │   └── simulation.py
│   └── wet_ground
│       ├── augmentation.py
│       ├── phy_equations.py
│       ├── planes.py
│       └── utils.py
├── .gitignore
├── .gitmodules
├── LICENSE
├── pointcloud_viewer.py      # to visualize LiDAR point clouds and apply various augmentations
├── README.md
└── teaser.gif

Datasets supported by pointcloud_viewer.py:

Note:
The snowfall and wet ground simulation has only been tested on the SeeingThroughFog (STF) dataset.

To support other datasets as well, code changes are required.

License

This software is made available for non-commercial use under a Creative Commons License.
A summary of the license can be found here.

Citation(s)

If you find this work useful, please consider citing our paper.

@inproceedings{HahnerCVPR22,
  author = {Hahner, Martin and Sakaridis, Christos and Bijelic, Mario and Heide, Felix and Yu, Fisher and Dai, Dengxin and Van Gool, Luc},
  title = {{LiDAR Snowfall Simulation for Robust 3D Object Detection}},
  booktitle = {IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
  year = {2022},
}

You may also want to check out our earlier work
Fog Simulation on Real LiDAR Point Clouds for 3D Object Detection in Adverse Weather.

@inproceedings{HahnerICCV21,
  author = {Hahner, Martin and Sakaridis, Christos and Dai, Dengxin and Van Gool, Luc},
  title = {{Fog Simulation on Real LiDAR Point Clouds for 3D Object Detection in Adverse Weather}},
  booktitle = {IEEE International Conference on Computer Vision (ICCV)},
  year = {2021},
}

Getting Started

Setup

  1. Install anaconda.

  2. Execute the following commands.

# Create a new conda environment.
conda create --name snowy_lidar python=3.9 -y

# Activate the newly created conda environment.
conda activate snowy_lidar

# Install dependencies.
conda install matplotlib pandas plyfile pyaml pyopengl pyqt pyqtgraph scipy scikit-learn tqdm -c conda-forge -y
pip install PyMieScatt pyquaternion

# Clone this repository (including submodules!).
git clone git@github.com:SysCV/LiDAR_snow_sim.git --recursive
cd LiDAR_snow_sim
  3. If you want to use our precomputed snowflake patterns, you can download them (2.3GB) as shown below.

wget https://www.trace.ethz.ch/publications/2022/lidar_snow_simulation/snowflakes.zip
unzip snowflakes.zip
rm snowflakes.zip

  4. If you want to use DROR as well, you need to install PCL or download the point indices (215MB) as shown below.

wget https://www.trace.ethz.ch/publications/2022/lidar_snow_simulation/DROR.zip
unzip DROR.zip
rm DROR.zip

  5. Enjoy pointcloud_viewer.py.

python pointcloud_viewer.py

  6. If you also want to run inference on the STF dataset, a couple of extra steps are required.
     Note: For unknown reasons, this can slow down the augmentation(s) by roughly a factor of two.
# Download our checkpoints (265MB)
wget https://www.trace.ethz.ch/publications/2022/lidar_snow_simulation/experiments.zip
unzip experiments.zip
rm experiments.zip

# Install PyTorch.
conda install pytorch==1.10.1 torchvision==0.11.2 torchaudio==0.10.1 cudatoolkit=11.3 -c conda-forge -c pytorch -y

# Install spconv
pip install spconv-cu113

# build pcdet
cd lib/OpenPCDet
python setup.py develop
cd ../..

Disclaimer

The code has been successfully tested on

  • Ubuntu 18.04.6 LTS + CUDA 11.3 + conda 4.13.0
  • Debian GNU/Linux 10 (buster) + conda 4.13.0
  • macOS Big Sur 11.6.6 + conda 4.13.0

Contributions

Please feel free to suggest improvements to this repository.
We are always open to merging useful pull requests.

Acknowledgments

This work is supported by Toyota via the TRACE project.

The work also received funding from the AI-SEE project, with national funding from

We also thank the Federal Ministry for Economic Affairs and Energy for support within
VVM-Verification and Validation Methods for Automated Vehicles Level 4 and 5, a PEGASUS family project.

Felix Heide was supported by an NSF CAREER Award (2047359),
a Sony Young Faculty Award, and a Project X Innovation Award.

We thank Emmanouil Sakaridis for verifying our derivation of occlusion angles in our snowfall simulation.



lidar_snow_sim's Issues

How to augment the KITTI bin file?

Thanks for the great work!
The KITTI .bin files only contain (x, y, z, intensity), not (x, y, z, intensity, channel). How can they be augmented with the snowfall simulation from your paper?

openpcdet configs to reproduce result

Hey Martin,

Can you please confirm whether we can use "dense_dataset_snow_wet_coupled.yaml" to reproduce the results for "Your snow+wet" and "dense_dataset_snow_uniform_gunn_1in10.yaml" to reproduce the results for "Your snow"?

Does this mean you only used "lidar_hdl64_strongest" (and not hdl64 last or vlp32 strongest/last) for training and validating all experiments? Also, the train/val splits are given in the LiDAR_snow_sim/splits folder, but you also have FOV3000last and FOV3000strongest info pkls in your OpenPCDet/data/dense repo. While training for the snow/fog simulation (both papers), did you ignore frames that have fewer than 3000 points in the camera FOV? If you did, can you please point me to where in your code you do that, and whether the 3000-point check is done before or after the snow/fog simulation?

Your reply will be greatly appreciated. Thanks!

Question about Eq. 6

hi Martin,

I have questions regarding Equation 6 in the main paper. I understand that the R^2 term comes from the solid angle and that the transmittance accounts for the medium along the path, but I don't understand why the transmittance is squared. Can you elaborate a bit? Also, the text says "H_C depends on the beam divergence, the overlap of transmitter and receiver described by ξ(R), as well as the transmittance T(R) of the medium", but I don't see where the beam divergence plays a role here. I understand beam divergence as having two effects: 1. the cone intersects multiple particles, and 2. the energy within the cone follows a Gaussian distribution. So how does it enter the impulse response of the optical channel?

Best,
Shengyu
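A note on the squared transmittance, sketched from the standard monostatic lidar range equation (the usual textbook reasoning, not a verbatim reproduction of the paper's Eq. 6):

```latex
% Received power from range R (monostatic lidar, common textbook form):
%   P_R(R) \propto \frac{\xi(R)\, T^2(R)}{R^2}\, \beta(R)
% The transmittance appears squared because the pulse traverses the
% attenuating medium twice: once out to range R and once back to the
% receiver, so the one-way transmittance is applied on each leg:
%   T^2(R) = \exp\!\left( -2 \int_0^R \alpha(r)\,\mathrm{d}r \right)
```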

Curiosity of snowflakes npy file

Hello, I have a question of .npy snowflakes file.

I used the snowflake .npy files you provided, and I wonder why the number of snowflakes decreases when the rain rate is larger.
Shouldn't the number of snowflake particles increase if the rain rate is large?

Thanks.
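One possible explanation (an assumption based on the Gunn size distribution that the repository's configs reference, not a confirmed answer): the exponential size distribution integrates to a total count that depends on two rate-dependent parameters, so the count can fall even as flakes grow.

```latex
% Exponential (Gunn-type) snowflake size distribution:
%   N(D) = N_0 \, e^{-\Lambda D}
% Total number concentration (integrating over all diameters D):
%   \int_0^\infty N(D)\, \mathrm{d}D = \frac{N_0}{\Lambda}
% Both N_0 and \Lambda decrease with rain rate; if N_0 falls faster than
% \Lambda, the total count N_0 / \Lambda decreases even though individual
% flakes get larger. The mass flux, not the particle count, is what
% scales with the rain rate.
```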

Error while cloning the repo!

Hi, I really appreciate the work you have shared. I was trying it out but am getting an error while cloning the repo:

git@github.com: Permission denied (publickey).
fatal: Could not read from remote repository.

Even when cloning with sudo I am still getting errors related to access rights:

Please make sure you have the correct access rights
and the repository exists.
fatal: clone of 'git@github.com:MartinHahner/LISA' into submodule path '/home/ammar/lidar_simulator/LiDAR_snow_sim/lib/LISA' failed
Failed to clone 'lib/LISA'. Retry scheduled
Cloning into '/home/ammar/lidar_simulator/LiDAR_snow_sim/lib/LiDAR_fog_sim'...
git@github.com: Permission denied (publickey).
fatal: Could not read from remote repository.

Please make sure you have the correct access rights
and the repository exists.
fatal: clone of 'git@github.com:MartinHahner/LiDAR_fog_sim' into submodule path '/home/ammar/lidar_simulator/LiDAR_snow_sim/lib/LiDAR_fog_sim' failed
Failed to clone 'lib/LiDAR_fog_sim'. Retry scheduled
Cloning into '/home/ammar/lidar_simulator/LiDAR_snow_sim/lib/OpenPCDet'...
git@github.com: Permission denied (publickey).
fatal: Could not read from remote repository.

Please make sure you have the correct access rights
and the repository exists.
fatal: clone of 'git@github.com:MartinHahner/OpenPCDet' into submodule path '/home/ammar/lidar_simulator/LiDAR_snow_sim/lib/OpenPCDet' failed

Would appreciate if i can get help
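The submodules are referenced via SSH URLs, which fail without a GitHub SSH key. One possible workaround (an assumption on my part, not an official instruction from the authors) is to tell git to substitute HTTPS for SSH before cloning:

```shell
# Rewrite SSH GitHub URLs to HTTPS so the recursive clone works without an SSH key.
# (Hypothetical workaround; run inside an existing clone without --global to
# scope the rewrite to a single repository.)
git config --global url."https://github.com/".insteadOf "git@github.com:"

# Then retry the recursive clone:
#   git clone https://github.com/SysCV/LiDAR_snow_sim.git --recursive
```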

The label of snow point cloud

I have a question: in the function "process_single_channel", is it correct that pc[j, 4] = 1 or pc[j, 4] = 2 marks a snowflake point, while pc[j, 4] = 0 is a normal point?

I also find that in the WADS and CADC datasets the intensity of most snowflake points is 0 (zero), whereas in your simulation the intensity of most snowflake points is not 0. Can you explain this difference?

Question about the intensity of snow in STF data

As stated in the paper: "We split the 3916 samples in the snow test set based on the intensity of snowfall into two different subsets, termed light snowfall and heavy snowfall, with 2512 and 1404 samples respectively."

However, I can't find this split. Could you open-source its indices for the STF dataset?

Is `beta_0` the reflectivity?

I noticed that in tools/snowfall/simulation.py there is a beta_0, which I suppose is the reflectivity. Why is it fixed to beta_0 = 1 * 10 ** -6 / PI?

How to get the output snow+wet pointcloud files?

Hello, thank you for your excellent work; it is exactly what I need.
I have seen https://github.com/MartinHahner/LiDAR_fog_sim and https://github.com/SysCV/LiDAR_snow_sim. In the fog simulation model there is a fog_simulation.py that takes clear point cloud files as input and outputs foggy point cloud files. Is there a way to achieve similar functionality in the snow+wet simulation model, or could you tell me how to achieve this? Hoping for your reply.

What's the use of precompute.py

Thanks for your great work! I would like to know the purpose of the file precompute.py, because I do not see any corresponding documentation.

The problem of Batch data saving

Excuse me, I'm trying to save the point cloud data after adding rain and snow noise by running the code.
However, in the current code I can only display the point cloud frame by frame by clicking a button.
Could you tell me how to save the data in bulk, without having to click through one frame at a time?
Thank you very much!

PointRCNN on dense

Hey Martin,

I am training PointRCNN on the DENSE dataset's clear-weather split. However, it breaks here.

The problem is that the number of field-of-view points becomes less than half of 16384 (due to the mask_points_and_boxes_outside_range function).
Note: 16384 is the number of points sampled for PointRCNN by "sample_points".

For example:
If, after masking points outside the range, len(points) is 5000, then
choice = np.arange(5000)
np.random.choice(choice, 16384 - 5000, replace=False) breaks, because np.random.choice cannot sample more points (i.e. 11384) than there are in choice (i.e. 5000) without replacement.

How did you manage to overcome this issue during training?

Thank you for the help!
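A common workaround in OpenPCDet-style pipelines is to fill the deficit by re-sampling with replacement when too few points survive the range mask. This is a hedged sketch of that pattern, not necessarily the authors' exact training configuration:

```python
import numpy as np

def sample_points(points, num_points=16384):
    """Return exactly `num_points` points, duplicating points when fewer
    than `num_points` remain after range masking.

    A sketch of the usual OpenPCDet-style fix; the authors may have used a
    different strategy.
    """
    n = len(points)
    if n >= num_points:
        choice = np.random.choice(n, num_points, replace=False)
    else:
        # Not enough points: keep all of them and fill the deficit by
        # re-sampling WITH replacement (duplicates are harmless for training).
        extra = np.random.choice(n, num_points - n, replace=True)
        choice = np.concatenate([np.arange(n), extra])
    np.random.shuffle(choice)
    return points[choice]
```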

Hello, I can't reproduce the results with the PV-RCNN model.

Hello, I used the code provided, but there is a gap between the results I reproduced and the results reported in the paper.

This is the result I reproduced with the PV-RCNN model.

Augmentation  test_clear  test_snow_light  test_snow_heavy
None          45.04       41.20            39.15
Snow          44.60       40.60            39.33
Wet           45.19       40.65            41.07
Snow+Wet      44.85       40.79            41.33

This is the result the paper reported for the PV-RCNN model.

Augmentation  test_clear  test_snow_light  test_snow_heavy
None          45.36       41.13            39.69
Snow          45.61       41.20            41.61
Wet           44.81       41.07            40.03
Snow+Wet      45.71       41.79            41.79

I also used the model you provided (experiments/snow+wet/pc_rcnn.pth), and the result is:

test_clear  test_snow_light  test_snow_heavy
42.15       40.12            41.95

Looking forward to your reply!

Dataset

Hello,

Are snow and wet augmentations only available for the DENSE dataset?

scatter points

Hi @MartinHahner

The paper seems to consider only the scatter points (raindrops) generated between the object point and the origin (similar to LISA); raindrops (scatter points) generated in open spaces are not considered.
So I would like to know: can raindrop points still be generated in a very open area?

classes in Seeing Through Fog dataset

Hey Martin,

Which classes from the SeeingThroughFog dataset did you group into the KITTI "Car", "Pedestrian", "Cyclist" classes? Your OpenPCDet dense_dataset.py assumes that the ground-truth labels for the DENSE dataset are "Car", "Pedestrian", "Cyclist".

Thanks

TypeError: expected non-empty vector for x

Dear Martin, I hit a bug when running precompute.py. The cause is that the variable ground_distances is empty; the error happens on the file 2018-02-09_19-10-30_00000.bin. I don't know how to solve this, because I haven't changed any of your code except the corresponding paths. Please give me some advice.

gunn:  63%|██████▎   | 2194/3469 [00:47<00:27, 46.29it/s]
Traceback (most recent call last):
  File "C:/Users/Roland/Desktop/LiDAR_snow_sim-main/tools/snowfall/precompute.py", line 107, in <module>
    beam_divergence=float(np.degrees(3e-3)))
  File "C:\Users\Roland\Desktop\LiDAR_snow_sim-main\tools\snowfall\simulation.py", line 466, in augment
    p = np.polyfit(ground_distances, adaptive_noise_threshold, 2)
  File "<__array_function__ internals>", line 6, in polyfit
  File "C:\Users\Roland\miniconda3\envs\point_rcnn\lib\site-packages\numpy\lib\polynomial.py", line 630, in polyfit
    raise TypeError("expected non-empty vector for x")
TypeError: expected non-empty vector for x

About STF

Hi, Martin!
Thank you so much for your work. I'm confused about the training process. I am debugging dense_dataset.py using the OpenPCDet in this project, and there seems to be something wrong with the function 'get_infos'. I am wondering how to arrange the STF dataset under openpcdet/data/dense when using only hdl64 & strongest.

Thank you for your reply!

Training code

Thank you for your excellent paper in this field. Will the training code from the paper be made public? If so, it would greatly help our scientific research.
