
Ubuntu 20.04 + ROS Noetic: Build

Dynablox

An online volumetric mapping-based approach for real-time detection of diverse dynamic objects in complex environments.
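As a rough intuition only, the core principle can be sketched in a few lines of toy code (all names, values, and thresholds here are assumptions for illustration, not Dynablox's actual implementation or API): a voxel that earlier scans measured as free cannot suddenly contain static structure, so points observed inside such previously-free voxels are flagged as dynamic.

```python
# Toy sketch (NOT Dynablox's actual implementation) of the free-space
# principle: points falling into voxels previously observed as free
# are flagged as dynamic.

VOXEL_SIZE = 0.25  # meters; illustrative value

def to_voxel(point):
    """Quantize a 3D point to its integer voxel index."""
    return tuple(int(c // VOXEL_SIZE) for c in point)

def detect_dynamic(points, ever_free_voxels):
    """Boolean mask: True where a point lies in previously-free space."""
    return [to_voxel(p) in ever_free_voxels for p in points]

# Voxels along the x-axis were measured free in earlier scans.
ever_free = {(i, 0, 0) for i in range(5)}

scan = [(0.1, 0.0, 0.0),   # lies in known free space -> dynamic
        (5.0, 5.0, 5.0)]   # never observed free -> kept as static
print(detect_dynamic(scan, ever_free))  # [True, False]
```

The real system maintains this free-space information incrementally in a volumetric (TSDF) map, which is what makes the detection robust and online.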

Table of Contents

Credits

Setup

Examples

Paper

If you find this package useful for your research, please consider citing our paper:

  • Lukas Schmid, Olov Andersson, Aurelio Sulser, Patrick Pfreundschuh, and Roland Siegwart. "Dynablox: Real-time Detection of Diverse Dynamic Objects in Complex Environments" in IEEE Robotics and Automation Letters (RA-L), Vol. 8, No. 10, pp. 6259 - 6266, October 2023. [ IEEE | ArXiv | Video ]
    @article{schmid2023dynablox,
      title={Dynablox: Real-time Detection of Diverse Dynamic Objects in Complex Environments},
      author={Schmid, Lukas and Andersson, Olov and Sulser, Aurelio and Pfreundschuh, Patrick and Siegwart, Roland},
      journal={IEEE Robotics and Automation Letters (RA-L)},
      year={2023},
      volume={8},
      number={10},
      pages={6259--6266},
      doi={10.1109/LRA.2023.3305239}
    }

Video

A brief overview of the problem, approach, and results is available on YouTube: Dynablox YouTube Video

News

We were excited to learn that Dynablox has been integrated into NVIDIA's nvblox, where the algorithm's parallelism can make fantastic use of the GPU and detect moving objects fast and at high resolutions!

Setup

A Docker image is available for this package. Check the usage instructions on the Docker Hub page.

Installation

  • Note on Versioning: This package was developed on Ubuntu 20.04 with ROS Noetic. Other versions may also work, but support cannot be guaranteed.
  1. If you have not already done so, install ROS. We recommend using Desktop-Full.

  2. If you have not already done so, set up a catkin workspace:

    mkdir -p ~/catkin_ws/src
    cd ~/catkin_ws
    catkin init
    catkin config --extend /opt/ros/$ROS_DISTRO
    catkin config --cmake-args -DCMAKE_BUILD_TYPE=RelWithDebInfo
    catkin config --merge-devel
  3. Install system dependencies:

    sudo apt-get install python3-vcstool python3-catkin-tools ros-$ROS_DISTRO-cmake-modules protobuf-compiler autoconf git rsync -y   
  4. Clone the repo using SSH Keys:

    cd ~/catkin_ws/src
    git clone [email protected]:ethz-asl/dynablox.git
  5. Install ROS dependencies:

    cd ~/catkin_ws/src
    vcs import . < ./dynablox/ssh.rosinstall --recursive 
  6. Build:

    catkin build dynablox_ros

Datasets

To run the demos we use the Urban Dynamic Objects LiDAR (DOALS) Dataset. To download the data and pre-process it for our demos, use the provided script:

roscd dynablox_ros/scripts
./download_doals_data.sh /home/$USER/data/DOALS # Or your preferred data destination.

We further provide a new dataset featuring diverse dynamic objects in complex scenes. The full dataset and description can be found here. To download the processed ready-to-run data for our demos, use the provided script:

roscd dynablox_ros/scripts
./download_dynablox_data.sh /home/$USER/data/Dynablox # Or your preferred data destination.

Examples

Running a DOALS Sequence

  1. If you have not done so already, download the DOALS dataset as explained here.

  2. Adjust the dataset path in dynablox_ros/launch/run_experiment.launch:

    <arg name="bag_file" default="/home/$(env USER)/data/DOALS/hauptgebaeude/sequence_1/bag.bag" />  
  3. Run

    roslaunch dynablox_ros run_experiment.launch 
  4. You should now see dynamic objects being detected as the sensor moves through the scene:

Run DOALS Example

Running a Dynablox Sequence

  1. If you have not done so already, download the Dynablox dataset as explained here.

  2. Adjust the dataset path in dynablox_ros/launch/run_experiment.launch and set use_doals to false:

    <arg name="use_doals" default="false" /> 
    <arg name="bag_file" default="/home/$(env USER)/data/Dynablox/processed/ramp_1.bag" />  
  3. Run

    roslaunch dynablox_ros run_experiment.launch 
  4. You should now see dynamic objects being detected as the sensor moves through the scene: Run Dynablox Example

Running and Evaluating an Experiment

Running an Experiment

  1. If you have not done so already, download the DOALS dataset as explained here.

  2. Adjust the dataset path in dynablox_ros/launch/run_experiment.launch:

    <arg name="bag_file" default="/home/$(env USER)/data/DOALS/hauptgebaeude/sequence_1/bag.bag" />  
  3. In dynablox_ros/launch/run_experiment.launch, set the evaluate flag, adjust the ground truth data path, and specify where to store the generated output data:

    <arg name="evaluate" default="true" />
    <arg name="eval_output_path" default="/home/$(env USER)/dynablox_output/" />
    <arg name="ground_truth_file" default="/home/$(env USER)/data/DOALS/hauptgebaeude/sequence_1/indices.csv" />
  4. Run

    roslaunch dynablox_ros run_experiment.launch 
  5. Wait until the dataset has finished processing. Dynablox should shut down automatically afterwards.

Analyzing the Data

  • Printing the Detection Performance Metrics:

    1. Run:
    roscd dynablox_ros/src/evaluation
    python3 evaluate_data.py /home/$USER/dynablox_output
    2. You should now see the performance statistics for all experiments in that folder:
    1/1 data entries are complete.
    Data                     object_IoU               object_Precision              object_Recall
    hauptgebaeude_1          89.8 +- 5.6              99.3 +- 0.4                   90.3 +- 5.6
    All                      89.8 +- 5.6              99.3 +- 0.4                   90.3 +- 5.6
    
  • Inspecting the Segmentation:

    1. Run:
    roslaunch dynablox_ros cloud_visualizer.launch file_path:=/home/$USER/dynablox_output/clouds.csv
    2. You should now see the segmentation for the annotated ground truth clouds, showing True Positives (green), True Negatives (black), False Positives (blue), False Negatives (red), and out-of-range (gray) points: Evaluation
  • Inspecting the Run-time and Configuration: Additional information is automatically stored in timings.txt and config.txt for each experiment.
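As a hedged sketch of what the "mean +- std" summaries printed by the evaluation aggregate (the real evaluate_data.py may compute these differently; all names here are illustrative), the per-frame metrics follow the usual set-overlap definitions:

```python
# Illustrative aggregation of per-frame object IoU / precision / recall
# into "mean +- std" summaries. Not the actual evaluate_data.py.
from statistics import mean, stdev

def frame_metrics(pred, gt):
    """pred, gt: sets of point indices labeled dynamic in one frame."""
    tp = len(pred & gt)
    fp = len(pred - gt)
    fn = len(gt - pred)
    iou = tp / (tp + fp + fn) if (tp + fp + fn) else 1.0
    precision = tp / (tp + fp) if (tp + fp) else 1.0
    recall = tp / (tp + fn) if (tp + fn) else 1.0
    return iou, precision, recall

frames = [({1, 2, 3}, {1, 2, 3, 4}),   # one missed dynamic point
          ({1, 2}, {1, 2})]            # perfect frame
ious = [frame_metrics(p, g)[0] for p, g in frames]
print(f"object_IoU: {100 * mean(ious):.1f} +- {100 * stdev(ious):.1f}")
```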

Advanced Options

  • Adding Drift to an Experiment: To run an experiment with drift, specify one of the pre-computed drift rollouts in dynablox_ros/launch/run_experiment.launch:

    <arg name="drift_simulation_rollout" default="doals/hauptgebaeude/sequence_1/light_3.csv" />

    All pre-computed rollouts can be found in drift_simulation/config/rollouts. Note that the specified rollout needs to match the sequence being played. For each sequence, there are three rollouts per drift intensity.

    Alternatively, use the drift_simulation/launch/generate_drift_rollout.launch to create new rollouts for other datasets.

  • Changing the Configuration of Dynablox: All parameters of dynablox are listed in dynablox_ros/config/motion_detector/default.yaml; feel free to tune the method for your use case!
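The drift option above can be illustrated with a toy sketch. The actual rollout CSV format used by drift_simulation is not specified here; the function names and parameters below are assumptions, showing only the general idea of layering accumulated drift onto a trajectory:

```python
# Illustrative sketch: accumulate a small random-walk offset per frame,
# so pose error grows over time -- the effect a drift rollout simulates.
import random

def apply_drift(positions, step_sigma=0.01, seed=0):
    """Perturb (x, y, z) positions with an accumulated random-walk drift."""
    rng = random.Random(seed)
    drift = [0.0, 0.0, 0.0]
    drifted = []
    for p in positions:
        drift = [d + rng.gauss(0.0, step_sigma) for d in drift]
        drifted.append(tuple(c + d for c, d in zip(p, drift)))
    return drifted

truth = [(float(i), 0.0, 0.0) for i in range(100)]
drifted = apply_drift(truth)
# With a fixed seed the rollout is reproducible, mirroring how
# pre-computed rollouts make drift experiments repeatable.
```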

dynablox's People

Contributors

kin-zhang, schmluk

dynablox's Issues

Detections are stuttering in RViz

Thank you for open source this great framework.

I am running the package in the Docker environment. My computer has an Intel® Core™ i9-10900X CPU @ 3.70GHz × 20, an NVIDIA TU102 [GeForce RTX 2080 Ti] GPU, and 128.0 GiB of RAM.

I have downloaded hauptgebaeude_sequence_1 from DOALS for testing. The visualization in RViz is slow and stutters.

dynablox_stucking_docker.mp4

I would be grateful for any help!

Datasets Link

Thank you so much for your outstanding work.
I would like to ask about the dataset: could you provide a download link via something like Google Drive? The download speed from the original address is only about 50 kB/s. Looking forward to your reply.

About compiling

Excellent job! Unfortunately, I encountered some issues while compiling, as shown below:

Errors << dynablox_ros:check /home/x/Dynablox/logs/dynablox_ros/build.check.000.log
CMake Error at /home/x/Dynablox/devel/share/catkin_simple/cmake/catkin_simple-extras.cmake:38 (find_package):
By not providing "Findlidar_undistortion.cmake" in CMAKE_MODULE_PATH this
project has asked CMake to find a package configuration file provided by
"lidar_undistortion", but CMake did not find one.

Could not find a package configuration file provided by
"lidar_undistortion" with any of the following names:

lidar_undistortionConfig.cmake
lidar_undistortion-config.cmake

Add the installation prefix of "lidar_undistortion" to CMAKE_PREFIX_PATH or
set "lidar_undistortion_DIR" to a directory containing one of the above
files. If "lidar_undistortion" provides a separate development package or
SDK, be sure it has been installed.
Call Stack (most recent call first):
CMakeLists.txt:5 (catkin_simple)

make: *** [cmake_check_build_system] Error 1
cd /home/x/Dynablox/build/dynablox_ros; catkin build --get-env dynablox_ros | catkin env -si /usr/bin/make cmake_check_build_system; cd -
......................................................................................................
Failed << dynablox_ros:check [ Exited with code 2 ]
Failed <<< dynablox_ros [ 0.3 seconds ]
[build] Summary: 17 of 18 packages succeeded.
[build] Ignored: 4 packages were skipped or are blacklisted.
[build] Warnings: 1 packages succeeded with warnings.
[build] Abandoned: None.
[build] Failed: 1 packages failed.
[build] Runtime: 1 minute and 8.7 seconds total.

I am sure that the installed version is dynablox/release. ( https://github.com/ethz-asl/lidar_undistortion/tree/dynablox/release)

I would be grateful for any help!

Problem about doals dataset

The DOALS dataset provides a point cloud as well as an absolute pose (/T_map_os1_lidar). We extracted the point cloud and interpolated the poses to obtain the pose at which each point cloud frame is located. However, the maps obtained from registration show obvious misalignment, which means there is a large error in the poses. Could you please describe the processing method for this dataset?
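A minimal sketch of the pose interpolation described in the question above (translation only; rotations would additionally need slerp; this fragment is purely illustrative, not the dataset's official tooling):

```python
# Linearly interpolate a translation at a point cloud's timestamp t,
# given two stamped poses (t0, p0) and (t1, p1).
def interp_translation(t, t0, p0, t1, p1):
    """Linear interpolation of translation between timestamps t0 and t1."""
    a = (t - t0) / (t1 - t0)
    return tuple((1 - a) * c0 + a * c1 for c0, c1 in zip(p0, p1))

p = interp_translation(0.5, 0.0, (0.0, 0.0, 0.0), 1.0, (2.0, 0.0, 0.0))
print(p)  # (1.0, 0.0, 0.0)
```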

voxel_map[voxel_index].push_back(i);

Hi, I tried to build the package but I get this error:
Errors << catkin_lidar/logs/dynablox_ros/build.make.001.log
catkin_lidar/src/dynablox/dynablox_ros/src/motion_detector.cpp: In member function ‘void dynablox::MotionDetector::blockwiseBuildPointMap(const Cloud&, const BlockIndex&, voxblox::AlignedVector&, dynablox::VoxelToPointMap&, std::vector<std::pair<Eigen::Matrix<int, 3, 1>, Eigen::Matrix<int, 3, 1> > >&, dynablox::CloudInfo&) const’:
/home/ideajoyan/catkin_lidar/src/dynablox/dynablox_ros/src/motion_detector.cpp:310:39: error: no matching function for call to ‘std::vector<Eigen::Matrix<int, 3, 1>, Eigen::aligned_allocator<Eigen::Matrix<int, 3, 1> > >::push_back(size_t&)’
310 | voxel_map[voxel_index].push_back(i);
| ^
In file included from /usr/include/c++/9/vector:67,
from /home/catkin_lidar/src/dynablox/dynablox_ros/include/dynablox_ros/motion_detector.h:9,
from /home/catkin_lidar/src/dynablox/dynablox_ros/src/motion_detector.cpp:1:
/usr/include/c++/9/bits/stl_vector.h:1184:7: note: candidate: ‘void std::vector<_Tp, _Alloc>::push_back(const value_type&) [with _Tp = Eigen::Matrix<int, 3, 1>; _Alloc = Eigen::aligned_allocator<Eigen::Matrix<int, 3, 1> >; std::vector<_Tp, _Alloc>::value_type = Eigen::Matrix<int, 3, 1>]’
1184 | push_back(const value_type& __x)
| ^~~~~~~~~
/usr/include/c++/9/bits/stl_vector.h:1184:35: note: no known conversion for argument 1 from ‘size_t’ {aka ‘long unsigned int’} to ‘const value_type&’ {aka ‘const Eigen::Matrix<int, 3, 1>&’}
1184 | push_back(const value_type& __x)
| ~~~~~~~~~~~~~~~~~~^~~
/usr/include/c++/9/bits/stl_vector.h:1200:7: note: candidate: ‘void std::vector<_Tp, _Alloc>::push_back(std::vector<_Tp, _Alloc>::value_type&&) [with _Tp = Eigen::Matrix<int, 3, 1>; _Alloc = Eigen::aligned_allocator<Eigen::Matrix<int, 3, 1> >; std::vector<_Tp, _Alloc>::value_type = Eigen::Matrix<int, 3, 1>]’
1200 | push_back(value_type&& __x)
| ^~~~~~~~~
/usr/include/c++/9/bits/stl_vector.h:1200:30: note: no known conversion for argument 1 from ‘size_t’ {aka ‘long unsigned int’} to ‘std::vector<Eigen::Matrix<int, 3, 1>, Eigen::aligned_allocator<Eigen::Matrix<int, 3, 1> > >::value_type&&’ {aka ‘Eigen::Matrix<int, 3, 1>&&’}
1200 | push_back(value_type&& __x)
| ~~~~~~~~~~~~~^~~
make[2]: *** [CMakeFiles/dynablox_ros.dir/build.make:89: CMakeFiles/dynablox_ros.dir/src/motion_detector.cpp.o] Error 1
make[2]: *** Waiting for unfinished jobs....
make[1]: *** [CMakeFiles/Makefile2:1252: CMakeFiles/dynablox_ros.dir/all] Error 2
make: *** [Makefile:141: all] Error 2

Questions

Thanks for the great work and sharing.

I have a couple of questions

  1. outdoor results or videos are available?
  2. any tips to tune parameters for outdoor (i.e., more sparse and fast dynamic objects, even more ego pose errors) datasets?

Program Crash in official dataset with seq0 and seq1

Information:

  • ROS noetic
  • The crash occurs after about 1 minute of running
  • Default config and code
  • Dataset: hauptgebaeude/sequence_1 and hauptgebaeude/sequence_2

Crash message:

motion_detector: /usr/include/eigen3/Eigen/src/Core/DenseCoeffsBase.h:117: Eigen::DenseCoeffsBase<Derived, 0>::CoeffReturnType Eigen::DenseCoeffsBase<Derived, 0>::operator()(Eigen::Index, Eigen::Index) const [with Derived = Eigen::Matrix<float, -1, -1>; Eigen::DenseCoeffsBase<Derived, 0>::CoeffReturnType = const float&; Eigen::Index = long int]: Assertion `row >= 0 && row < rows() && col >= 0 && col < cols()' failed.
*** Aborted at 1683150283 (unix time) try "date -d @1683150283" if you are using GNU date ***
PC: @     0x7f161673900b gsignal
*** SIGABRT (@0x7740) received by PID 30528 (TID 0x7f15dbfff700) from PID 30528; stack trace: ***
    @     0x7f161707a781 google::(anonymous namespace)::FailureSignalHandler()
    @     0x7f1616739090 (unknown)
    @     0x7f161673900b gsignal
    @     0x7f1616718859 abort
    @     0x7f1616718729 (unknown)
    @     0x7f1616729fd6 __assert_fail
    @     0x7f161582d851 Eigen::DenseCoeffsBase<>::operator()()
    @     0x7f161582cd8c voxblox::ProjectiveTsdfIntegrator<>::interpolate()
    @     0x7f16158407ed voxblox::ProjectiveTsdfIntegrator<>::updateTsdfVoxel()
    @     0x7f161583f3de voxblox::ProjectiveTsdfIntegrator<>::updateTsdfBlocks()
    @     0x7f161584683c _ZSt13__invoke_implIvMN7voxblox24ProjectiveTsdfIntegratorILNS0_19InterpolationSchemeE3EEEFvRKN5kindr7minimal26QuatTransformationTemplateIfEERKN5Eigen6MatrixIfLin1ELin1ELi0ELin1ELin1EEERKSt13unordered_setINSB_IiLi3ELi1ELi0ELi3ELi1EEENS0_12AnyIndexHashESt8equal_toISG_ENSA_17aligned_allocatorISG_EEEbEPS3_JS7_SC_SM_bEET_St21__invoke_memfun_derefOT0_OT1_DpOT2_
    @     0x7f161584664f _ZSt8__invokeIMN7voxblox24ProjectiveTsdfIntegratorILNS0_19InterpolationSchemeE3EEEFvRKN5kindr7minimal26QuatTransformationTemplateIfEERKN5Eigen6MatrixIfLin1ELin1ELi0ELin1ELin1EEERKSt13unordered_setINSB_IiLi3ELi1ELi0ELi3ELi1EEENS0_12AnyIndexHashESt8equal_toISG_ENSA_17aligned_allocatorISG_EEEbEJPS3_S7_SC_SM_bEENSt15__invoke_resultIT_JDpT0_EE4typeEOST_DpOSU_
    @     0x7f161584649f _ZNSt6thread8_InvokerISt5tupleIJMN7voxblox24ProjectiveTsdfIntegratorILNS2_19InterpolationSchemeE3EEEFvRKN5kindr7minimal26QuatTransformationTemplateIfEERKN5Eigen6MatrixIfLin1ELin1ELi0ELin1ELin1EEERKSt13unordered_setINSD_IiLi3ELi1ELi0ELi3ELi1EEENS2_12AnyIndexHashESt8equal_toISI_ENSC_17aligned_allocatorISI_EEEbEPS5_S9_SE_SO_bEEE9_M_invokeIJLm0ELm1ELm2ELm3ELm4ELm5EEEEvSt12_Index_tupleIJXspT_EEE
    @     0x7f16158463d1 _ZNSt6thread8_InvokerISt5tupleIJMN7voxblox24ProjectiveTsdfIntegratorILNS2_19InterpolationSchemeE3EEEFvRKN5kindr7minimal26QuatTransformationTemplateIfEERKN5Eigen6MatrixIfLin1ELin1ELi0ELin1ELin1EEERKSt13unordered_setINSD_IiLi3ELi1ELi0ELi3ELi1EEENS2_12AnyIndexHashESt8equal_toISI_ENSC_17aligned_allocatorISI_EEEbEPS5_S9_SE_SO_bEEEclEv
    @     0x7f16158463a2 _ZNSt6thread11_State_implINS_8_InvokerISt5tupleIJMN7voxblox24ProjectiveTsdfIntegratorILNS3_19InterpolationSchemeE3EEEFvRKN5kindr7minimal26QuatTransformationTemplateIfEERKN5Eigen6MatrixIfLin1ELin1ELi0ELin1ELin1EEERKSt13unordered_setINSE_IiLi3ELi1ELi0ELi3ELi1EEENS3_12AnyIndexHashESt8equal_toISJ_ENSD_17aligned_allocatorISJ_EEEbEPS6_SA_SF_SP_bEEEEE6_M_runEv
    @     0x7f16169d9de4 (unknown)
    @     0x7f16166db609 start_thread
    @     0x7f1616815133 clone

errors encountered when compiling

image
When compiling ouster_ros, I encountered a problem with spdlog. I'd appreciate it a lot if you could give me some advice on solving it!

questions about real-time adaptation

Thank you for the great code.
I am currently working with the code and have a few questions.
Is it possible to receive Velodyne points or MID-360 point clouds in real-time and process them using the package?

Also, I noticed in the issues that you mentioned a "localized point cloud."
Does the term "localized point cloud" refer to the point cloud and state that come from other point cloud-based SLAM systems?

thanks for reading

grpc compile ??

When compiling, the grpc package gets stuck at 12%. What can I do?
fd9800dda98cc7b52ddfe041589992f

Not identifying dynamic obstacles

I am having problems identifying dynamic obstacles. I am using the default config file. Dynablox identifies static obstacles very well and in real time, but the moving obstacle in the environment is also classified as static. In the video it is possible to see the obstacle (labeled static by dynablox) moving.

dynablox_not_identify_dym.mp4

Do you know why this is happening? In the provided bag files, dynablox identifies the dynamic obstacles correctly.

The frequencies are:
pointcloud: 50Hz
tf: 700Hz
static obstacles (/motion_detector/visualization/detections/object/static): 33Hz

Thank you very much for your attention

dynablox my own dataset

Hi,

thanks for your fabulous work on dynamic object detection!

I want to run dynablox on my own dataset. Could you give me some advice on how to set it up?

I appreciate your help!

DOALS LiDAR undistortion

Thank you very much for this well documented work and source code.

I have a question concerning the lidar undistortion for the DOALS sequences. I decided to ask it here instead of in lidar_undistortion, in the hope of a faster answer :)
Maybe I don't understand correctly what the undistortion node is supposed to do. Looking at the cloud that it outputs (topic "/pointcloud"), there are still distortion artefacts visible at the beginning/end of each scan line; see the image.
doals_distortion

Am I doing something wrong or is it supposed to look like this? I strictly followed the commands from the Readme running run_experiment.launch.
The reason I am asking is that I want to project the pointcloud onto a spherical image which is hindered by the distortion.

Thanks!

real time run?

Can this package run in real time on the cloud topic? Which sensors are needed?

The detection results in RViz stutter heavily

Thank you for open sourcing this excellent work!
When I use Velodyne-16 lidar data in Gazebo to test the algorithm, the detection results in RViz stutter heavily, as shown in the video below:

gazebo1.mp4

In addition, the terminal keeps printing (I ignore it for now):

Failed to find match for field 't'.
Failed to find match for field 'reflectivity'.
Failed to find match for field 'ambient'.
Failed to find match for field 'range'.

Some changes are as follows:

  1. Modify the lidar hardware parameters in "default.yaml", replacing the original values

    sensor_horizontal_resolution: 2048
    sensor_vertical_resolution: 64
    sensor_vertical_field_of_view_degrees: 33.22222

    with the Velodyne-16 ones:

    sensor_horizontal_resolution: 720
    sensor_vertical_resolution: 16
    sensor_vertical_field_of_view_degrees: 30

  2. Modify the lidar topic and tf in "play_doals_data.launch":
 <!-- <remap from="pointcloud" to="/os1_cloud_node/points" />
 <remap from="~pointcloud_corrected" to="/pointcloud" />
 <param name="odom_frame_id" value="map" />
 <param name="lidar_frame_id" value="os1_lidar" /> -->

 <remap from="pointcloud" to="/velodyne_points" />
 <remap from="~pointcloud_corrected" to="/pointcloud" />
 <param name="odom_frame_id" value="odom" />
 <param name="lidar_frame_id" value="velodyne" />

What else do I need to modify? I would be grateful for any help!

How can Dynablox be used for motion planning?

Hello, great job! As we know, Voxblox is used not only for reconstructing 3D meshes but also for generating ESDF maps for robot motion planning. My question is: How can Dynablox be used for motion planning?

The planning example for Voxblox mentioned here https://voxblox.readthedocs.io/en/latest/pages/Using-Voxblox-for-Planning.html states that you should use esdf_server node and subscribe to the incremental mapping data in the planner node. However, in Dynablox, it uses tsdf_server instead. I am wondering if there is an easy method to generate an ESDF map while also considering dynamic obstacle detection using Dynablox?

I sincerely seek your guidance and hope to receive your kind advice.
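The masking idea implicit in the question above can be illustrated with a purely hypothetical toy (this is NOT voxblox's esdf_server or any real API): exclude voxels currently labeled dynamic from the occupied set before computing a distance field, so a planner does not treat moving objects as permanent obstacles.

```python
# Brute-force 1D toy distance field that ignores dynamic-labeled voxels.
def distance_field(cells, occupied, dynamic):
    """Distance of each cell to the nearest static occupied cell."""
    static_occ = occupied - dynamic
    return [min((abs(c - o) for o in static_occ), default=float("inf"))
            for c in cells]

cells = list(range(6))
occupied = {0, 3}
dynamic = {3}  # voxel 3 currently holds a moving object
print(distance_field(cells, occupied, dynamic))  # [0, 1, 2, 3, 4, 5]
```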

is there any way to use the clustering of the dynamic points alone?

thanks for the really nice repo and code.
i have tried the dataset and seen the result of the clustering.
however, as i am using a jetson orin 64gb developer kit, the fps of the clustering was a bit slow.

is there a reason for this? if it is possible,
what i want to accomplish is running the clustering algorithm alone with fastlio2,
without building the TSDF map. is that possible?

thanks for reading and have a great day!
(and i just figured out that dynablox is included in nvblox; i will try that one and tell you how it works.
again, have a great day!)
