
lidar_localization_ros2's People

Contributors

naokitakahashi12, rsasaki0109, tzu-lin-huang-128


lidar_localization_ros2's Issues

Is adjustDistortion very important for localization?

Hi, thanks for your great work!

I have a question: is adjustDistortion very important for outdoor robot localization? My robot currently has no IMU, and I wonder what the influence or improvement of this function is. It would be better if some experimental comparison data could be provided.

Thanks for your attention; I look forward to your response.

map data

Hello, could you share the map data (bin_tc-2017-10-15-ndmap.pcd)?

Using a different sensor such as Ouster

Hi rsasaki, I have a problem. Localization works fine when I use the hdl_400 bag file, but there is a problem when I use an Ouster lidar and its bag file. I had the same issue with lidarslam_ros2, but I solved it there by changing "base_link" to "os_lidar" (inside params), "velodyne" to "os_lidar", and "velodyne_points" to "ouster/points".
With lidar_localization I made the same changes, but they didn't work this time. That is the general information about my problem.
I want to ask what kind of changes I should make when I use a different sensor such as Ouster.
ROS 2 Humble and Ubuntu 22.04

Cannot build on Humble (Ubuntu 22.04)

pcl_localization_component.cpp:4:20: error: no matching function for call to ‘tf2_ros::Buffer::Buffer()’
    4 | broadcaster_(this)
      |              ^
In file included from /home/terra/ros2_ws/src/pcl_localization_ros2/include/pcl_localization/pcl_localization_component.hpp:13,
                 from /home/terra/ros2_ws/src/pcl_localization_ros2/src/pcl_localization_component.cpp:1:
/opt/ros/humble/include/tf2_ros/tf2_ros/buffer.h:73:18: note: candidate: ‘tf2_ros::Buffer::Buffer(rclcpp::Clock::SharedPtr, tf2::Duration, rclcpp::Node::SharedPtr)’
   73 | TF2_ROS_PUBLIC Buffer(
      |                ^~~~~~
/opt/ros/humble/include/tf2_ros/tf2_ros/buffer.h:73:18: note: candidate expects 3 arguments, 0 provided
/home/terra/ros2_ws/src/pcl_localization_ros2/src/pcl_localization_component.cpp:4:20: error: no matching function for call to ‘tf2_ros::TransformListener::TransformListener()’
    4 | broadcaster_(this)
      |              ^
In file included from /home/terra/ros2_ws/src/pcl_localization_ros2/include/pcl_localization/pcl_localization_component.hpp:14,
                 from /home/terra/ros2_ws/src/pcl_localization_ros2/src/pcl_localization_component.cpp:1:
/opt/ros/humble/include/tf2_ros/tf2_ros/transform_listener.h:90:3: note: candidate: ‘template<class NodeT, class AllocatorT> tf2_ros::TransformListener::TransformListener(tf2::BufferCore&, NodeT&&, bool, const rclcpp::QoS&, const rclcpp::QoS&, const rclcpp::SubscriptionOptionsWithAllocator<AllocatorT>&, const rclcpp::SubscriptionOptionsWithAllocator<AllocatorT>&)’
   90 | TransformListener(
      | ^~~~~~~~~~~~~~~~~
/opt/ros/humble/include/tf2_ros/tf2_ros/transform_listener.h:90:3: note: template argument deduction/substitution failed:
/home/terra/ros2_ws/src/pcl_localization_ros2/src/pcl_localization_component.cpp:4:20: note: candidate expects 7 arguments, 0 provided
    4 | broadcaster_(this)
      |              ^

How can I solve it?

Pose drifts heavily

(Screenshot attached: 2023-03-03 15:29:01)
I tested the localization algorithm on a bag file and found that the pose drifts heavily. I wonder whether my setup is incorrect. If I set the "set_initial_pose" parameter to false, the path disappears, so I have to set it to true.

The pose cannot turn left or right during rosbag replay

Hi there. When I run ros2 bag play my_dataset.db3, I find that the pose cannot turn left or right and drifts heavily. Do you know why the drift happens? The attached video shows the problem; you can see the red pose vector never changes.

(Video attached: 20-04-23.18.05.55.webm)

Pose with Covariance?

Hi, are there plans to calculate the covariance matrices for the pose output? Would be really helpful for fusing the output with other sensors. Thanks!

Using odom

Hi Ryohei, amazing work on the repo

I got pretty much everything working robustly with just LiDAR. What is the best way to integrate my own odometry source, and which parameters should I tune? I set use_odom to true, but scan matching no longer seems to work.

Handling angular velocities

Hi, thanks for releasing the great package. I've been tinkering with it and the results have been promising so far. However, I noticed that the robot gets lost when turning at higher angular velocities. Is there any parameter you can recommend tweaking to resolve this issue? Thank you!

About ROS 2 Humble

Hi rsasaki, I'm getting a lot of help from you.
While trying to use the package, I remapped ouster/points to /cloud in the launch file because I want to use an Ouster lidar, but it didn't work: the node couldn't get the point cloud. I then looked at the point cloud subscription in pcl_localization_component.cpp and found the topic "velodyne_points", so I edited the launch file to change /cloud to /velodyne_points. After that it runs, but it does not work well; it's very inaccurate. Please help me.
(Three screenshots attached.)

Use odom tf instead of odom topic

Hey,
Instead of subscribing to an odometry topic, you could just transform the lidar point cloud to the odom frame.

This would allow your localization package to fix the error that odometry makes. Of course, the path you publish doesn't make sense anymore because it doesn't take odometry into account. But the combined transformation (map->odom->base_link) would be a more accurate pose of the robot.

I tested this with kiss-icp (a lidar odometry pipeline) and got a really good localization out of it.

This is the change I made to your code to make it work.

So in your cloudReceived callback you could add the transform after you filter out points that are too close or too far.

  // Keep only points within the configured scan range.
  pcl::PointCloud<pcl::PointXYZI> tmp;
  for (const auto & p : filtered_cloud_ptr->points) {
    const double r = std::hypot(p.x, p.y);  // horizontal distance from the sensor
    if (scan_min_range_ < r && r < scan_max_range_) {
      tmp.push_back(p);
    }
  }
  // Transform the filtered and pruned point cloud into the odom frame.
  sensor_msgs::msg::PointCloud2 pruned_cloud_ros;
  pcl::toROSMsg(tmp, pruned_cloud_ros);
  sensor_msgs::msg::PointCloud2 transformed_cloud;
  try {
    // Restore the original header (frame id and stamp) so tf2 can look up the transform.
    pruned_cloud_ros.header = msg->header;
    // transform(input, output, target_frame, timeout) converts the cloud into "odom".
    tfbuffer_.transform(pruned_cloud_ros, transformed_cloud, "odom", tf2::durationFromSec(0.1));  // TODO: get the odom frame from params
  } catch (tf2::TransformException & ex) {
    RCLCPP_WARN(get_logger(), "%s", ex.what());
    return;
  }

  pcl::PointCloud<pcl::PointXYZI>::Ptr transformed_cloud_ptr(new pcl::PointCloud<pcl::PointXYZI>());
  pcl::fromROSMsg(transformed_cloud, *transformed_cloud_ptr);

Additionally, I had to remove the static odom->velodyne transform from your launch file.

Unable to compile on Galactic

Hi, attempting colcon build --symlink-install returns:

pcl_localization_ros2/src/pcl_localization_component.cpp: In member function ‘virtual CallbackReturn PCLLocalization::on_activate(const rclcpp_lifecycle::State&)’:
pcl_localization_ros2/src/pcl_localization_component.cpp:80:42: error: no matching function for call to ‘rclcpp_lifecycle::LifecyclePublisher<sensor_msgs::msg::PointCloud2_<std::allocator<void> > >::publish(sensor_msgs::msg::PointCloud2_<std::allocator<void> >::SharedPtr&)’
   80 |     initial_map_pub_->publish(map_msg_ptr);

typo in parameter name

Hello Ryohei

The parameter transform_epsilon should be trans_epsilon in the parameter file (localization.yaml), as there is no parameter named transform_epsilon in the code.
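For reference, a sketch of the corrected entry in localization.yaml; the surrounding structure and the value below are illustrative, and only the name trans_epsilon comes from this report:

```yaml
pcl_localization:
  ros__parameters:
    trans_epsilon: 0.001  # correct name; "transform_epsilon" is not read by the code
```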

IMU Calibration

Thanks for your valuable work!

Currently, I am using the lidar undistortion function and embedding it into hdl_localization. However, when I turned the IMU on and played my bag, localization failed. I wonder if it's an IMU calibration problem. Have you done calibration in this function? (I didn't find an interface for setting the IMU parameters.)

IMU Orientation

Hello Ryohei
Do we need IMU data that includes orientation, or does your algorithm fuse the angular velocity and linear acceleration from the raw IMU data? I am asking because my IMU does not output orientation data.

cannot see the map in rviz

I added my own PCD map via the map_path parameter, but I cannot see the map in RViz. I also play my dataset.db3, but I cannot see the path.
Could you tell me what happened? Thank you so much!

Localization worse than with your lidarslam_ros2

Hi @rsasaki0109,

thank you for your great work. I have a question.

We used your lidarslam_ros2 for mapping and then your pcl_localization_ros2 to test localization in the resulting map, with the same rosbag and the map from lidarslam_ros2 as input.
Why is the localization so much worse? Did we forget something?

Thank you in advance

Greetings
Samuel
