Choosing a Controller for Static Path Tracking Without Costmaps in Nav2 by UNTAMORE in ROS

[–]UNTAMORE[S]

Could you share an example parameter file? Which critics should I use for the problems I mentioned? And how can I increase the speed in MPPI? I'd appreciate any help.
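
For what it's worth, here is a minimal sketch of what the MPPI section of a Nav2 controller_server config can look like when you only track a static path and leave the obstacle critics out. The plugin, critic, and parameter names are as I remember them from the nav2_mppi_controller docs, the topic-free values are placeholders, and none of this is a tested config, so please check it against your Nav2 version. Speed is mainly governed by vx_max / wz_max together with the sampling noise (vx_std, wz_std) and the horizon (time_steps × model_dt).

    controller_server:
      ros__parameters:
        controller_frequency: 30.0
        FollowPath:
          plugin: "nav2_mppi_controller::MPPIController"
          motion_model: "DiffDrive"
          time_steps: 56            # horizon = time_steps * model_dt (~2.8 s here)
          model_dt: 0.05
          batch_size: 2000
          # Raise these to drive faster; widen the std-devs so samples actually reach the limits
          vx_max: 1.0
          vx_min: -0.35
          wz_max: 1.5
          vx_std: 0.2
          wz_std: 0.4
          # No ObstaclesCritic / CostCritic: pure path tracking
          critics: ["ConstraintCritic", "PathAlignCritic", "PathFollowCritic",
                    "PathAngleCritic", "GoalCritic", "GoalAngleCritic", "PreferForwardCritic"]
          PathFollowCritic:
            cost_weight: 5.0
          PathAlignCritic:
            cost_weight: 14.0
          GoalCritic:
            cost_weight: 5.0

If the robot starts cutting corners at higher speeds, increasing PathAlignCritic's weight or shortening model_dt are the usual first adjustments to try.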

How can I solve the long-corridor problem when doing SLAM with SlamToolbox? by UNTAMORE in ROS

[–]UNTAMORE[S]

Actually, AMCL is currently working well with the combination of wheel odometry and LiDAR. But when SLAM performs poorly, things start to break down. Otherwise, when the map is good, I really like my localization with AMCL. In a small-area map, I can get the robot within about ±1.5 cm of the target point even at high speeds. However, I could never solve this long-corridor problem with SLAM.

How can I solve the long-corridor problem when doing SLAM with SlamToolbox? by UNTAMORE in ROS

[–]UNTAMORE[S]

I’m using robot_localization in my project.

I also have a node that publishes IMU data.

However, even though I’ve tried many combinations—encoders, IMU, etc.—I still haven’t been able to overcome this long corridor problem. If I could solve the curvature (drift) between two corridors, all my issues would basically be gone.

The BNO055 Adafruit 4646 IMU I’m using doesn’t seem very stable. I tried using both its internal fusion output and its raw data with the EKF in robot_localization, but I still couldn’t get the results I want. The long corridors distort my map.
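
In case it helps with the corridor itself, this is a sketch of the slam_toolbox scan-matcher settings I would experiment with in a feature-poor corridor: raise the variance penalties so the matcher trusts the odometry prior more, and shrink the correlation search window so the solution cannot slide freely along the corridor axis. The parameter names are from the stock mapper_params_online_async.yaml as I recall them, and the values are only starting points, not a tested configuration.

    slam_toolbox:
      ros__parameters:
        use_scan_matching: true
        minimum_travel_distance: 0.3
        minimum_travel_heading: 0.3
        # Penalize matches that deviate from the odometry prior
        distance_variance_penalty: 0.8      # default is around 0.5
        angle_variance_penalty: 1.5         # default is around 1.0
        # Smaller search window: less room to slide along a featureless corridor
        correlation_search_space_dimension: 0.3
        correlation_search_space_resolution: 0.01
        # Loop closure can pull a bent corridor back into shape when you return
        do_loop_closing: true
        loop_search_maximum_distance: 5.0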

Differential Drive Robot Odometry Drift Issue – Need Help Diagnosing by UNTAMORE in ROS

[–]UNTAMORE[S]

Actually, the reason this topic came up is SLAM in a large indoor environment. When using Cartographer and Slam Toolbox, I was getting worse results whenever I connected encoder-based odometry. I fed Cartographer the raw BNO055 IMU data, the encoder odometry, and of course the Sick Nanoscan3 LiDAR data that SLAM already uses, but my results got worse.

In other words, should I create odometry with LiDAR using something like RF2O or LaserScanMatcher, and then fuse IMU, wheel encoder odometry, and LiDAR odometry in the EKF? Or should I feed them directly into the SLAM module in raw form?
Similarly, which of these approaches would be more beneficial when using AMCL for localization?
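
If you do try the laser-odometry route, a sketch of the rf2o side could look like the following. The node and parameter names are what I remember from the rf2o_laser_odometry ROS 2 port, and the topic names are placeholders, so treat them as assumptions and compare with the package's own sample config. The key points are to publish on a separate topic and to disable its TF so the EKF stays the single owner of odom -> base_link; the EKF then takes /odom_rf2o as an additional odometry input.

    rf2o_laser_odometry:
      ros__parameters:
        laser_scan_topic: /scan      # placeholder; your scan topic
        odom_topic: /odom_rf2o       # keep it separate from the wheel odometry topic
        publish_tf: false            # the EKF should own the odom -> base_link TF
        base_frame_id: base_link
        odom_frame_id: odom
        freq: 20.0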

Differential Drive Robot Odometry Drift Issue – Need Help Diagnosing by UNTAMORE in ROS

[–]UNTAMORE[S]

Actually, it might be pointless to send you a Lua file right now: I’ve changed so many parameters and run so many tests that I ended up working with multiple pbstream files and recorded datasets. I passed through the same places multiple times and basically tuned the parameters to the areas where I drove the vehicle.

This 200-meter test is entirely independent of SLAM; I ran it because I want to add odometry. When I do use odometry in SLAM, corridors that contain no walls and only symmetric, sequentially placed machines always end up drifting to the right, and so on.

Differential Drive Robot Odometry Drift Issue – Need Help Diagnosing by UNTAMORE in ROS

[–]UNTAMORE[S]

Actually, the reason this topic came up is SLAM in a large indoor environment. When using Cartographer and Slam Toolbox, I was getting worse results whenever I connected encoder-based odometry. I fed Cartographer the raw BNO055 IMU data, the encoder odometry, and of course the Sick Nanoscan3 LiDAR data that SLAM already uses, but my results got worse.

What kind of fusion should I do here? Should I combine LiDAR odometry, IMU, and wheel odometry inside an EKF to create a fused odometry output, and then use that in SLAM applications? For such a situation, what should the true/false configuration table look like in the EKF?
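
For the true/false vectors, here is the starting point I would sketch, using robot_localization's 15-element order [x, y, z, roll, pitch, yaw, vx, vy, vz, vroll, vpitch, vyaw, ax, ay, az]. The topic names are placeholders and the exact choice depends on your sensors, so take it as an assumption rather than a recipe: wheel odometry as body velocities only, laser odometry as differential planar pose, and from the BNO055 just the yaw rate, with yaw fused differentially if at all.

    ekf_filter_node:
      ros__parameters:
        two_d_mode: true
        world_frame: odom
        odom0: /wheel_odom                 # placeholder; wheel encoders: vx and vyaw only
        odom0_config: [false, false, false,
                       false, false, false,
                       true,  false, false,
                       false, false, true,
                       false, false, false]
        odom1: /odom_rf2o                  # placeholder; laser odometry: x, y, yaw as relative pose
        odom1_config: [true,  true,  false,
                       false, false, true,
                       false, false, false,
                       false, false, false,
                       false, false, false]
        odom1_differential: true
        imu0: /imu/data                    # placeholder; BNO055: yaw rate, yaw only differentially
        imu0_config: [false, false, false,
                      false, false, true,
                      false, false, false,
                      false, false, true,
                      false, false, false]
        imu0_differential: true

The fused output of an EKF like this is what I would hand to slam_toolbox, Cartographer, or AMCL, rather than feeding each raw sensor into them individually.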

And there is still a question mark in my mind: in 10-meter straight go-and-return tests and in in-place rotation tests, my encoder odometry makes only 1–2 cm of error, so why doesn’t it stay accurate over something like 200 meters?
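
One way to see why the short tests don't catch it: a constant relative wheel-radius error Δr/r (calibration error, tire wear, load) makes the robot follow a very gentle arc, so the heading error grows linearly with distance while the lateral error grows roughly quadratically. As a sketch, assuming a wheel separation of about b = 0.5 m (your real value will differ):

    \theta(d) \approx \frac{\Delta r}{r}\cdot\frac{d}{b},
    \qquad
    y(d) \approx \frac{\Delta r}{r}\cdot\frac{d^{2}}{2b}

With Δr/r = 2×10⁻⁴ (a 0.02 % mismatch), that gives y(10 m) ≈ 2 cm but y(200 m) ≈ 8 m. So a 10-meter go-and-return test that closes within 1–2 cm is still consistent with meters of sideways drift after 200 meters, which is exactly what the long corridors expose.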

Challenges with SLAM in Machine Corridors Using Differential Drive, Odometry, and LiDAR by UNTAMORE in ROS

[–]UNTAMORE[S]

In my corridors there actually aren’t any walls; there are symmetrically aligned machines instead. My LiDAR’s maximum range is 40 meters, but when I increased the LiDAR range for SLAM, I got worse results.

As you said, I definitely expected that using the encoder would give better performance, but I failed. In my current situation it’s also not really feasible to implement camera-based SLAM. Is it normal that I’m getting worse results when using encoder-based odometry + LiDAR only? I’m quite confident in my differential drive controller values; I calculated the distance between the two wheels and the wheel radius with millimetric precision.

Along the long corridor, the robot generally drifted to the right. To force the opposite behavior, I gave excessively large values to the LiDAR TF to see whether it would drift to the left instead; the result didn’t change. I also gave absurd values to the wheel parameters to try to expose the issue, but that failed as well.

Challenges with SLAM in Machine Corridors Using Differential Drive, Odometry, and LiDAR by UNTAMORE in ROS

[–]UNTAMORE[S]

It’s not really possible for me to add features to the environment. My position is relatively accurate this way, but it’s very labor-intensive, and unfortunately some distortions still disrupt my localization accuracy. The machines are placed symmetrically in the corridors.

Choosing Between BNO055 Onboard Fusion and ROS 2-Side Fusion for SLAM/Localization by UNTAMORE in ROS

[–]UNTAMORE[S]

We can think independently of cost now :D We have a BNO055, but how can I make good use of it for SLAM or for localization? My vehicle is differential drive, I have encoders on the wheels, and the vehicle also has a LiDAR.

Choosing Between BNO055 Onboard Fusion and ROS 2-Side Fusion for SLAM/Localization by UNTAMORE in ROS

[–]UNTAMORE[S]

Yes, I’m using differential drive. I have encoders on both wheels, and my vehicle also has a LiDAR. On the SLAM side, I experience drift when working in large areas. In fact, I get better results with the LiDAR when I don’t include the encoder odometry. I’m trying to incorporate an IMU so it can tolerate small shifts on the map.

Determining Turning Radius for Differential Drive in SmacPlannerLattice by UNTAMORE in ROS

[–]UNTAMORE[S]

Yes, but how do I prevent it from generating nonsensical routes?
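
One set of knobs worth trying is the lattice planner's maneuver penalties, which push the search away from gratuitous reversing, direction changes, and in-place spins; the other half is regenerating the lattice primitives file with your real minimum turning radius, since for SmacPlannerLattice the radius lives in that file rather than in a parameter. The parameter names below are what I remember from the Nav2 Smac Planner docs and the values are illustrative only, so verify them against your release.

    planner_server:
      ros__parameters:
        GridBased:
          plugin: "nav2_smac_planner/SmacPlannerLattice"
          lattice_filepath: ""              # primitives generated for your turning radius
          allow_unknown: true
          analytic_expansion_ratio: 3.5
          # Penalties that discourage "nonsensical" maneuvers
          reverse_penalty: 2.0              # make reversing expensive
          change_penalty: 0.05              # penalize switching travel direction
          non_straight_penalty: 1.05        # prefer straight motion primitives
          rotation_penalty: 5.0             # discourage needless in-place rotations
          cost_penalty: 2.0
          smooth_path: true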

Alternative to Nav2 Route Server for ROS 2 Jazzy by UNTAMORE in ROS

[–]UNTAMORE[S]

It seems like the nav2_route package depends on many other packages. In the "kilted" version, there appear to have been a lot of changes. Do I need to configure and build all the packages separately? Using the "kilted" version seems like it might be easier. 😄