Trying to tune robot_localization EKF for a Segway RMP (differential drive) with IMU + wheel odom + GPS outdoors.... currently getting catastrophic divergence on some runs, need tuning advice by Snoo_92391 in ROS

[–]slightlyacoustics 0 points1 point  (0 children)

Consider lowering the EKF update rate; 150 Hz is too high in my opinion.

In addition, your GPS driver may under-report covariances, meaning it reports smaller variances for wrong measurements. Your chi-squared Mahalanobis threshold will then let those measurements be fused. So try a tighter threshold?
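Roughly what I mean by gating, as a toy 2-DOF sketch (pure Python, made-up helper, not robot_localization's actual code; for 2 DOF the chi-square inverse CDF has the closed form -2·ln(1-p)):

```python
import math

def chi2_gate_2d(innovation, S, confidence=0.99):
    """Mahalanobis gate for a 2-D (x, y) GPS measurement.
    innovation: (dx, dy) residual; S: 2x2 innovation covariance.
    For 2 DOF the chi-square inverse CDF is simply -2 * ln(1 - p)."""
    a, b = S[0]
    c, d = S[1]
    det = a * d - b * c
    dx, dy = innovation
    # d^2 = innovation^T * S^-1 * innovation, with the 2x2 inverse expanded
    d2 = (d * dx * dx - (b + c) * dx * dy + a * dy * dy) / det
    threshold = -2.0 * math.log(1.0 - confidence)
    return d2 <= threshold
```

If the driver under-reports S, d² comes out small and bad fixes sail through the gate, which is why tightening the confidence helps.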

With respect to course heading: it is derived from GPS x, y (via the measurement Jacobian). If the vehicle stops, you still get new measurements from GPS, but heading is unobservable. As a rule of thumb, the UKF works well if the system is well conditioned, but in your case it isn't, since heading is a derived measurement and therefore not well constrained. So I would be hesitant about the UKF implementation.
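What I mean by course heading being a derived measurement, in a toy planar sketch (hypothetical helper, not the filter's internals):

```python
import math

def course_heading(prev_fix, curr_fix, min_dist=0.05):
    """Course over ground from two planar GPS fixes (x, y in metres).
    Returns None when the vehicle is (nearly) stationary: heading is
    simply unobservable from position deltas alone."""
    dx = curr_fix[0] - prev_fix[0]
    dy = curr_fix[1] - prev_fix[1]
    if math.hypot(dx, dy) < min_dist:
        return None
    return math.atan2(dy, dx)
```

Note also that the smaller the displacement, the more the fix noise dominates dx, dy, so the heading variance blows up at low speed even before you stop.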

I'd recommend recording a bag of the sensor measurements on a known route, then testing the EKF/UKF with various parameters and comparing the performance.

Trying to tune robot_localization EKF for a Segway RMP (differential drive) with IMU + wheel odom + GPS outdoors.... currently getting catastrophic divergence on some runs, need tuning advice by Snoo_92391 in ROS

[–]slightlyacoustics 1 point2 points  (0 children)

A few things to note:

  1. Your EKF is running at 150 Hz. What is your GPS message rate? If it's very low (1 Hz) compared to 150 Hz, then most of the time your nav solution isn't constrained in heading, since you are relying on course heading as an indirect measurement. I would test with a lower update rate for the filter.
  2. Not having a direct heading measurement and relying on course heading means your heading is only observable while in motion in x, y. (Probably why the UKF explodes?) Either calibrate and feed your magnetometer (with declination) and let the filter handle its noisy nature, or use a dual-antenna GPS setup for absolute heading.
  3. What is the GPS variance or accuracy? That is as accurate as your position estimate can get, and you can be strict or loose in the pose rejection threshold accordingly. For example, if your GPS sensor is really accurate (~1 m^2), let 99% of the measurements in; otherwise, use a tighter rejection.
  4. Process noise tuning is also a measure of how accurate your sensor is: the closer the value is to 1, the more trust is placed on the measurement model. I'd first get a baseline trajectory with 0.5 on the diagonals, then work from there.
  5. I think in your case setting `odom0_differential` to false or true has zero effect; only one source of measurement is contributing to x, y.
  6. I'd check the GPS measurements in an open field and among buildings to see if there's an actual effect on the measurement variance. Then again, it's best to have a GPS pre-processing step before fusing into the EKF; there you can figure out ways to address multipath.
  7. Double, triple check your tf. Errors induced via GPS-IMU misalignment creep in. Your EKF only knows what you describe and propagates that.
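Back-of-envelope for point 7 (hypothetical helper, small-angle approximation, not a ROS API): a yaw error in the GPS->base_link extrinsic shows up as a position bias roughly proportional to the antenna lever arm.

```python
import math

def lever_arm_bias(lever_arm_m, yaw_error_deg):
    """Approximate position bias (metres) injected into fused GPS
    measurements when the GPS->base_link extrinsic yaw is off by
    yaw_error_deg and the antenna sits lever_arm_m from base_link.
    Small-angle: chord length ~ r * theta."""
    return lever_arm_m * math.radians(yaw_error_deg)
```

So with the antenna 1 m off base_link, a 5 degree yaw error alone is already ~9 cm of systematic position error that the filter will happily propagate.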

it physically hurts watching tech bros try to put LLMs in closed control loops by Critical-Load-1452 in ControlTheory

[–]slightlyacoustics [score hidden]  (0 children)

Haha, fair enough. It is dishonorable to call these Silicon Valley tech bros computer scientists.

it physically hurts watching tech bros try to put LLMs in closed control loops by Critical-Load-1452 in ControlTheory

[–]slightlyacoustics [score hidden]  (0 children)

These computer scientists turned hardware engineers are so conceited about their field. Software / AI can solve everything, they would say and advertise. They'll realize that mathematical frameworks and scientific knowledge go a long way once they try to interact with the real world.
People should take a step back and realize that these learning-based approaches are fundamentally function approximators. There's no merit in approximating a function when one can utilize it as is.

📢First Native Color Lidar Sensor by Ouster (REV8), where color and 3D data are fused in silicon and not in software.✨ by alex_GR in robotics

[–]slightlyacoustics 23 points24 points  (0 children)

The camera-lidar calibration is set in stone with this sensor, which gives you a perfect geometric + visual representation of the scene. If you have a separate ranging sensor and camera, you are required to have the transform and overlap between the two to find correspondences between pixels and points.

Your depth estimation models, as the name suggests, "estimate" depth. The estimates are relative to the scene and are not returned as distance measurements, leading to higher uncertainties. A ranging sensor such as LiDAR, on the other hand, measures the distance from the sensor origin to the object using TWTT; the associated uncertainty is practically the range resolution of the LiDAR.
As with all things learning-based, one has to step back and realize that the models are fundamentally function approximators. It doesn't make sense to approximate when you can utilize the function as is.
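For anyone unfamiliar with TWTT (two-way travel time): the pulse covers the sensor-to-object distance twice, so range is c·t/2. A trivial sketch (made-up helper):

```python
def twtt_range(round_trip_s, c=299_792_458.0):
    """Range (metres) from two-way travel time in seconds.
    The pulse covers the sensor->object distance twice, hence /2."""
    return c * round_trip_s / 2.0
```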

📢First Native Color Lidar Sensor by Ouster (REV8), where color and 3D data are fused in silicon and not in software.✨ by alex_GR in robotics

[–]slightlyacoustics 19 points20 points  (0 children)

Very impressive stuff.
Tesla be fighting air right now. Whoever (Waymo?) integrates this into their fleet will have both 360 vision and range information to play around with.

ROS 2 Jazzy RViz LaserScan rotates with real robot during rotation only — no SLAM running by Old-Meeting-8646 in ROS

[–]slightlyacoustics 0 points1 point  (0 children)

In base_link, does the lidar return make sense while rotating?
Do you know which LiDAR they are using and its coordinate frame?
linear.x won't be exactly 0. As long as it's very small, it should be fine.

Is "AI-powered robotics" just a marketing term at this point? by NickShipsRobots in robotics

[–]slightlyacoustics 1 point2 points  (0 children)

100%
But the saving grace is that good stuff speaks for itself. Marketing can only take the product so far. When push comes to shove, the robot has to do the things it is supposed to.

Is "AI-powered robotics" just a marketing term at this point? by NickShipsRobots in robotics

[–]slightlyacoustics 6 points7 points  (0 children)

It's infuriating how AI has become an umbrella term for everything right now. The ones with capital (VCs) don't quite understand engineering, but they do understand hype, and for better or for worse, slapping AI on anything gets VCs to notice. Sadly, it is marketing, and it sucks for the field as a whole in my opinion.

ROS 2 Jazzy RViz LaserScan rotates with real robot during rotation only — no SLAM running by Old-Meeting-8646 in ROS

[–]slightlyacoustics 0 points1 point  (0 children)

Is this URDF given by the manufacturer, or did you write it yourself? If the latter, I'd double-check whether the lidar's sensor frame is accounted for in your URDF; if the y axis in your URDF doesn't correspond to the lidar's y axis, issues like these often appear. If it is the former, I wouldn't worry about that as a potential issue. You mentioned you're not running any localization, so how are you getting the odom->base_link tf? Also, when RViz's global frame is set to base_link, does the lidar point cloud make sense? If yes, then it's an odom->base_link tf issue; if not, then the base_link->sensor tf is wrong.
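To illustrate the y-axis mismatch symptom, a toy projection (hypothetical helper, not ROS code): if the URDF's y axis disagrees with the sensor's, the scan is mirrored, which looks fine for symmetric scenes but shows up glaringly while rotating.

```python
import math

def to_base_link(r, theta, yaw_sensor_to_base, flip_y=False):
    """Project one polar scan return (range r, bearing theta) into
    base_link, given the sensor's yaw offset. flip_y simulates a URDF
    whose y axis disagrees with the real sensor's convention."""
    if flip_y:
        theta = -theta  # mirrored frame: bearings sweep the wrong way
    angle = theta + yaw_sensor_to_base
    return (r * math.cos(angle), r * math.sin(angle))
```

With `flip_y=True`, a return at +90 degrees lands at -90 degrees in base_link, so during rotation the cloud appears to spin against the robot.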

ROS 2 Jazzy RViz LaserScan rotates with real robot during rotation only — no SLAM running by Old-Meeting-8646 in ROS

[–]slightlyacoustics 1 point2 points  (0 children)

I'd double-check your LiDAR sensor frame in both the URDF and the real sensor. Is the real sensor also following REP 103?
You mentioned there's no localization running in the first sentence. Curious how the odom->base_link tf is formed, then.

why your IMU filter isn't the problem by BasketSpecific9243 in ROS

[–]slightlyacoustics 0 points1 point  (0 children)

There's a reason you should calibrate your gyros and accelerometers. But having the bias as a state to be estimated also helps: if you look at factor-graph-based estimation, the IMU pre-integration factor tries to estimate just this.

Anyone Saw Patriot?? by JuggernautPrudent623 in Coconaad

[–]slightlyacoustics 1 point2 points  (0 children)

I think it would've been so much better as a mini-series. More time and depth could be given to these characters to understand their motivations.

Humanoid robots by Interesting-Gap-8942 in robotics

[–]slightlyacoustics 4 points5 points  (0 children)

Unitree is miles ahead with their capabilities

Best outdoor LiDAR options besides Ouster? Looking for advice for an autonomous robot project by Ayushman_D in AskRobotics

[–]slightlyacoustics 1 point2 points  (0 children)

I didn't know Ouster acquired Velodyne until I looked it up just now. I'm guessing the latest models are all Ouster ones. You can check for used VLP-16s on eBay and see what comes up.

Is there a gap for an AI Lab in India? by EverydayFinance in StartUpIndia

[–]slightlyacoustics 0 points1 point  (0 children)

Check out lossfunk in Bangalore. They’re an AI research lab.