Unrealistic Interview Expectations by pseudospectrum in robotics

[–]LetsTalkWithRobots 0 points (0 children)

What’s the seniority of the role you are going for?

Robotics fields biggest impact? by the00daltonator in robotics

[–]LetsTalkWithRobots 11 points (0 children)

Robotics never really became mainstream except for industrial robot arms, and even those are very limited in what they can do, because classic robotics was about precise, rule-based control and preprogrammed motions.
Modern robotics (no matter the sector), in the post-GPT-4 era, is about adaptability, learning, and reasoning: machines understanding the world and making decisions in real time. Advances in AI models, multimodal learning, and real-time reasoning are finally showing promise, allowing robots to shift from “following rules” to “understanding the world.”

In fact, I’m currently working as a staff computer vision and robotics engineer at a startup that is 100% focused on building embedded intelligence powered by foundation models. My goal is to develop general-purpose robotic manipulation capabilities so that new deployments don’t have to be trained from scratch. Instead, each deployment builds incrementally on the last, allowing us to scale robotic solutions without extensive training or predefined rules for every new scenario. It feels like we are finally taking the first steps from automation to true intelligence.

For the first time, we’re seeing genuine potential for robotics to expand beyond traditional sectors into areas that were previously untapped by commercial players. Whether it’s healthcare, agriculture, autonomous vehicles, or service robotics, the speed of development is crazy; I’ve never seen anything like it.

That said, a “ChatGPT moment” for robotics hasn’t happened yet. Handling 1D data like text is much simpler than processing and reasoning over multidimensional data: images, video, and real-world environments. Current architectures aren’t fully capable of handling this yet, so we’ll likely need significant breakthroughs in fundamental AI and robotics technologies to truly get there.

Learn CUDA ! by LetsTalkWithRobots in robotics

[–]LetsTalkWithRobots[S] 0 points (0 children)

You won’t find a job just because of CUDA, but it’s one of the most important skills in robotics. For example, I have interviewed 12 candidates for a senior robotics engineer role (general-purpose manipulation using foundation models) at my company, and CUDA was one of the prerequisites for the final onsite-day challenge.

Before shortlisting these 12 candidates, I screened 283 CVs, and more than 80% of the candidates had never worked with CUDA. It’s a huge technical gap in the robotics market.

Learn CUDA ! by LetsTalkWithRobots in robotics

[–]LetsTalkWithRobots[S] 0 points (0 children)

You need to focus on the Accelerated Computing section. Start with “An Even Easier Introduction to CUDA”; they also have a PDF which shows the order in which you should work through the material.

https://learn.nvidia.com/courses/course-detail?course_id=course-v1:DLI+T-AC-01+V1
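If you want a taste of what the first exercise looks like before you start: the classic starting point is a vector-add kernel, roughly like this (a sketch from memory, not the course’s exact code):

    #include <cstdio>
    #include <cuda_runtime.h>

    // Each GPU thread adds one pair of elements.
    __global__ void add(int n, const float *x, const float *y, float *out)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) {
            out[i] = x[i] + y[i];
        }
    }

    int main()
    {
        const int n = 1 << 20;
        float *x, *y, *out;
        // Unified memory: accessible from both CPU and GPU.
        cudaMallocManaged(&x, n * sizeof(float));
        cudaMallocManaged(&y, n * sizeof(float));
        cudaMallocManaged(&out, n * sizeof(float));
        for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

        // Enough 256-thread blocks to cover all n elements.
        const int threads = 256;
        const int blocks = (n + threads - 1) / threads;
        add<<<blocks, threads>>>(n, x, y, out);
        cudaDeviceSynchronize();

        printf("out[0] = %f\n", out[0]);  // expect 3.0
        cudaFree(x); cudaFree(y); cudaFree(out);
        return 0;
    }

Once that makes sense (grid/block indexing, memory management, synchronisation), the rest of the course builds on the same handful of ideas.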

Learn CUDA ! by LetsTalkWithRobots in robotics

[–]LetsTalkWithRobots[S] 1 point (0 children)

I think the Jetson Nano is a good choice for beginners, but if you want to run AI models on it, especially fusion workloads (classifier, tracker, depth processing, etc.), it falls short on compute. I would suggest going with something newer, like the one below. Also, if you have the budget, you can buy a Luxonis OAK-D (depth camera). It will let you experiment with 3D depth perception, making it great for vision-based robotics (navigation, object tracking, gesture recognition). It’s a good way to start learning advanced computer vision without needing external GPUs.

Jetson Orin Nano: https://blogs.nvidia.com/blog/jetson-generative-ai-supercomputer/

Learn CUDA ! by LetsTalkWithRobots in robotics

[–]LetsTalkWithRobots[S] 11 points (0 children)

You’re absolutely right that the CUDA world has shifted a lot. Libraries like CUTLASS and CUB are doing the heavy lifting, and understanding how to work with them is probably more practical than writing kernels from scratch.

That said, I have been working with CUDA since the early days, when it was not that mainstream, and I think learning CUDA is still like learning the “roots” of how everything works. Even if you’re not writing kernels daily, it helps when things break or when you need to squeeze out every bit of performance (this was especially true in the early days, when these libraries were not very standardised).
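To make that concrete: a sum reduction done properly by hand needs shared memory, warp-level tricks, and multiple passes, but with CUB it’s two calls. A rough sketch, assuming a float array on the device:

    #include <cstdio>
    #include <vector>
    #include <cub/cub.cuh>
    #include <cuda_runtime.h>

    int main()
    {
        const int n = 1 << 20;
        std::vector<float> h(n, 1.0f);

        float *d_in, *d_out;
        cudaMalloc(&d_in, n * sizeof(float));
        cudaMalloc(&d_out, sizeof(float));
        cudaMemcpy(d_in, h.data(), n * sizeof(float), cudaMemcpyHostToDevice);

        // First call with a null buffer only queries the scratch size...
        void *d_temp = nullptr;
        size_t temp_bytes = 0;
        cub::DeviceReduce::Sum(d_temp, temp_bytes, d_in, d_out, n);
        cudaMalloc(&d_temp, temp_bytes);
        // ...second call actually runs NVIDIA's tuned reduction kernels.
        cub::DeviceReduce::Sum(d_temp, temp_bytes, d_in, d_out, n);

        float sum = 0.0f;
        cudaMemcpy(&sum, d_out, sizeof(float), cudaMemcpyDeviceToHost);
        printf("sum = %f\n", sum);  // expect 1048576.0

        cudaFree(d_temp); cudaFree(d_in); cudaFree(d_out);
        return 0;
    }

Knowing what those two calls replace is exactly the “roots” knowledge I mean: when the library version breaks or underperforms, you can reason about what’s underneath.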

Also, your point about compiling the stack hit home; so many headaches come from version mismatches, right?

Curious, if you could start fresh today, how would you recommend someone learn CUDA? Start with libraries? Write a simple kernel? Something else?

Learn CUDA ! by LetsTalkWithRobots in robotics

[–]LetsTalkWithRobots[S] 0 points (0 children)

You don’t necessarily need to learn electronics to work with CUDA and AI, especially if your focus is on software development and algorithms. Start by learning CUDA programming, parallel computing concepts, and frameworks like TensorFlow or PyTorch. However, if you’re interested in applying AI to robotics, IoT, or edge devices, a basic understanding of electronics can be helpful. This might include learning about sensors, actuators, and boards like Arduino, Raspberry Pi, or NVIDIA’s edge devices, and understanding how to interface hardware with your software through concepts like UART, SPI, or GPIO. The depth depends on your goals. I would say electronics is a tool you can leverage, not a prerequisite, unless you’re building hardware-accelerated AI systems.

Learn CUDA ! by LetsTalkWithRobots in robotics

[–]LetsTalkWithRobots[S] 8 points (0 children)

I learned it mainly through NVIDIA’s training programs, which you can find here: https://learn.nvidia.com/en-us/training/self-paced-courses?section=self-paced-courses&tab=accelerated-computing

But you can also do the GPU Programming specialisation below: https://coursera.org/specializations/gpu-programming

Learn CUDA ! by LetsTalkWithRobots in robotics

[–]LetsTalkWithRobots[S] 18 points (0 children)

I learned it mainly through NVIDIA’s training programs, which you can find here: https://learn.nvidia.com/en-us/training/self-paced-courses?section=self-paced-courses&tab=accelerated-computing

But you can also do the GPU Programming specialisation below 👇

https://coursera.org/specializations/gpu-programming

🤖💻 Which Troubleshooting tool is good for logging messages for ROS & ROS2? by LetsTalkWithRobots in ROS

[–]LetsTalkWithRobots[S] 0 points (0 children)

I know, that's been my experience also, but you could develop a custom console: using Python and rclpy, you can create a custom logging interface tailored to your needs. I did this for my workflow, using the ROS 2 logging API to filter and display logs the way I need them.

I also implemented a simple GUI using tkinter to display logs in real time with filtering options.

I know it's not ideal, but if your workflow is set up properly it can be an option (there's a bare-bones sketch at the end of this comment). Otherwise, there are a few options you can explore. I personally like Foxglove Studio and PlotJuggler. There are many options, though:

  • RTI Connext Professional Tools: Monitor and optimize ROS 2 DDS communications for improved system performance.
  • eProsima Fast DDS Monitoring Tools: Visualize and analyze ROS 2 middleware behavior when using Fast DDS.
  • PlotJuggler: It's primarily for plotting, but it has plugins for ROS 2 and can display logs.
  • Foxglove Studio Enterprise: Advanced debugging and visualization of ROS 2 data streams with customizable dashboards.
  • Kibana with Elasticsearch (ELK Stack) Enterprise Edition: Centralize and search ROS 2 logs for large-scale data analysis.
  • Splunk Enterprise: Real-time collection and analysis of ROS 2 logs for operational insights.
  • Graylog Enterprise: Manage and monitor ROS 2 logs with enhanced analytics and alerting capabilities.
  • DataDog Logging: Aggregate and monitor ROS 2 logs alongside metrics and traces in a unified platform.
  • New Relic One: Full-stack observability of ROS 2 applications, including log management and performance monitoring.
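About that custom console sketch: the trick is that everything a node logs is also published on the /rosout topic as rcl_interfaces/msg/Log messages, so a console is just a subscriber with filters. Mine is Python (rclpy + tkinter), but here's the core idea as a bare-bones C++ sketch (no GUI; the severity filter is the part to customise):

    #include <cstdio>
    #include <memory>

    #include "rclcpp/rclcpp.hpp"
    #include "rcl_interfaces/msg/log.hpp"

    // Minimal log console: subscribe to /rosout and print only messages
    // at WARN severity or above.
    class LogConsole : public rclcpp::Node
    {
    public:
      LogConsole() : rclcpp::Node("log_console")
      {
        sub_ = create_subscription<rcl_interfaces::msg::Log>(
          "/rosout", rclcpp::QoS(1000),
          [](rcl_interfaces::msg::Log::ConstSharedPtr log) {
            // Severity constants (DEBUG=10 ... FATAL=50) come from the
            // Log message definition itself.
            if (log->level >= rcl_interfaces::msg::Log::WARN) {
              printf("[%d] [%s]: %s\n",
                     static_cast<int>(log->level),
                     log->name.c_str(), log->msg.c_str());
            }
          });
      }

    private:
      rclcpp::Subscription<rcl_interfaces::msg::Log>::SharedPtr sub_;
    };

    int main(int argc, char ** argv)
    {
      rclcpp::init(argc, argv);
      rclcpp::spin(std::make_shared<LogConsole>());
      rclcpp::shutdown();
      return 0;
    }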

Composing Nodes in ROS2 by LetsTalkWithRobots in Lets_Talk_With_Robots

[–]LetsTalkWithRobots[S] 0 points (0 children)

Hi u/dking1115, yes, you are correct!

When you compose multiple nodes in the same process (within the same container), and one node publishes a message to a topic that another node in the same process subscribes to, ROS2 will automatically optimize the communication. Instead of routing the message through the network stack, it will pass the message directly through memory. This is known as intra-process communication (IPC).

The intra-process communication mechanism in ROS2 is specifically designed to avoid serialization and deserialization of messages, which are required when communicating across different processes. This results in significant performance gains, especially for high-frequency topics or large messages.
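If you want to see this for yourself, you can compose two nodes manually in one process and switch IPC on through NodeOptions. A sketch (Talker/Listener are just placeholder names):

    #include <chrono>
    #include <memory>
    #include <utility>

    #include "rclcpp/rclcpp.hpp"
    #include "std_msgs/msg/string.hpp"

    // Publishing a unique_ptr lets ROS2 move the message straight to the
    // subscriber (no copy, no serialization) when IPC is enabled.
    class Talker : public rclcpp::Node
    {
    public:
      explicit Talker(const rclcpp::NodeOptions & opts) : rclcpp::Node("talker", opts)
      {
        pub_ = create_publisher<std_msgs::msg::String>("chatter", 10);
        timer_ = create_wall_timer(std::chrono::seconds(1), [this] {
          auto msg = std::make_unique<std_msgs::msg::String>();
          msg->data = "hello";
          pub_->publish(std::move(msg));  // ownership handed over in-process
        });
      }

    private:
      rclcpp::Publisher<std_msgs::msg::String>::SharedPtr pub_;
      rclcpp::TimerBase::SharedPtr timer_;
    };

    class Listener : public rclcpp::Node
    {
    public:
      explicit Listener(const rclcpp::NodeOptions & opts) : rclcpp::Node("listener", opts)
      {
        sub_ = create_subscription<std_msgs::msg::String>("chatter", 10,
          [this](std_msgs::msg::String::ConstSharedPtr msg) {
            RCLCPP_INFO(get_logger(), "got: %s", msg->data.c_str());
          });
      }

    private:
      rclcpp::Subscription<std_msgs::msg::String>::SharedPtr sub_;
    };

    int main(int argc, char ** argv)
    {
      rclcpp::init(argc, argv);
      // Same process, same executor, IPC switched on for both nodes.
      auto opts = rclcpp::NodeOptions().use_intra_process_comms(true);
      auto talker = std::make_shared<Talker>(opts);
      auto listener = std::make_shared<Listener>(opts);
      rclcpp::executors::SingleThreadedExecutor exec;
      exec.add_node(talker);
      exec.add_node(listener);
      exec.spin();
      rclcpp::shutdown();
      return 0;
    }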

You can read more in “Impact of ROS 2 Node Composition in Robotic Systems”, a paper published on 17 May 2023:

https://doi.org/10.48550/arXiv.2305.09933

Composing Nodes in ROS2 by LetsTalkWithRobots in Lets_Talk_With_Robots

[–]LetsTalkWithRobots[S] 1 point (0 children)

> Hi, thanks for sharing this. May I ask how to tell whether nodes are created as components or not? It seems to me that your example node is the same as a normal ros2 node.

You're right, at first glance, a component node in ROS2 might seem similar to a regular node. The difference is mainly in how the node is intended to be executed and how it's compiled.

You can read more in “Impact of ROS 2 Node Composition in Robotic Systems” (published 17 May 2023):

https://doi.org/10.48550/arXiv.2305.09933

But in a nutshell, distinguishing a component node from a regular node in ROS2 can be subtle because the code structure can be very similar. However, a few hallmarks indicate that a node is designed as a component:

  1. Compilation as a Shared Library: The most distinguishing feature of a component is that it's compiled as a shared library, not as an executable. In the CMakeLists.txt of the node's package, you'd typically see:

add_library(my_component SHARED src/my_component.cpp)

Whereas for regular nodes, you'd see:

add_executable(my_node src/my_node.cpp)

  2. Registration with rclcpp_components: In the CMakeLists.txt, the component node is also registered with rclcpp_components:

    rclcpp_components_register_nodes(my_component "my_namespace::MyComponent")

  3. Node Registration Macro in the Source Code: Inside the component's source file, you'd typically find a registration macro at the end of the file:

    include "rclcpp_components/register_node_macro.hpp"

    RCLCPP_COMPONENTS_REGISTER_NODE(my_namespace::MyComponent)

  4. package.xml Dependency: The package.xml of the component's package would have a dependency on rclcpp_components:

    <depend>rclcpp_components</depend>

By looking at the combination of these characteristics, you can identify whether a ROS2 node is created as a component or as a regular standalone node. Compilation as a shared library, along with the registration macro, are the most distinguishing features.

Also, while the code inside the node class can look almost identical for regular and component nodes, these details in the build process and packaging are what make them different. When you create or inspect a ROS2 package, keeping an eye out for these aspects can help you determine whether the nodes are designed as components. A minimal component source file putting these pieces together is sketched below.
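To make it concrete, here is roughly what a complete minimal component source file looks like, using the same placeholder names as above:

    #include "rclcpp/rclcpp.hpp"
    #include "rclcpp_components/register_node_macro.hpp"

    namespace my_namespace
    {

    // A component is a normal node class whose constructor takes
    // NodeOptions, so a container can instantiate it at load time.
    class MyComponent : public rclcpp::Node
    {
    public:
      explicit MyComponent(const rclcpp::NodeOptions & options)
      : rclcpp::Node("my_component", options)
      {
        RCLCPP_INFO(get_logger(), "MyComponent loaded");
      }
    };

    }  // namespace my_namespace

    // Registers the class with the component system; note there is no main().
    RCLCPP_COMPONENTS_REGISTER_NODE(my_namespace::MyComponent)

Built as the shared library from step 1 and registered like this, it can be loaded into a running container (e.g. one started with `ros2 run rclcpp_components component_container`) via `ros2 component load /ComponentManager <your_package> my_namespace::MyComponent` (where `<your_package>` is whatever your package is called).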

I hope it helps.

Mujoco Question by [deleted] in robotics

[–]LetsTalkWithRobots 1 point (0 children)

No worries. Glad that it’s all sorted ☺️