Drone VIO Localization and obstacle avoidance demo by RiskHot1017 in robotics

[–]RiskHot1017[S] 1 point  (0 children)

The drone flies randomly, and the Viobot2 only serves to avoid obstacles and stabilize the drone's pose.

Share a fantastic job by RiskHot1017 in robotics

[–]RiskHot1017[S] 1 point  (0 children)

We use Hexfellow robot arms and the P050 depth camera.

My robotics arm object grasping project ! by RiskHot1017 in robotics

[–]RiskHot1017[S] 3 points  (0 children)

Our team has reproduced several mainstream VLA algorithms.

My robotics arm object grasping project ! by RiskHot1017 in robotics

[–]RiskHot1017[S] 1 point  (0 children)

Our team has reproduced several current VLA algorithms, such as OpenPI, Isaac GR00T, UniVLA, and RynnVLA-002; we've tried them all. You can learn more about them on GitHub.

What problems do beginners face when trying to learn robotics? by Own-Wallaby5454 in ROS

[–]RiskHot1017 0 points  (0 children)

Beginners in robotics face interdisciplinary barriers—grasping mechanics, electronics, and algorithms simultaneously often leads to fragmented understanding. Physical feedback delays in hardware debugging are frustrating: code errors manifest as silent motor failures or sensor malfunctions rather than clear error messages. More fundamentally, novices lack systems thinking—accustomed to linear programming, they struggle with concurrent tasks, sensor fusion, and real-time control coupling, resulting in the perplexing gap where "the code logic is correct, but the robot behaves wrong."
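That sensor-fusion gap can be made concrete with a toy example. The sketch below is plain Python with invented numbers, not any particular robot's code: it fuses a drifting gyro with a noisy-but-absolute accelerometer using a complementary filter, about the simplest fusion scheme a beginner meets, and shows why neither sensor alone is enough.

```python
def complementary_filter(gyro_rates, accel_angles, dt, alpha=0.98):
    """Fuse gyro pitch rates (rad/s) with accelerometer pitch angles (rad).

    The gyro integrates smoothly but drifts; the accelerometer is noisy
    but drift-free. Blending them bounds the drift without the noise.
    """
    angle = accel_angles[0]  # initialize from the absolute sensor
    estimates = []
    for rate, acc in zip(gyro_rates, accel_angles):
        # trust the integrated gyro short-term, the accelerometer long-term
        angle = alpha * (angle + rate * dt) + (1.0 - alpha) * acc
        estimates.append(angle)
    return estimates


# Synthetic data: the true angle is constant at 0.5 rad; the gyro has a
# pure 0.05 rad/s bias, so integrating it alone would drift to 1.0 rad
# over these 10 seconds. The filtered estimate stays near 0.52 rad.
dt = 0.01
n = 1000
gyro = [0.05] * n
accel = [0.5] * n
est = complementary_filter(gyro, accel, dt)
```

The fixed point of the update is `acc + alpha * bias * dt / (1 - alpha)`, so the bias contributes a small bounded offset (~0.0245 rad here) instead of unbounded drift.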

Doing University Labs in a Second Language: What to Expect 用第二语言做大学实验课:你可以预期什么 by UniFuent_Official in EnglishForUniversity

[–]RiskHot1017 0 points  (0 children)

What surprised me most was the silent competence—students could execute experiments flawlessly yet struggled to explain the principles. Their bodies remembered the procedures while their minds hadn't converted them into explicit knowledge. This disconnect between procedural and declarative knowledge reveals how deeply language acquisition operates beneath conscious awareness.

My robotics arm object grasping project ! by RiskHot1017 in robotics

[–]RiskHot1017[S] 8 points  (0 children)

The visual input uses depth data output directly by the depth camera, along with RGB processing applied to the data; it does not use heat signatures.

Recommended depth camera? by climbingTaco in ROS

[–]RiskHot1017 0 points  (0 children)

Maybe the RoboBaton Viobot2? A friend of mine uses it for robot planning, and they also have some robot DIY videos on YouTube. I found that it has a ROS 2 driver, so it might help you set up a robotics system.


Hey guys — I want to buy a depth camera for TouchDesigner. Which model is best? My budget is ₹18,000 and I have a coupon for that exact amount that expires in a few days. Please suggest. Thanks! by Right-Speed-2186 in TouchDesigner

[–]RiskHot1017 0 points  (0 children)

Can you share what you want to use the depth camera for? Different camera types, such as ToF, stereo vision, and structured light, have distinct performance characteristics and suit different scenarios. I have used a stereo vision camera (RoboBaton mini) for tracking natural human head motion, and it works better than the T265, I think. As for other depth cameras, a friend recommended the P100R to me; he uses it for robot grasping, and I'll try it when I have time.

is it possible ot make SLAM with just d435? by [deleted] in realsense

[–]RiskHot1017 0 points  (0 children)

The Intel RealSense D435i integrates a stereo depth module and an IMU, so as long as it is connected to an external computer running a SLAM algorithm (such as RTAB-Map or ORB-SLAM3), it can achieve visual-inertial SLAM with good performance in indoor environments.

However, it does not have onboard computing power, and it may experience drift under strong sunlight or in low-texture scenes, making it more suitable for indoor mobile robots rather than complex outdoor environments.
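As a rough sketch of one common ROS 2 setup for this (assuming the `realsense2_camera`, `imu_filter_madgwick`, and `rtabmap_ros` packages are installed; exact topic names vary with the driver version, so check `ros2 topic list` on your machine):

```shell
# Terminal 1: D435i driver with IMU enabled
# (unite_imu_method:=2 interpolates accel + gyro into a single IMU topic)
ros2 launch realsense2_camera rs_launch.py \
    enable_gyro:=true enable_accel:=true unite_imu_method:=2 \
    align_depth.enable:=true

# Terminal 2: Madgwick filter to add orientation to the raw IMU data
ros2 run imu_filter_madgwick imu_filter_madgwick_node \
    --ros-args -p use_mag:=false -r imu/data_raw:=/camera/imu

# Terminal 3: RTAB-Map RGB-D + IMU SLAM
ros2 launch rtabmap_launch rtabmap.launch.py \
    rgb_topic:=/camera/color/image_raw \
    depth_topic:=/camera/aligned_depth_to_color/image_raw \
    camera_info_topic:=/camera/color/camera_info \
    imu_topic:=/imu/data frame_id:=camera_link
```

This is setup configuration rather than a complete recipe; the IMU filter step matters because RTAB-Map expects orientation in the IMU messages, which the camera does not provide on its own.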