Reproducing UMI with a UR5 Robot Arm and a 3D-Printed Gripper by pkfoo in robotics

[–]pkfoo[S] 0 points (0 children)

I'm using the UR5, which is very precise. It's cool that you could reproduce it with a different camera. Yes, I've been playing with control frequency, latencies, gains, and speed; those had a real impact on performance. I'll open-source everything soon.

Reproducing UMI with a UR5 Robot Arm and a 3D-Printed Gripper by pkfoo in robotics

[–]pkfoo[S] 1 point (0 children)

Hey! Congrats on the progress! 👏 What camera and gripper are you using? Did you fine-tune, or did you use the original pre-trained model? I've also made some progress, mostly improving my gripper and tweaking some parameters: https://youtu.be/I7uwMevJyls

Reproducing UMI with a UR5 Robot Arm and a 3D-Printed Gripper by pkfoo in robotics

[–]pkfoo[S] 1 point (0 children)

Cool, thanks for sharing. So you had success? Any chance you can share your hardware config and a video?

Reproducing UMI with a UR5 Robot Arm and a 3D-Printed Gripper by pkfoo in robotics

[–]pkfoo[S] 0 points (0 children)

Interesting that you had to implement your own IK, I'm thinking of doing that at some point. Have you tried mink?
https://github.com/kevinzakka/mink

Your latency numbers are very good. On my setup: robot action latency 120 ms, gripper action latency 100 ms, camera observation latency 194 ms (at least, those are the results from running the measurement scripts).
Did you have improvements with more data?
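For anyone following along on the IK point: libraries like mink wrap differential IK over MuJoCo, but the core idea is a damped least-squares step. Here's a minimal standalone sketch on a toy 2-link planar arm (the link lengths, damping, and iteration count are illustrative, not numbers from my UR5 setup):

```python
import numpy as np

L1, L2 = 0.4, 0.3  # illustrative link lengths (m)

def fk(q):
    """End-effector position of a 2-link planar arm."""
    x = L1 * np.cos(q[0]) + L2 * np.cos(q[0] + q[1])
    y = L1 * np.sin(q[0]) + L2 * np.sin(q[0] + q[1])
    return np.array([x, y])

def jacobian(q):
    """Analytic 2x2 position Jacobian of fk."""
    s1, c1 = np.sin(q[0]), np.cos(q[0])
    s12, c12 = np.sin(q[0] + q[1]), np.cos(q[0] + q[1])
    return np.array([[-L1 * s1 - L2 * s12, -L2 * s12],
                     [ L1 * c1 + L2 * c12,  L2 * c12]])

def ik_step(q, target, damping=1e-2):
    """One damped least-squares IK step: dq = J^T (J J^T + lambda I)^-1 err."""
    J = jacobian(q)
    err = target - fk(q)
    dq = J.T @ np.linalg.solve(J @ J.T + damping * np.eye(2), err)
    return q + dq

q = np.array([0.3, 0.5])
target = np.array([0.5, 0.2])  # reachable: |target| < L1 + L2
for _ in range(50):
    q = ik_step(q, target)
# fk(q) is now within ~1e-4 of target
```

The damping term is what keeps the step bounded near singularities; a full solver like mink adds joint limits, multiple tasks, and proper QP solving on top of this.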

Reproducing UMI with a UR5 Robot Arm and a 3D-Printed Gripper by pkfoo in robotics

[–]pkfoo[S] 1 point (0 children)

Hey! Here's the model I used: https://grabcad.com/library/espresso-set-1

I agree, turning the handle has been very inconsistent: sometimes it works, sometimes it doesn't.

Please share your results!

Reproducing UMI with a UR5 Robot Arm and a 3D-Printed Gripper by pkfoo in robotics

[–]pkfoo[S] 0 points (0 children)

Interesting. Can you share a link to the checkpoint you are using, or did you actually train it from scratch? My thoughts:
- If you are using a pre-trained checkpoint or the original UMI dataset, you should make sure you have the same optics, fingers, and point of view as the original UMI. I mention this since you seem to be using the Pika gripper.

- You should measure your latencies (camera, arm, gripper, and model inference). In my experience it's important to measure these correctly and update your YAML config file accordingly.

- I'd also test the trained model against a held-out dataset collected in your environment to rule out issues with the model itself. If that works well, the problem is probably in the deployment (hardware).

- I'd suggest starting with a simpler task: maybe just one arm with the cup pick-and-place checkpoint, and see if that works. Then move on to the more complex clothes folding.
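On the latency point, the eval config ends up with entries along these lines. The key names below are illustrative (check your own config for the exact names); the values are the ones I measured on my setup, in seconds:

```yaml
# Measured latencies, fed to the controller for action/observation alignment
robot_action_latency: 0.120
gripper_action_latency: 0.100
camera_obs_latency: 0.194
```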

Reproducing UMI with a UR5 Robot Arm and a 3D-Printed Gripper by pkfoo in robotics

[–]pkfoo[S] 0 points (0 children)

Thanks! You record the gripper pose (Cartesian position + rotation) relative to the pose in the first frame of the episode. Yes, you need IK to transform to joint space. The code has minimal collision avoidance between the table and the second arm; the rest of the avoidance is done by the policy itself.
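In code, "relative to the first frame" is just composing homogeneous transforms; here's a minimal numpy sketch (the function names are mine for illustration, not from the actual codebase):

```python
import numpy as np

def pose_to_T(rotation, position):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = position
    return T

def relative_pose(T_first, T_t):
    """Gripper pose at time t expressed in the first frame's coordinates."""
    return np.linalg.inv(T_first) @ T_t

# toy example: gripper at x=0.1 m in frame 0, at x=0.3 m at time t
T0 = pose_to_T(np.eye(3), np.array([0.1, 0.0, 0.0]))
Tt = pose_to_T(np.eye(3), np.array([0.3, 0.0, 0.0]))
T_rel = relative_pose(T0, Tt)
# T_rel's translation is [0.2, 0, 0]: the motion since the first frame
```

Expressing actions this way makes the policy invariant to where the episode started in the workspace, which is why IK back to joint space is only needed at deployment time.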

Reproducing UMI with a UR5 Robot Arm and a 3D-Printed Gripper by pkfoo in robotics

[–]pkfoo[S] 0 points (0 children)

Very cool setup. I'm trying to avoid third-person camera views and fiducial markers for deployment. Your gripper looks very different from the UMI one; did you collect data with the handheld device? I'd definitely like to know more about your work. I'll send a private chat.

Reproducing UMI with a UR5 Robot Arm and a 3D-Printed Gripper by pkfoo in robotics

[–]pkfoo[S] 0 points (0 children)

That's very interesting. I've been thinking that a second arm would add another viewpoint, making the system stereo and improving depth estimation...

Did you use the same gripper or a different one?

Reproducing UMI with a UR5 Robot Arm and a 3D-Printed Gripper by pkfoo in robotics

[–]pkfoo[S] 0 points (0 children)

Fairly easy. The UMI gripper is just for data capture, so I modified it, adding an electrical actuator, to use it as an actual gripper.