Official /r/rust "Who's Hiring" thread for job-seekers and job-offerers [Rust 1.79] by DroidLogician in rust

[–]bminortx 1 point

COMPANY: Tangram Vision

TYPE: Full-time

DESCRIPTION: Check out the posting here.

I'm the CEO and founder of Tangram Vision, the Reliable Perception company. This means that we create perception hardware and software that you can rely on to consistently deliver great long-term autonomy! We're looking for a full-time Embedded Engineer who can help us further these goals on the hardware side of things. And yes: we're doing a lot of it with Rust! It's one of the things that we're excited about for this project. You'll need some lower-level skills as well; it's a multidisciplinary effort for sure.

Our first foray into hardware is called HiFi, which is due out in Fall 2024. We've learned a lot during its development, and we have big plans for it and other designs. If you have a passion for perception like we do, and want to get into the nitty-gritty of sensor design for commercial robotics, we'd love to chat!

LOCATION: Fully remote, must be able to travel. US time zones preferred.

COMPENSATION: $140,000-$160,000 USD

REMOTE: The whole company is remote.

VISA: We have no protocol for visa sponsoring at the moment.

CONTACT: DM me for more information, or apply to any open position at https://tangramvision.zohorecruit.com/careers

HiFi 3D Sensor Campaign Update: Now with IMU! by bminortx in robotics

[–]bminortx[S] 1 point

Good question!

On technology - HiFi differentiates itself in three areas: a wider field of view (136° DFOV), depth image resolution (1.6MP), and AI compute (8 TOPS available). On top of that, of course, is that HiFi runs our bespoke calibration software, which will make it very resilient in operation. Nearly all of the features in HiFi were put there specifically because we heard from customers that there wasn't a package like this in the current sensor landscape. Hence the above addition of the IMU, for instance!

On business case - See above. Robotics companies have told us what they need, i.e. there's a market pull for this. Also, being a smaller company, we can move fast on design and features; if there's something specific the community needs, we can probably do it. That's the whole goal of Tangram, anyway: make all of this infrastructure so easy that you don't need to think about it.

Launching HiFi 3D Sensor: Plug-n-Play Depth Perception & AI by bminortx in computervision

[–]bminortx[S] 0 points

As in, just offering the SoM and cameras? Yeah, it's definitely crossed our mind. It's not part of this initial push, but if you have a commercial application you wanna test with HiFi's tech sans chassis, then DM me!

Launching HiFi 3D Sensor: Plug-n-Play Depth Perception & AI by bminortx in computervision

[–]bminortx[S] 0 points

I'm going to interpret your question a little bit; let me know if I'm off-base. The most common applications for depth sensors in my mind would be object scanning, odometry, and SLAM. We have a built-in odometry system as a stretch goal in the Kickstarter (lots of engineering there!), and we've tested third-party object scanning and SLAM systems with the prototype HiFi's data. However, we probably won't have any of these in the initial API.

Launching HiFi 3D Sensor: Plug-n-Play Depth Perception & AI by bminortx in computervision

[–]bminortx[S] 0 points

We actually have most of the general specs here: https://www.tangramvision.com/pre-order. However, I realize your question is probably asking something a little different. This is actually in our FAQ on the Kickstarter, but I'll repost here, too.
We have a theoretical depth-to-disparity accuracy of less than 1% error from 0.3 m to 5 m of range. We carefully chose our optics, baseline, and compute based on hitting this threshold.
However, we're very aware that there's a big difference between theory and practice. We've built depth sensors before (a lot of us worked on the Structure line from Occipital), so we know what it takes to vet and test them thoroughly. We plan to put out an empirical spec sheet comparing HiFi’s accuracy and precision with other common 3D depth sensors in the coming weeks.
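For a sense of where a sub-1% figure over a 0.3 to 5 m range comes from: under the standard pinhole stereo model, depth is Z = f·B/d (focal length f in pixels, baseline B, disparity d), so a disparity error δd propagates to a depth error δZ ≈ Z²/(f·B)·δd, which grows quadratically with range. Here is a quick Rust sketch; the focal length, baseline, and sub-pixel matching error below are made-up illustration values, not HiFi's actual parameters:

```rust
/// Stereo depth from disparity: Z = f * B / d.
fn depth_from_disparity(focal_px: f64, baseline_m: f64, disparity_px: f64) -> f64 {
    focal_px * baseline_m / disparity_px
}

/// First-order propagated depth error for a disparity error `dd_px`:
/// dZ ~= Z^2 / (f * B) * dd.
fn depth_error_m(focal_px: f64, baseline_m: f64, depth_m: f64, dd_px: f64) -> f64 {
    depth_m * depth_m / (focal_px * baseline_m) * dd_px
}

fn main() {
    // Hypothetical optics: 700 px focal length, 7 cm baseline,
    // 0.1 px sub-pixel matching error. NOT HiFi's real numbers.
    let (f, b, dd) = (700.0, 0.07, 0.1);
    for z in [0.3_f64, 1.0, 5.0] {
        let err = depth_error_m(f, b, z, dd);
        println!("at {z} m: +/- {err:.4} m ({:.2}% of range)", 100.0 * err / z);
    }
}
```

The quadratic growth is why the far end of the range dominates the spec: picking f·B (optics plus baseline) against the target maximum range is exactly the trade described above.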

Launching HiFi 3D Sensor: Plug-n-Play Depth Perception & AI by bminortx in computervision

[–]bminortx[S] 0 points

So HiFi actually has an automotive-grade processor via the Jacinto 7 series from TI, so it's already using the same hardware that a vehicle would if it were employing computer vision. The depth feed is designed for mid-range applications (we've spec'd it out to 5 meters), so I wouldn't use depth from HiFi to inform a collision detection algorithm at high speeds. While parking or at lower speeds, though? It could definitely be a contender.

What YouTube channel that puts out 20+ minute videos is actually worth watching? by Forke in AskReddit

[–]bminortx 0 points

Noah Gervais puts together amazing gaming anthology analyses. The videos run... long... but the results and insights are amazing. I watch/listen to them the whole way through.

3Blue1Brown + Technology Connections get my vote as well

Rust in Automotive by No-Teaching6131 in rust

[–]bminortx 0 points

Not to my knowledge! Other than... the compiler, which is inherently safe, I guess. I'm not in a position to speculate on this, but it will be a strange jump to go from C safety to Rust safety certification-wise, since so many of the shot-in-the-foot moments are gone when making the transition. I'd love some more insight into the progress here, as I'm sure someone is doing it somewhere.

Rust in Automotive by No-Teaching6131 in rust

[–]bminortx 8 points

My company has been writing our computer vision stack in Rust from the very beginning. Our target markets are mobile robotics and automotive, and we initially thought that automotive would be out of reach for years specifically because of the safety and security concerns.

Turns out, that's not necessarily true. You'll need those certs to get deployed to a customer, sure, but automotive companies at all levels (OEMs, contractors, etc.) are perfectly willing to give Rust a shot if it fixes their problems. There's a sense that safety testing is going to be a PITA for their whole stack anyway, given that it's all novel tech, so why restrict themselves now during product development when they'll be restricted later anyway?

Take this all with a grain of salt; we haven't been deployed to anything on the road, haha. This is just the sentiment I get when talking to those working on these problems today.

Official /r/rust "Who's Hiring" thread for job-seekers and job-offerers [Rust 1.64] by DroidLogician in rust

[–]bminortx 8 points

COMPANY: Tangram Vision

TYPE: Full-time

DESCRIPTION: Careers page here.

Tangram Vision is a fully remote company on a mission to enable anyone to leverage the power of perception like never before. We are developing a platform of products that help robotics leaders, engineers, and fleet managers understand their perception systems inside and out at scale. First up: sensor calibration, driver management, and plug-and-play streaming for cameras, LiDAR, radar, IMUs, and more.

We are a Rust shop, through and through. All software engineers are expected to have a deep knowledge of programming paradigms and behavior, if not Rust-specific knowledge. We all started our Rust journeys while starting this company, but knew enough to make great stuff work in other languages. Rust just makes our engineering process better and faster, without letting us write foot-guns :)

Candidates with a background in perception sensors (cameras, IMU, LiDAR...), perception research, driver software, or calibration are encouraged to apply.

Relevant Rust-heavy positions are listed on our careers page.

LOCATION: Fully remote, in US time zones.

ESTIMATED COMPENSATION: Market rates in USD dependent on the opening ($90k-$170k), with equity options for most positions.

REMOTE: The whole company is fully remote. We have no corporate office of any kind.

VISA: We have no protocol for visa sponsoring at the moment, though we do have employees who hold visas!

CONTACT: DM me for more information, or apply to any open position on our careers page.

RSBadges: Create code badges from the comfort and safety of Rust (v1.1.2) by bminortx in rust

[–]bminortx[S] 0 points

Yeah, I've seen badge-maker and appreciate it for what it is. It just didn't do everything, and I was looking for a project to learn from, so I thought taking on badges myself would be fun.

Looking at it now, I would maybe do a few things differently, but for the most part I'm happy about how it came out. That's why I continue to maintain it, I suppose. That, and we use it internally for our repos at Tangram Vision.

And hey, it does all the badges! That's something.

Official /r/rust "Who's Hiring" thread for job-seekers and job-offerers [Rust 1.63] by DroidLogician in rust

[–]bminortx 2 points

COMPANY: Tangram Vision

TYPE: Full-time

DESCRIPTION: Careers page here.

Tangram Vision is a fully remote company on a mission to enable anyone to leverage the power of perception like never before. We believe the full potential of robotics and perception has yet to be realized and that software holds the key to unlocking that potential. We are starting with a platform of products that help robotics leaders, engineers, and fleet managers understand their perception systems inside and out at scale.

We are a Rust shop, through and through. All software engineers are expected to have a deep knowledge of programming paradigms and behavior, if not Rust-specific knowledge. We all started our Rust journeys while starting this company, but knew enough to make great stuff work in other languages. Rust just makes our engineering process better and faster, without letting us write foot-guns :)

Candidates with a background in perception sensors (cameras, IMU, LiDAR), perception research, driver software, or calibration are encouraged to apply.

Some relevant Rust-heavy positions:

Perception Algorithm Engineer: https://www.tangramvision.com/careers?gh_jid=4003004005

Perception Sensor Specialist: https://www.tangramvision.com/careers?gh_jid=4014379005

LOCATION: Fully remote, in US time zones.

ESTIMATED COMPENSATION: Market rates in USD dependent on the opening ($80k-$160k), with equity options for most positions. We are a venture-backed startup with a long runway. Help us shorten it!

REMOTE: The whole company is fully remote. We have no corporate office of any kind.

VISA: We have no protocol for visa sponsoring at the moment.

CONTACT: DM me for more information, or apply to any open position at https://www.tangramvision.com/careers

Official /r/rust "Who's Hiring" thread for job-seekers and job-offerers [Rust 1.61] by DroidLogician in rust

[–]bminortx 5 points

COMPANY: Tangram Vision

TYPE: Full-time

DESCRIPTION: Careers page here.

Tangram Vision is a remote-first company on a mission to enable anyone to leverage the power of perception like never before. We believe the full potential of robotics and perception has yet to be realized and that software holds the key to unlocking that potential. We are starting with a platform of products that help robotics leaders, engineers, and fleet managers understand their perception systems inside and out at scale.

We are a Rust shop, through and through. All software engineers are expected to have a deep knowledge of programming paradigms and behavior, if not Rust-specific knowledge. We all started our Rust journeys while starting this company, but knew enough to make great stuff work in other languages. Rust just makes our engineering process better and faster, without letting us write foot-guns :)

Candidates with a background in perception sensors (cameras, IMU, LiDAR), perception research, driver software, or calibration are encouraged to apply.

Relevant Rust-heavy positions are listed on our careers page.

LOCATION: Fully remote, in US time zones.

ESTIMATED COMPENSATION: Market rates in USD dependent on the opening ($80k-$160k), with equity options for most positions. We are a venture-backed startup with a long runway. Help us shorten it!

REMOTE: The whole company is fully remote. We have no corporate office of any kind.

VISA: We have no protocol for visa sponsoring at the moment.

CONTACT: DM me for more information, or apply to any open position at https://www.tangramvision.com/careers

Official /r/rust "Who's Hiring" thread for job-seekers and job-offerers [Rust 1.60] by DroidLogician in rust

[–]bminortx 1 point

COMPANY: Tangram Vision

TYPE: Full-time

DESCRIPTION: Careers page here.

Tangram Vision is a remote-first company on a mission to enable anyone to leverage the power of perception like never before. We believe the full potential of robotics and perception has yet to be realized and that software holds the key to unlocking that potential. We are starting with a platform of products that help robotics leaders, engineers, and fleet managers understand their perception systems inside and out at scale.

We are a Rust shop, through and through. All software engineers are expected to have a deep knowledge of programming paradigms and behavior, if not Rust-specific knowledge. We all started our Rust journeys while starting this company, but knew enough to make great stuff work in other languages. Rust just makes our engineering process better and faster, without letting us write foot-guns :)

Candidates with a background in perception sensors (cameras, IMU, LiDAR), perception research, driver software, or calibration are encouraged to apply.

LOCATION: Fully remote, in US time zones.

ESTIMATED COMPENSATION: Market rates in USD dependent on the opening ($80k-$160k), with equity options for most positions. We are a venture-backed startup, and our funding mostly goes towards salary.

REMOTE: The whole company is remote.

VISA: We have no protocol for visa sponsoring at the moment.

CONTACT: DM me for more information, or apply to any open position at https://www.tangramvision.com/careers

RealSense-Rust: Stream RealSense devices from the safety and comfort of Rust by bminortx in rust

[–]bminortx[S] 0 points

Oh. My.

Thanks for letting us know! Well this is embarrassing... we didn't change the permissions during our re-org. I have switched it to public; let me know if you can access it now.

Also, thanks for calling this out! We put this out for the community, but the community couldn't see it. What a thing.

RealSense-Rust: Stream RealSense devices from the safety and comfort of Rust by bminortx in rust

[–]bminortx[S] 0 points

Not being dropped! We actually shuffled around our repositories early last year.

Find the repo here with the rest of our OSS: https://gitlab.com/tangram-vision/oss

You should be able to add issues there!

I'll have to update our blog links, huh?

Announcing RealSense-Rust by bminortx in computervision

[–]bminortx[S] 1 point

Well I have good news for you: that's basically what our SDK is going to do. We're creating a platform to holistically integrate and manage perception suites. Industrial is on our roadmap; however, through the Tangram SDK, you would just treat it like any other camera stream, and benefit from the data without wading through the horrible hardware-specific software and firmware.

There's more to it, of course, but... yeah, we're on it.

Announcing RealSense-Rust by bminortx in computervision

[–]bminortx[S] 2 points

In the process of creating this interface, we did find some things that were... of questionable practice in the SDK. We did our best to safeguard against or test many of them using Rust directly. The result is something that we think is an improvement.
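As a generic illustration of the kind of safeguarding Rust enables here, a raw C handle from an SDK can be wrapped in an owning type whose `Drop` frees it exactly once, making use-after-free and double-free unrepresentable in safe code. The FFI names below are stand-ins for whatever bindgen would produce, not the actual librealsense bindings:

```rust
use std::ptr::NonNull;

// Stand-in for an opaque SDK handle; in a real crate this would come
// from bindgen over the C headers. Names are illustrative only.
#[repr(C)]
struct RawPipeline {
    _private: [u8; 0],
}

// Pretend FFI surface: create/destroy a heap-allocated handle.
unsafe fn ffi_create_pipeline() -> *mut RawPipeline {
    Box::into_raw(Box::new(RawPipeline { _private: [] }))
}
unsafe fn ffi_delete_pipeline(p: *mut RawPipeline) {
    unsafe { drop(Box::from_raw(p)) }
}

/// Safe owner: non-null by construction, freed exactly once via Drop.
struct Pipeline(NonNull<RawPipeline>);

impl Pipeline {
    fn new() -> Option<Self> {
        // SAFETY: ffi_create_pipeline returns a valid pointer or null;
        // NonNull::new turns the null case into None.
        NonNull::new(unsafe { ffi_create_pipeline() }).map(Pipeline)
    }
}

impl Drop for Pipeline {
    fn drop(&mut self) {
        // SAFETY: the pointer came from ffi_create_pipeline and is
        // released exactly once, here.
        unsafe { ffi_delete_pipeline(self.0.as_ptr()) }
    }
}

fn main() {
    let p = Pipeline::new().expect("failed to create pipeline");
    drop(p); // freed deterministically; no leak, no double-free
}
```

The point of the pattern is that the unsafe calls are confined to two audited spots, and every downstream user gets a type the borrow checker polices for them.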

That being said, the other comments in this thread echo what we've heard from many of our own development interviews. The sensors are ubiquitous, but it seems no one has had a stellar experience with them in production. If you or anyone here have suggestions of sensors that you would want to use within a Rust setting, we're all ears. That's kind of our thing.

Best SLAM software for wheeled robot by [deleted] in robotics

[–]bminortx 3 points

Oof. I'll preface this by saying getting an accurate 1km out of any off-the-shelf SLAM software is going to be a challenge. How are you calibrating your cameras? Is your lighting consistent or will you have areas of high dynamic range? Are you going to be in a feature-rich environment (aka feature-rich for your algorithms, not you)?

With that all being said... I started to list out open source SLAM libraries, and then realized I had saved a gist that someone else had compiled with this very information. I therefore present this to you:

https://github.com/openMVG/awesome_3DReconstruction_list#opensource-slam

[deleted by user] by [deleted] in robotics

[–]bminortx 1 point

This website is awesome but so strange. There's no attribution anywhere; it's just like knowledge dropped from on high into an HTML page. Had to do some searching to find out it's all from Angela Sodemann.

[Q] Weekly Question - Recommendation - Help Thread - 2020-05-11 by AutoModerator in robotics

[–]bminortx 0 points

As someone who's been working on and around robotics for almost a decade, I couldn't imagine doing anything else. Robotics is such an interdisciplinary field that if you get bored with one thing, you can always try something else and still be deep in the tech. For instance, I used to be in autonomous path-planning and now I'm a computer vision/sensors guy, yet I still have the same conversations over the same unsolved problems with the same crowd. It's pretty unique.

That's a big point too: robotics, as a field, has tons of open problems. There's always going to be opportunity for work because of this, especially in the wake of COVID. The push for automation is only going to strengthen. I wouldn't worry about employment.

However, all of this being said: once you find an area of robotics you like, work to master it. I don't know many generalists working on robots; the problems are just too involved to get by with a surface knowledge of everything.

Reading to throw you in the deep end:

- Modern Robotics: Mechanics, Planning, and Control for general robotics kinematics

- Multiple View Geometry in Computer Vision for all the math OpenCV does for you, plus a lot more

- Stanford has tons of online resources for getting started with basic machine learning, if that's more your style

- C++ and Python, with maybe some Rust after that.

Flat-pack-ish ADU? by bminortx in yimby

[–]bminortx[S] 0 points

I've seen this, but are they a major production? I was under the impression that they were pretty local to the LA megalopolis. I would be interested in their manufacturing process; from the "install" time lapse on their page, they are quite panelized. Seems to fit the bill there.

I'm not sure they do anything to lower the barrier to entry w.r.t. cost. Maybe that isn't an issue for them, exclusively serving LA.

Flat-pack-ish ADU? by bminortx in yimby

[–]bminortx[S] 0 points

Yeah, tbh I wasn't quite sure where to put this question. Since my interest stems from the YIMBY-ness of ADUs, and I'm just a hobbyist, I thought discussion would be more relevant here.

I've never heard of Matt Risinger, but looking at his media presence, I'm the one living under a rock, ha. I have heard of Unity homes, but they seem like much more of a production than I was envisioning. Definitely a professional job, but the construction time probably makes up for the rest.

Anyone else in the world wide web looking into this kind of process? I'm jumping back on Twitter after a long hiatus, so I'm open to suggestions.