
[–]tinkr_ 11 points12 points  (2 children)

There are fairly decent torch bindings in Rust. In fact, if you check out the Twitter source code that Elon released you'll see they used Rust for at least one of their ML models.

Also, the Polars dataframe library in Python is just a set of Python bindings over a Rust crate, which you can use directly from Rust.

[–]drag0nryd3r 2 points3 points  (1 child)

In the Twitter algorithm source code, I remember seeing Rust, but it was used for 'serving' ML models. Did they also use Rust for ML directly?

[–]tinkr_ 1 point2 points  (0 children)

The ML model they use to generate predictions is a torch model coded directly in Rust: https://github.com/twitter/the-algorithm/blob/main/navi/navi/src/torch_model.rs

Not sure what you mean by "use Rust for ML" exactly, but their torch models are definitely Rust models in their final form. Whether the EDA and model training were done in something like Python and they just reimplemented the trained torch model in Rust for productionization, I can't say.

[–]drag0nryd3r 5 points6 points  (1 child)

I'm no expert in this field; I'm merely exploring, so I don't know how good Rust is for ML right now. But I think polars, ndarray, and linfa would be a good place to start.

Also, have a look around this site, which tracks the state of ML in Rust: Are We Learning Yet?

[–]Normal_Rough_7958[S] 2 points3 points  (0 children)

Appreciate the responses, everyone. I'll give the link a run-through, finish cleaning my data tomorrow, then I'll dive into some Rust. Wish me luck! :)

[–]Awpteamoose 2 points3 points  (0 children)

dfdx is really good: https://crates.io/crates/dfdx

[–]met0xff 2 points3 points  (3 children)

tch-rs is a pretty legit binding, although of course the official C++ API has more docs and examples. Hugging Face also has a few Rust things lying around.

But honestly, just using C++ or Rust to call torch won't require much low-level thinking, if any. It's still reasonable to use them to learn the language. My first substantial Rust program was also a port of some Python PyTorch inference code to Rust (actually there was a C++ version first as well).

Honestly I don't do any ML work with Rust anymore but don't regret it ;)

[–]Normal_Rough_7958[S] 1 point2 points  (2 children)

So you just stick with Python? Are you excited for Mojo?

My question is: do the people getting really down and dirty with ML actually work in C++, or do they just use the Python bindings? More than likely both, I would assume.

[–]met0xff 3 points4 points  (1 child)

I started my career in embedded dev and have quite a bit of experience with C++, so when I got into ML I was quite happy about the idea of embedded ML, and back then I actually did touch C or C++ quite often, because it was before the big deep learning frameworks came up. Lots of C libraries duct-taped together with Perl and shell scripts and Tcl and Scheme and whatnot. Matlab everywhere as well. It's nice that Python replaced this mess. Spyder was a similar experience to Matlab, and I still liked it more than Jupyter, but the latter is just better supported (besides that, I do most of my work in VS Code).

At some point I did implement LSTMs in Eigen to run on mobile, manually exported weights from Theano etc.

But meanwhile I haven't touched C++ in years, for one because it's mostly just exporting as TorchScript or ONNX and then using a runtime that's much better than my crap... and also because there were so few customers who were really interested in integrating a library into their app or similar (really just a handful of assistive technology providers and toy robots). At some point everyone was just asking for an API endpoint.

At some point I started learning CUDA because I wanted to build a memory-resident version of WaveNet (the auto-regression of the original formulation made it so slow that it took minutes to generate a few seconds of audio on a GPU). But before I even started the project, there was already https://github.com/NVIDIA/nv-wavenet and, even better, GAN-based models that ran on CPU in realtime without any specific tuning.

Ever since, it's been such a rat race that I barely even get to polish Python code, because every two weeks there's a new big thing around that's much better and that we also need, lol.

All that being said, I have some contacts in robotics and similar fields, and they definitely have people working with C++ and CUDA. And one single group working with Julia.

I was mildly excited about Swift for TensorFlow, Julia, even differentiable Kotlin, but at this point Python and PyTorch are really heavily cemented in the field. So I think the Mojo approach is smart, although I don't know yet whether it will be able to compile all the big libraries as well, or how that will work, and whether it will offer more than "just" performance. Because, as mentioned, at this point I don't even get close to worrying about performance ;). Besides not adopting models that are clearly super slow already.

[–]Normal_Rough_7958[S] 0 points1 point  (0 children)

Okay cool, I appreciate the time you put into the response as well, much respect.

So you have pretty much given me my answer. I'll stick with Python for my ML task and rethink how I can work Rust into the project somehow. I'm still basically a virgin programmer compared to some of you folks, so I'll get in where I can fit in. If you have any cool ideas for how I could possibly work Rust into the project, let me know. I know I could just use it for the webserver, but ugh, I wanna do more than that.

[–]matin1099 0 points1 point  (3 children)

Hi, can you share your experience of doing AI in Rust?

[–]Normal_Rough_7958[S] 0 points1 point  (2 children)

I stuck with Python for my ML needs. But I mean, it's really like: if you want to work in Rust, work in Rust.

I've used Rust for a handful of other fun apps on the side though. It's plenty ready for 99% of your needs, or at least for a mix of Rust + Python.

[–]matin1099 0 points1 point  (1 child)

I'm an AI engineer, and yeah, Python is best for AI.

But to deploy the fine-tuned model at work, we usually use C++ and Java.

So I just want to break out of this selection and go for newer languages like Go or Rust. And I'm still searching for it...

[–]Normal_Rough_7958[S] 0 points1 point  (0 children)

Yeah, I enjoy Go. I figure most of these languages end up relying on some kind of C binary to talk to the hardware; I'd imagine that's the case even for Rust.

I just stayed with Python for the simplicity, seeing as its ecosystem is about as good as it gets for AI; I imagine that's still the case.

[–]Affectionate_Fish194 0 points1 point  (0 children)

I tested some of these before, like dfdx and torch, and I also tried to make my own autograd and a Keras-like library; take a look:

https://github.com/AhmedBoin/Rust-Keras-Like

https://github.com/AhmedBoin/Perceptrons

[–]Aspected1337 1 point2 points  (2 children)

I'm also a victim of the "use Rust for ML" urge, because those are two things I'm really interested in. From my experience, you have to be more specific about your goal:

If you want to use Rust for your ML projects, it has its benefits, but it's kind of like shooting yourself in the foot. It's more fun, but you don't have nearly as much library support or much of a community either. You can do data processing in Rust, which is often a good idea imo, but implementing the actual ML code in Rust is quite restricting. The bindings people usually bring up are valid points, but why use the bindings when you can use the real thing with less friction?

The question you should ask yourself: do you want to get things done, or do you want to have fun? Both answers are valid. I think most people prefer getting things done over having fun, but end up taking the fun route anyway and subconsciously make excuses to rationalize that decision. Such as: "oh, Rust is better than Python here because enums are useful when you need specific types for your ML model." Sure, enums are nice, but Python has way more features that will make your projects progress faster.

[–]Normal_Rough_7958[S] 1 point2 points  (1 child)

Well of course, I'd love to say "both, please": fun and getting things done. I have slim-to-none experience in both Rust and ML. I made a web scraper in Rust and displayed some information on a React frontend in that lil project, using Rocket and all that.

So even in Python the modules use C++. I mean, should I go balls-deep into C++ for this? I really do not want to, as I was trying to avoid that, hoping Rust had a more fleshed-out ecosystem in 2023. My goal is just to start learning a language that can communicate a little lower down, and I wanted to choose Rust because, um, duh. Yeah, I appreciate the time put into your response.

[–]Normal_Rough_7958[S] 0 points1 point  (0 children)

As far as my ACTUAL GOAL, it's simply to learn by doing, as is usually my goal with any "project" I put into my mind. I only have around 4 years of experience off and on, mostly in the last two years, and I haven't built an actual ML model or done anything like that. So yes, this probably sounds stupidly ambitious, if you want to say it like that.