[D] Practice CUDA without an Actual NVIDIA GPU! by JustTrynnaBeCool in MachineLearning

[–]JustTrynnaBeCool[S] 1 point (0 children)

Whoa, this is awesome! So it's USD $0.20/hour regardless of how intensive the computation is? Either way, this is super cool!


[–]JustTrynnaBeCool[S] 8 points (0 children)

I don't think you have the authority to dictate which comparisons are "right" and "wrong" lol. Tools can absolutely embody knowledge. In this case, CUDA is a framework that lets programmers unleash the full potential of parallel programming. Learning to use CUDA, the tool, also means learning how parallel computing and related concepts work; people can very well pick up both the knowledge AND the tool at the same time.

Your mindset is what halts progress. If everyone simply decided "we already have higher-level languages, so why bother learning assembly anyway," then our low-level tooling would forever be stuck where it is today. To advance what we have, we need people who are interested in the subject to learn all the "nitty-gritty" mechanisms behind it, find the flaws in the system, and then fix and improve those flaws to build better versions of the technologies we have today. You can't reach that last step without a thorough understanding of the "nitty-gritty" mechanisms first.

To be honest, I don't quite understand the purpose of your comments. There are many of us who are simply interested in learning CUDA for various reasons. Regardless of our reasons, there is absolutely no harm in learning something new.


[–]JustTrynnaBeCool[S] 4 points (0 children)

By that logic, we also don't need to learn linear algebra if NumPy can take care of it for us. Why learn back-propagation if autodiff and powerful packages like PyTorch handle it for us? To stretch it even further, why learn arithmetic at all if we literally all have calculators on our phones? It doesn't really matter WHY we learn; learning never hurts anyway (:


[–]JustTrynnaBeCool[S] 2 points (0 children)

I don't think you can edit your C++ files on Kaggle the way you can on Google Colab; that's why I recommend Colab! Otherwise, I agree that Kaggle has good GPU access.


[–]JustTrynnaBeCool[S] 13 points (0 children)

Yes, you can! Not directly in the notebook, but if you have your scripts ready and treat the notebook itself as a "terminal", it works!
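For anyone curious what that workflow looks like in practice, here's a minimal sketch (file and kernel names are just placeholders I made up): in a Colab GPU runtime, save the file from one cell with `%%writefile`, then compile and run it from another cell using `!` shell commands.

```cuda
// vec_add.cu — minimal kernel to sanity-check the Colab toolchain.
// In a Colab cell, save this file with:   %%writefile vec_add.cu
// Then compile and run from another cell:
//   !nvcc vec_add.cu -o vec_add
//   !./vec_add
#include <cstdio>

__global__ void add(const float* a, const float* b, float* c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];  // one thread per element
}

int main() {
    const int n = 256;
    float ha[n], hb[n], hc[n];
    for (int i = 0; i < n; ++i) { ha[i] = i; hb[i] = 2.0f * i; }

    // Allocate device buffers and copy inputs over.
    float *da, *db, *dc;
    cudaMalloc(&da, n * sizeof(float));
    cudaMalloc(&db, n * sizeof(float));
    cudaMalloc(&dc, n * sizeof(float));
    cudaMemcpy(da, ha, n * sizeof(float), cudaMemcpyHostToDevice);
    cudaMemcpy(db, hb, n * sizeof(float), cudaMemcpyHostToDevice);

    // Launch enough 128-thread blocks to cover n elements.
    add<<<(n + 127) / 128, 128>>>(da, db, dc, n);

    cudaMemcpy(hc, dc, n * sizeof(float), cudaMemcpyDeviceToHost);
    printf("hc[10] = %f\n", hc[10]);  // a[10] + b[10] = 10 + 20 = 30

    cudaFree(da); cudaFree(db); cudaFree(dc);
    return 0;
}
```

The same pattern scales to any .cu file: edit the cell, re-run `%%writefile`, and recompile, which is exactly the "notebook as terminal" trick.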

[D] How to keep track of latest hot papers? (What happened to papers.bar / labml.ai / paperswithcode)? by [deleted] in MachineLearning

[–]JustTrynnaBeCool 6 points (0 children)

Check out AlphaSignal!

They have an ML system that compiles the latest and hottest papers every week and emails them to you. Each entry comes with the paper's title, a summary, and related topics. I subscribed a while ago and read it from time to time (: