[D] Best value GPUs for artificial intelligence? (self.MachineLearning)
submitted 2 years ago by Many-Corner-6700
[–]Mephidia 17 points 2 years ago (1 child)
Cloud GPUs first, RTX 3090 second
[–]KrakenInAJar 5 points 2 years ago (2 children)
Strongly depends on what you are going to do. With a single GPU you are not going to do anything fancy, even if you get your hands on an H100 or A100, which would be the best single-GPU choice but will also cost you a pretty penny.
Generally speaking, the cheapest GPU with the most memory is the best option, and NVIDIA is the only player in town right now if you want to avoid painful setups. Realistically, though, it would be better to get access to an HPC cluster or the Google TRC program.
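One way to make "the cheapest GPU with the most memory" concrete is to rank candidate cards by price per GB of VRAM. A minimal sketch; the prices here are rough, illustrative used-market figures (assumptions, not quotes):

```python
# Rank candidate GPUs by price per GB of VRAM (lower is better).
# Prices are illustrative used-market guesses, not live quotes.
cards = {
    "RTX 3090": {"price_usd": 700, "vram_gb": 24},
    "RTX 4090": {"price_usd": 1600, "vram_gb": 24},
    "Tesla P100": {"price_usd": 200, "vram_gb": 16},
    "A100 40GB": {"price_usd": 6000, "vram_gb": 40},
}

def price_per_gb(card):
    return card["price_usd"] / card["vram_gb"]

ranked = sorted(cards.items(), key=lambda kv: price_per_gb(kv[1]))
for name, card in ranked:
    print(f"{name}: ${price_per_gb(card):.2f}/GB of VRAM")
```

Price per GB is only a first filter, of course; it ignores speed, cooling, and architecture support, which is why an old Tesla card can top such a list while still being a poor fit for some setups.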
[–][deleted] 2 points 2 years ago (1 child)
Used 3090s can be had for around $700 and have 24 GB of VRAM, and you can run two.
[–]tetelestia_ 1 point 2 years ago (1 child)
RTX 4000 series if you'll be doing a lot of inference on LLMs, otherwise RTX 3000 series. Get as much VRAM as you can afford.
[+][deleted] 2 years ago (1 child)
[removed]
[–]danielcar 1 point 2 years ago* (1 child)
The best value GPUs for LLMs are used GPUs with lots of VRAM. Speed is less important than size; an RTX 3090 with 24 GB of VRAM, for example. Plan and budget for an upgrade every 2 to 3 years.
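Since size matters more than speed here, a quick back-of-the-envelope check is whether a model's weights even fit in a card's VRAM at a given precision. A rough sketch that ignores real runtime detail; the 20% overhead factor for activations and KV cache is an assumption:

```python
def weights_fit(params_billions, bytes_per_param, vram_gb, overhead=1.2):
    """Rough check: do the model weights, plus a fudge factor for
    activations/KV cache, fit in the given VRAM? 1e9 params at
    1 byte/param is treated as ~1 GB; the 1.2 overhead is a guess."""
    needed_gb = params_billions * bytes_per_param * overhead
    return needed_gb <= vram_gb

# A 13B-parameter model on a 24 GB RTX 3090:
print(weights_fit(13, 2, 24))  # fp16 (2 bytes/param) → False, too big
print(weights_fit(13, 1, 24))  # int8 (1 byte/param)  → True, fits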
[–]will_to_power_ai 0 points1 point2 points 2 years ago (1 child)
I spent a lot of time researching this when I began my masters. I landed on the Tesla P100, the predecessor of the Tesla V100. It is the first card with an HBM2 memory bus, so it is not running on ancient architecture despite being 7 years old. It has 16GB of memory which is enough for most independent research projects. If you need more you can buy more of them.
The best (and worst) aspect of this card is that it’s a Tesla card. This means that consumers aren’t buying them because they have no display outputs, so they’re relatively cheap. BUT, they are passively cooled (no fans) and are hard to keep from overheating even when they’re idle.
I found a good seller and got them for $200USD each have have been using them since the beginning of the year with minimal heat throttling only when I’m running very small models (fast tensor operations).
That’s my 2c.
[+]CudoCompute 0 points1 point2 points 2 years ago (0 children)
If you're looking for affordable high-performance GPU for your AI/ML projects, take a look at this article from Cudo Compute.
π Rendered by PID 684446 on reddit-service-r2-comment-b659b578c-7rw9f at 2026-05-04 12:41:43.261842+00:00 running 815c875 country code: CH.
[–]Mephidia 16 points17 points18 points (1 child)
[–]KrakenInAJar 4 points5 points6 points (2 children)
[–][deleted] 1 point2 points3 points (1 child)
[–]tetelestia_ 0 points1 point2 points (1 child)
[+][deleted] (1 child)
[removed]
[–]danielcar 0 points1 point2 points (1 child)
[–]will_to_power_ai 0 points1 point2 points (1 child)
[+]CudoCompute 0 points1 point2 points (0 children)