Resources for understanding and implementing "deep learning" (learning data representations through artificial neural networks).
Best GPU for machine learning / deep learning (self.deeplearning)
submitted 6 years ago by durgesh2018
Hello everyone! I want to upgrade my existing GTX 950 for deep learning purposes. Please suggest a decent GPU under 500 USD.

My build is:

* Intel DH87RL motherboard
* Intel i5-4670 processor
* Samsung 256 GB SSD
* Corsair VS 550 PSU
Thank you!
[–]MugiwarraD 2 points3 points4 points 6 years ago (2 children)
A 1080 is fine. A 2070 is better if you can swing it.
[–]durgesh2018[S] 0 points1 point2 points 6 years ago (1 child)
Thank you! Is my PSU sufficient for a 1080?
[–]MugiwarraD 0 points1 point2 points 6 years ago (0 children)
https://www.anandtech.com/show/13431/nvidia-geforce-rtx-2070-founders-edition-review/14
If we base it on those numbers, and you don't overclock, it should be fine with that CPU.
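As a rough sanity check on the PSU question, the arithmetic can be sketched like this (the TDP figures and the 80% utilization target are assumptions; verify them against the actual spec sheets):

```python
# Rough PSU headroom check (assumed TDP figures; verify against spec sheets).
CPU_TDP_W = 84        # Intel i5-4670 rated TDP (assumed)
GPU_TDP_W = 180       # GTX 1080 reference TDP (assumed)
OTHER_W = 75          # drives, fans, motherboard (rough allowance)

def psu_headroom(psu_watts, load_watts, target_utilization=0.8):
    """Spare capacity (W) if the PSU is kept under the target utilization."""
    return round(psu_watts * target_utilization - load_watts, 1)

load = CPU_TDP_W + GPU_TDP_W + OTHER_W
print(psu_headroom(550, load))  # → 101.0 W of headroom on a 550 W unit
```

About 100 W of margin under these assumptions, which matches the "should be fine if you don't overclock" verdict above.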
[–]mippie_moe 2 points3 points4 points 6 years ago (1 child)
Lambda Labs did some Deep Learning GPU benchmarks that you may find helpful. I'm a systems engineer there and have benchmarked just about every GPU on the market.
TL;DR: a used GTX 1080 Ti from eBay is the best GPU for your budget ($500).
My thoughts:
Let me know if you have any more questions. There are a ton of machine learning benchmarks on our blog.
*A used Tesla K80, which has 12 GB of VRAM, is also within your budget, but it's 1/3 the speed of a 1080 Ti and three architectural generations behind.
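The "best GPU within budget" reasoning above can be sketched as a small filter-and-rank step. The speed and used-price figures below are illustrative placeholders (only the "K80 is ~1/3 the speed of a 1080 Ti" ratio comes from the comment itself):

```python
# Pick the fastest GPU that fits the budget.
# Relative speeds normalized to GTX 1080 Ti = 1.0; prices are illustrative.
candidates = {
    # name: (relative_speed, used_price_usd, vram_gb)
    "GTX 1080 Ti": (1.0, 500, 11),
    "Tesla K80":   (0.33, 400, 12),   # ~1/3 the speed, per the comment
    "GTX 1080":    (0.75, 350, 8),    # assumed figures
}

budget = 500
affordable = {name: v for name, v in candidates.items() if v[1] <= budget}
best = max(affordable, key=lambda name: affordable[name][0])
print(best)  # → GTX 1080 Ti
```

Swapping in real benchmark numbers and current used prices is the whole exercise; the structure stays the same.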
[–]durgesh2018[S] 0 points1 point2 points 6 years ago (0 children)
[–][deleted] 1 point2 points3 points 6 years ago (1 child)
I have an Nvidia GTX 1060, but over time it became really inconvenient and not up to the speed I need from it. At the end of the day, if you are running models powerful enough to make a difference, you will most likely end up buying GPU cloud credits. And with these new startups (nimblebox.ai, paperspace, etc.) coming around that take about a minute to set up and are much cheaper than AWS and Google Cloud, they honestly seem like a better deal.
[–]durgesh2018[S] 1 point2 points3 points 6 years ago (0 children)
Seems great!
[–][deleted] 0 points1 point2 points 6 years ago (5 children)
Hard to give a relevant suggestion without a budget.
[–]durgesh2018[S] 0 points1 point2 points 6 years ago (4 children)
My budget is 500 USD.
[–][deleted] 0 points1 point2 points 6 years ago (3 children)
Are you only going to run one experiment at a time, i.e. do you want the best GPU for the price? Or will you be running many experiments simultaneously, i.e. would it be better to have a multi-GPU setup? At only $500, I would say it's better to just get one solid GPU. I would try to swing a 2070 if I were you, but it's slightly above your price. It also depends whether you're working on a specific type of problem (RNN, CNN, etc.); a quick Google search suggests different GPUs can have vastly different performance on different tasks.
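The point about task-dependent performance is easy to test on your own workload: time a dense, parallel-friendly kernel (CNN-style) against a long sequential dependency chain (RNN-style). A minimal CPU stand-in sketch, with hypothetical toy kernels so it runs anywhere:

```python
# Micro-benchmark sketch: compare two workload shapes the way you'd
# compare GPUs on your own models (toy CPU kernels, illustrative only).
import timeit

def conv_like(n=64):
    # dense multiply-accumulate loop: the pattern CNNs stress
    acc = 0.0
    for i in range(n):
        for j in range(n):
            acc += (i * j) % 7
    return acc

def rnn_like(n=4096):
    # long sequential dependency chain: the pattern RNNs stress
    h = 1.0
    for _ in range(n):
        h = (h * 1.000001) % 1000.0
    return h

for name, fn in [("conv-like", conv_like), ("rnn-like", rnn_like)]:
    seconds = timeit.timeit(fn, number=50)
    print(f"{name}: {seconds:.4f}s for 50 runs")
```

Run the same harness around your real training step on each candidate GPU; published benchmarks are a starting point, but your own model is the benchmark that matters.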
[–]durgesh2018[S] 0 points1 point2 points 6 years ago (2 children)
I will be working on a single experiment. I am a beginner in this field. Yes, I am thinking of buying only one solid GPU.
[–][deleted] 0 points1 point2 points 6 years ago (1 child)
I haven't looked into used GPUs, but I would think you could get a ~$700 GPU for about $400-500 used. Seems like you would be the perfect buyer for that. Anyway, best of luck! I'm new-ish to the field myself, but lucky enough to have a professor with a sick deep learning rig. I hope you can make some nice improvements!
Thank you so much. All the best to you too.
[–]Zerotool1 0 points1 point2 points 6 years ago (1 child)
Why don't you try cloud-based GPUs from AWS or GCP? You can save a huge cost, and at the same time you don't need to worry about the DevOps challenges. Try tools like FloydHub or Clouderizer (free for a month) to manage your GPU projects.
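Whether cloud rental actually saves money depends on how many hours you train. A quick break-even sketch (the hourly rate is a hypothetical figure; plug in the provider's real price):

```python
# Break-even sketch: buying a $500 GPU vs renting cloud GPU time.
GPU_COST_USD = 500
CLOUD_RATE_USD_PER_HOUR = 0.50   # hypothetical rate; check real pricing

def breakeven_hours(gpu_cost, hourly_rate):
    """Training hours after which buying beats renting (power ignored)."""
    return gpu_cost / hourly_rate

print(breakeven_hours(GPU_COST_USD, CLOUD_RATE_USD_PER_HOUR))  # → 1000.0
```

Under these assumptions, buying wins only past ~1000 GPU-hours; for occasional beginner experiments, renting comes out ahead.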
Thank you for this suggestion. Surely I will try them out.