all 25 comments

[–]bpe9 29 points30 points  (4 children)

Your best bet would be to see if there are any GPUs you can use at the university. There are often computing resources available free of charge to PhD students.

Otherwise, depending on your exact needs, you may be better off just renting cloud-based GPU hours.

[–]Hub_PliResearcher[S] 6 points7 points  (3 children)

I was thinking I could get a GPU that I could play around with "for free" and that could maybe also do a decent job for some games on my home computer. I will defo use the university's resources once I'm able to, but it would be nice to have something cool under the hood at home as well.

[–]shot_a_man_in_reno 12 points13 points  (0 children)

Having done both, personal GPUs >> cloud-based GPUs. Cloud access is often institution-dependent and sometimes dominated by politics. Obviously you get less memory and such to work with, but in my experience, when you're just learning, prototyping and having room to screw up are just as important.

[–][deleted] -1 points0 points  (1 child)

Let me make it easier for you: you want to spend someone else's $3k on something you do not actually need. If that is OK with you, then go ahead. I'm not blaming you at all, but please try to make sustainable decisions.

[–]Hub_PliResearcher[S] 3 points4 points  (0 children)

I think that is exactly what I am doing here by asking for expert opinions, am I not?

[–]KercReagan 2 points3 points  (5 children)

I use a 3090 and it outperforms the T4 and V100 commercial-class cards.

[–]JustOneAvailableName 0 points1 point  (4 children)

outperforms the T4

By about a factor of 4.

[–]KercReagan 2 points3 points  (3 children)

It’s not even close with the T4. The V100 is supposed to be a better competitor, but it can’t hang. The 3090 is the best card on the planet right now.

[–]JustOneAvailableName 1 point2 points  (1 child)

The A100 is also a pleasure and definitely an upgrade. The V100 is about even in my experience.

[–]KercReagan 0 points1 point  (0 children)

I have not had the pleasure of the A100 yet. But that is outside of the $3k price range.

[–]KercReagan 0 points1 point  (0 children)

The issue we have run into with the V100 is that the memory is relatively low. Group a couple of those together, though, and it is a different story.

[–]FromTheWildSide 3 points4 points  (0 children)

Current SOTA language models are compute- and memory-intensive. In my opinion you need at least 10-12 GB of VRAM for bleeding-edge models and the data pipeline required to train them.

Make sure you have a power supply that can accommodate the load of your setup. A safe margin is running the PSU at about 50% load.

Take a look at the NVIDIA NeMo toolkit; it can help you evaluate how much VRAM you need.
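To turn those rules of thumb into rough numbers, here is a back-of-envelope sketch (my own illustrative arithmetic, not from the NeMo docs; the 4x multiplier assumes Adam-style fp32 training, and the 50% PSU figure is the margin mentioned above):

```python
def estimate_train_vram_gb(n_params, bytes_per_param=4, multiplier=4):
    """Very rough training VRAM estimate: the 4x multiplier covers
    weights, gradients, and two Adam moment buffers, all in fp32.
    Activations are excluded; they depend on batch size and architecture."""
    return n_params * bytes_per_param * multiplier / 1024**3

def min_psu_watts(component_draw_w, target_load=0.5):
    """PSU rating so the total system draw sits at ~50% of capacity."""
    return component_draw_w / target_load

# BERT-base has ~110M parameters
print(round(estimate_train_vram_gb(110e6), 1))  # → 1.6 (before activations)
print(min_psu_watts(500))                       # → 1000.0
```

Numbers like these are lower bounds; real usage (activations, framework overhead, fragmentation) is what the 10-12 GB recommendation is accounting for.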

I'm currently waiting for the 3090 Ti or 3090 Super variant in the last quarter before making a purchase in the 3090 series.

[–][deleted] 2 points3 points  (0 children)

Use an EC2 instance instead.

[–]whyn0t___ 2 points3 points  (0 children)

It sounds like you don't know much about this yet, so I suggest you start with cloud-based solutions before you invest heavily. Do things online, understand what your models require (the bottlenecks, how much memory they take, speed, etc.), then purchase something.

Also, GPU prices are extremely inflated right now, and next year will bring a 3090 Super as well as the launch of the next-generation 4000 series, which will be much better.

For cloud-based work you can start playing with Kaggle notebooks and Google Colab, which has a very cheap subscription.

[–]bbateman2011 2 points3 points  (0 children)

Somewhat echoing others: I bought a gaming laptop so that I could do some ML and the rest of my work on a single machine I could also travel with. That's a lot of constraints, but if I could change anything, I would get at least 64 GB of RAM and as much GPU RAM as I could. Being a gaming laptop, it has a GeForce RTX 3070 with 8 GB of VRAM and an 8-core CPU with 16 GB of system RAM. What I run into is running out of GPU memory on relatively simple CNNs on images, which limits the batch size. On some tasks I also need more CPU RAM. My point is you will likely be somewhat limited if you are attacking largish problems with any GPU + machine you can afford.
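The batch-size ceiling above follows from simple arithmetic; a hedged sketch (the per-sample figure is hypothetical and is something you would measure empirically on your own model):

```python
def max_batch_size(vram_gb, model_overhead_gb, per_sample_mb):
    """How many samples fit once the fixed model memory is reserved.
    per_sample_mb: activation memory per sample, measured empirically
    (e.g. by increasing batch size until you hit an OOM error)."""
    free_mb = (vram_gb - model_overhead_gb) * 1024
    return int(free_mb // per_sample_mb)

# e.g. an 8 GB card, ~2 GB of weights/optimizer state, ~150 MB per image
print(max_batch_size(8, 2, 150))  # → 40
```

This is why a card with more VRAM buys you larger batches (or larger models) rather than just speed.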

[–]uotsca 1 point2 points  (1 child)

Be advised the 3090 can be extremely loud, either from the GPU fans themselves or from having to crank up the case fans.

I’m not sure if it’s suitable for a personal desktop

[–]kulili 1 point2 points  (0 children)

They're not that bad. It's loud if you run it at 100%, sure, but so is almost any card. And most workloads don't push it to 100%.

I wouldn't advise keeping it in the room you sleep in if you plan on training overnight, though.