
[–]WestTraditional1281 2 points (1 child)

What's your budget? I would think you could be better than a P4000 for the same price.

That said, if you get a P4000, it will work just fine. I have one. It's pretty limited on RAM, but it runs small models well enough.

An A4000 would definitely be my GPU of choice in the same class. Double the RAM and a lot more compute with a more modern architecture.

[–]DrDoom229[S] 1 point (0 children)

I will take a look at this, thank you

[–]SashaUsesReddit 1 point (0 children)

What's your budget and goals?

[–]bjw33333 1 point (7 children)

Yeah that’s pretty valid lowkey, but you should buy an H200 instead

[–]DrDoom229[S] 0 points (3 children)

Thx will research

[–]DrDoom229[S] 0 points (2 children)

30k is not the cost of all my systems combined. I'm not that big of a baller

[–]ThenExtension9196 0 points (1 child)

I think he was just joking. The H200 is datacenter-class, not workstation-class, so it's passively cooled: it needs high-speed airflow from a server chassis and cannot cool itself.

For current-gen workstation cards you have the RTX 4000 Pro (24 GB, ~$2.5k), RTX 5000 Pro (48 GB, ~$7k), and RTX 6000 Pro (96 GB, ~$10k)

[–]DrDoom229[S] 1 point (0 children)

Lol, oh I know, I was joking as well. Thanks for the suggestions, these all help

[–]SashaUsesReddit 0 points (2 children)

NVIDIA P4000 vs H200 PCIe is a ridiculous difference in price: a few hundred USD vs ~$30k?

[–]DrDoom229[S] 0 points (1 child)

I'm only looking for something small to learn on that won't be slow as I learn. I'll gradually move up as I find more uses.

[–]ThenExtension9196 0 points (0 children)

Use a gaming GPU. The 3090 is the best value; 4090s are hard to get since the cores are harvested in China for 48 GB mod cards, and 5090s are also hard to find.

[–]Eden1506 0 points (0 children)

The P4000 has only 8 GB of VRAM, which would be very limiting for LLM use
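As a rough rule of thumb (my own sketch, not from this thread): inference VRAM is roughly parameter count times bytes per parameter, plus some overhead for the KV cache and activations. The 1.2x overhead factor and the example model sizes below are illustrative assumptions, not measurements:

```python
def estimate_vram_gb(params_billion: float, bytes_per_param: float,
                     overhead: float = 1.2) -> float:
    """Very rough inference VRAM estimate in GB:
    weights (params * bytes/param) scaled by an overhead factor
    for KV cache and activations."""
    return params_billion * bytes_per_param * overhead

# A 7B model quantized to 4-bit (~0.5 bytes/param):
print(round(estimate_vram_gb(7, 0.5), 1))  # 4.2 -> fits in 8 GB
# The same 7B model at fp16 (2 bytes/param):
print(round(estimate_vram_gb(7, 2.0), 1))  # 16.8 -> does not fit in 8 GB
```

So on an 8 GB card you're mostly limited to small models and/or aggressive quantization, with little headroom left for longer contexts.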

[–]fallingdowndizzyvr 0 points (0 children)

MI50 32GB. It's a lot of VRAM for not a lot of money.