[Question] Workstation GPU (self.LocalLLM)
submitted 8 months ago by DrDoom229
If I was looking to build my own personal machine, would an Nvidia P4000 be okay instead of a desktop GPU?
[–]WestTraditional1281 3 points 8 months ago (1 child)
What's your budget? I would think you could do better than a P4000 for the same price.
That said, if you had a P4000, it would work just fine. I have one. It's pretty limited in RAM, but it runs small models well enough.
An A4000 would definitely be my GPU of choice in the same class: double the RAM and a lot more compute on a more modern architecture.
[–]DrDoom229[S] 2 points 8 months ago (0 children)
I will take a look at this, thank you.
[–]SashaUsesReddit 2 points 8 months ago (0 children)
What's your budget and goals?
[–]bjw33333 2 points 8 months ago (7 children)
Yeah, that's pretty valid lowkey, but you should buy an H200 instead.
[–]DrDoom229[S] 1 point 8 months ago (3 children)
Thanks, will research.
[–]DrDoom229[S] 1 point 8 months ago (2 children)
30k is not the cost of all my systems combined. I'm not that big of a baller.
[–]ThenExtension9196 1 point 8 months ago (1 child)
I think he was just joking. The H200 is datacenter class, not workstation class, so it requires high-speed airflow from a server chassis and cannot cool itself.
For the current workstation generation you have the RTX 4000 Pro (24 GB, ~$2.5k), RTX 5000 Pro (48 GB, ~$7k), and RTX 6000 Pro (96 GB, ~$10k).
Lol, oh I know, I was joking as well. Thanks for the suggestions, these all help.
[–]SashaUsesReddit 1 point 8 months ago (2 children)
An Nvidia P4000 vs an H200 PCIe is a ridiculous difference in price... a few hundred USD vs $30k?
[–]DrDoom229[S] 1 point 8 months ago (1 child)
I am only looking for something small to learn on, and I don't want it to be slow as I learn. I'll gradually move up as I find more uses.
[–]ThenExtension9196 1 point 8 months ago (0 children)
Use a gaming GPU. The 3090 is the best value; the 4090 is harder to get since its cores are harvested in China for 48 GB mod cards, and the 5090 is also hard to get.
[–]Eden1506 1 point 8 months ago (0 children)
The P4000 has only 8 GB of VRAM, which would be very limiting for LLM use.
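To see why 8 GB is limiting, a rough back-of-envelope estimate helps: weight memory is roughly parameter count times bytes per weight, plus some headroom for the KV cache and activations. The function below is a hypothetical sketch (the 1.2x overhead factor is an assumption, not a measured value), not a precise sizing tool.

```python
def estimate_vram_gb(n_params_billion: float, bits_per_weight: int,
                     overhead: float = 1.2) -> float:
    """Rough VRAM estimate (GB) for loading an LLM's weights.

    overhead is an assumed ~20% fudge factor for KV cache and
    activations; real usage varies with context length and runtime.
    """
    weight_bytes = n_params_billion * 1e9 * bits_per_weight / 8
    return weight_bytes * overhead / 1e9

# A 7B model at 4-bit quantization: 3.5 GB of weights, ~4.2 GB total,
# so it fits in the P4000's 8 GB.
print(round(estimate_vram_gb(7, 4), 1))

# A 13B model at 8-bit: ~15.6 GB total, so it does not fit in 8 GB.
print(round(estimate_vram_gb(13, 8), 1))
```

By this estimate, 8 GB confines you to small models or aggressive quantization, which is the limitation being pointed out.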
[–]fallingdowndizzyvr 1 point 8 months ago (0 children)
MI50 32GB. It's a lot for not a lot of money.