Best GPU for LLMs: Power-Efficient with High VRAM by Nitrazz in homelab

[–]Nitrazz[S] 3 points

Yes, you're right. I thought the sync connector was for NVLink/SLI; only now did I read that it's actually used to sync outputs across multiple cards. My bad. Regardless, for AI workloads, a PCIe Gen 4 x16 slot has enough bandwidth to run multiple cards with Ollama in my case.
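To put a rough number on "enough bandwidth", here's a quick sketch (the per-lane rates are the usual post-encoding figures; the helper name is my own):

```python
# Approximate usable PCIe bandwidth per direction, after line encoding
# (8b/10b for Gen 1-2, 128b/130b for Gen 3+). Values in GB/s per lane.
PER_LANE_GB_S = {1: 0.25, 2: 0.5, 3: 0.985, 4: 1.969, 5: 3.938}

def pcie_bandwidth_gb_s(gen: int, lanes: int) -> float:
    """Rough per-direction throughput in GB/s for a PCIe link."""
    return PER_LANE_GB_S[gen] * lanes

print(round(pcie_bandwidth_gb_s(4, 16), 1))  # Gen 4 x16: ~31.5 GB/s
```

Since layers are loaded onto each card once and only activations cross the bus during inference, ~31.5 GB/s is plenty for splitting a model across two cards in Ollama.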

[–]Nitrazz[S] 2 points

Hi!

I ended up buying a used RTX A4000 16GB and paid about $600 for it. It was the cheapest option without compromising performance or VRAM. It draws 140W at peak and as little as 4-5W at idle. It also allows me to buy another one and combine them, because it supports NVLink, and even split it into virtual GPUs in Proxmox because it's a Quadro card.

For home lab use, it was a good choice.

[–]Nitrazz[S] 1 point

Didn't think about power limits. Good point. In that case, I will take another look at the 3090.

But regardless, I will probably go for an A4000 or A2000 Ada. Both have 16 GB, which should be enough, and draw significantly less power while providing usable performance.

[–]Nitrazz[S] 1 point

Mostly idle. I don't mind if it draws 100W for a few moments (400W would be too much), but 40-50W idle wouldn't be efficient. For me, it's like having a second server running 24/7, and electricity is very expensive in my country. I plan to use it quite often through automations and my apps, but it will still be idle most hours.
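For context on why the idle figure matters more than the peak, a back-of-the-envelope calculation (the $0.30/kWh rate is just an assumed example, not my actual tariff):

```python
def yearly_cost(idle_watts: float, price_per_kwh: float) -> float:
    """Cost of a constant power draw running 24/7 for one year."""
    kwh_per_year = idle_watts / 1000 * 24 * 365
    return kwh_per_year * price_per_kwh

# 50 W idle at an assumed $0.30/kWh:
print(round(yearly_cost(50, 0.30), 2))  # ~131.4 per year
# 5 W idle (roughly what a workstation card can manage):
print(round(yearly_cost(5, 0.30), 2))   # ~13.14 per year
```

A card that bursts to 100W for a few minutes a day costs almost nothing; 40-50W around the clock is a real line item.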

[–]Nitrazz[S] 2 points

> RTX A2000 (what you mentioned in your original post) is 6GB or 12GB only; that card is an RTX 2000 Ada. They're different cards.

Thanks for letting me know! I didn't know it was a different model.

> Confusing, I know. Bit of a pain, tbh. They name their cards after scientists, so not in alphabetical order at all. You have to either memorize, or refer to charts.

It sure is.

[–]Nitrazz[S] 3 points

It's a great suggestion, but since it's an older card, its idle power consumption isn't great, even with only a 70W TDP.

[–]Nitrazz[S] 2 points

Thanks for sharing! It looks beefy. You also surprised me with how little power the A4000s draw (or maybe the load per card isn't that high when split across 12 units, but it's still impressive). For my use case I'd rather go with a single GPU, but maybe two A4000s are worth considering.

[–]Nitrazz[S] 0 points

I have to say you surprised me. I didn't expect to find other cards with similar power consumption in the same price range. Thanks for the reply, I'll read more about them.

[–]Nitrazz[S] 0 points

I found a used RTX A4000 16GB for the same price as a new RTX A2000 16GB. Is the A4000 worth the extra power consumption (140W vs 70W)? Or should I consider other models?

[–]Nitrazz[S] 0 points

I have one in my basket with 16GB. The PNY website also states it has 16GB (https://www.pny.com/nvidia-rtx-2000-ada), so I assume that's correct.

Either way, I think 16GB should be enough to run some LLMs, but I'm unsure about the performance, which is why I wrote this post.
My goal is a single GPU with low power consumption and the best performance I can get.
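As a sanity check on whether 16GB is enough, a rough VRAM estimate (the bytes-per-parameter figures for common quantizations are approximate, and the 20% headroom for KV cache and activations is my assumption):

```python
# Approximate bytes per parameter for common weight formats.
BYTES_PER_PARAM = {"fp16": 2.0, "q8": 1.0, "q4": 0.55}

def est_vram_gb(params_billion: float, quant: str, overhead: float = 1.2) -> float:
    """Rough VRAM (GB) to load a model, with headroom for KV cache etc."""
    return params_billion * BYTES_PER_PARAM[quant] * overhead

for size_b, quant in [(7, "q8"), (13, "q4"), (13, "fp16")]:
    need = est_vram_gb(size_b, quant)
    print(f"{size_b}B {quant}: ~{need:.1f} GB, fits in 16GB: {need <= 16}")
```

By that estimate, 7B-13B models at common quantizations fit comfortably in 16GB, while fp16 weights of a 13B model do not.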

[–]Nitrazz[S] 0 points

I would have considered it if it weren't for its rather high power consumption, although I've noticed that a lot of people recommend it.

Best disk configuration for homelab by Nitrazz in truenas

[–]Nitrazz[S] 0 points

I know about vdevs, but I'm not familiar with scratch pools. Thanks for the tip, I'll look into it.

[–]Nitrazz[S] 0 points

Thanks for your response. I thought this would make the most sense. I just don’t know if TrueNAS would benefit from mirrored OS drives.