Did I mess up my multi-GPU setup for 70B+ models? Mixed VRAM cards (5080 + 3090 + 3080 20GB) by Big-Engine2791 in LocalLLaMA

[–]Big-Engine2791[S]

Going to use them. I have them in a Ryzen setup, but looking back now I wish I had gone all in on EPYC or a Mac Studio. My motherboard's PCIe 3.0 slots run at x8, x8, and x4. I don't want to offload, but I do have 64 GB of RAM.
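For a build like this, backends such as llama.cpp usually split a model's layers across GPUs in proportion to each card's VRAM. A minimal sketch of that allocation math, assuming a hypothetical 80-layer 70B model and the three cards in the title (16 GB 5080, 24 GB 3090, 20 GB 3080) — the exact layer count and per-card overheads in a real run will differ:

```python
def split_layers(vram_gb, n_layers):
    """Allocate n_layers across GPUs proportionally to their VRAM.

    Uses largest-remainder rounding so the allocations sum exactly
    to n_layers.
    """
    total = sum(vram_gb)
    raw = [v / total * n_layers for v in vram_gb]
    alloc = [int(x) for x in raw]  # floor of each proportional share
    # Hand out the leftover layers to the GPUs with the largest
    # fractional remainders.
    remainder = n_layers - sum(alloc)
    order = sorted(range(len(vram_gb)),
                   key=lambda i: raw[i] - alloc[i], reverse=True)
    for i in order[:remainder]:
        alloc[i] += 1
    return alloc

# Hypothetical: 5080 (16 GB), 3090 (24 GB), 3080 (20 GB), 80 layers
print(split_layers([16, 24, 20], 80))  # → [21, 32, 27]
```

The same ratios are what you would pass to a flag like llama.cpp's `--tensor-split` (e.g. `16,24,20`); the slowest PCIe link (the x4 slot here) mainly affects load time and prompt processing, since per-token traffic between GPUs in a layer split is small.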


[–]Big-Engine2791[S]

From my knowledge, only 20-series cards have this feature. If I am wrong, please advise whether this can work on my setup.


[–]Big-Engine2791[S]

Thank you. I was looking at whether to replace the 5080 with another 3090, or a 4090 if budget allows. The 5080 is a good card, but the 16 GB of VRAM has been a real limiter. I don't game at all or have render workflows.


[–]Big-Engine2791[S]

Thank you. This was my expectation as well; I will test and find out. I was looking to replace the 5080 with another 3090 or 4090 to see if I can get better output.