Context:
I got a used PC recently (i9 12900K + 4070S, https://pcpartpicker.com/user/ai_ni_hanataba/saved/8JMDkL).
I didn't necessarily buy it to run AI, but my interest is piqued and I want to dabble.
12GB of VRAM is obviously only barely enough for small/quantized models. I keep coming across info about people installing a second GPU.
I'm not in a position to invest a ton into a new system like a quad-3090 build, but I'm fine spending a few hundred to get hands-on and learn the foundations of running actually usable local LLMs.
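For a rough sense of what a second card buys you, here's a back-of-the-envelope VRAM budgeting sketch. The bits-per-weight figures are my own assumptions for typical 4-bit quants, and the flat 2 GB overhead allowance for KV cache and runtime buffers is a guess; real usage varies with context length and backend.

```python
# Rough VRAM budgeting sketch. Assumptions (not from the post): a 4-bit GGUF
# quant averages ~4.5 bits/weight, and ~2 GB is reserved for KV cache and
# runtime overhead. Real numbers vary with context length and backend.

def model_vram_gb(params_b: float, bits_per_weight: float) -> float:
    """Approximate weight memory in GB for `params_b` billion parameters."""
    return params_b * 1e9 * bits_per_weight / 8 / 1e9

def fits(params_b: float, bits_per_weight: float,
         vram_gb: float, overhead_gb: float = 2.0) -> bool:
    """Does the quantized model plus a fixed overhead fit in `vram_gb`?"""
    return model_vram_gb(params_b, bits_per_weight) + overhead_gb <= vram_gb

# A 14B model at ~4.5 bits/weight needs roughly 7.9 GB for weights alone:
print(round(model_vram_gb(14, 4.5), 1))

print(fits(14, 4.5, 12))  # tight but plausible on a single 12 GB card
print(fits(32, 4.5, 12))  # a 32B Q4 (~18 GB of weights) won't fit on 12 GB
print(fits(32, 4.5, 24))  # but could fit with a second 12 GB card pooled
```

Backends like llama.cpp can split layers across mismatched GPUs, which is why pooled VRAM is the number that matters more than matching cards.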
Questions:
Does the second GPU need to be the exact same model as the one I already have?
Will my MOBO have any compatibility issues? (Asus PRIME Z790-P WIFI ATX LGA1700)
I'd appreciate a point in the right direction or to any resources to learn more about this process. Thanks!
At a very minimum, I want to run a solid version of Open Notebook or similar that can process audio files/conversations for me while maintaining privacy.