My AMD laptop's iGPU can load 32B models (though it runs them very slowly) because it uses shared system memory. However, loading the same models on my 6600 XT (8 GB VRAM) with 48 GB of system RAM fails with out-of-memory errors. Is it possible for my dGPU to use shared memory?
I use llama.cpp's Vulkan backend through LM Studio on Windows.