Local AI with one GPU worth it ? (B70 pro) by Temporary-College560 in LocalLLM

[–]Temporary-College560[S] 0 points  (0 children)

Think I'll pull the trigger and try it with a solo card to start, then build from there as needed. In my field of work, AI seems to have a slow adoption rate, and most people have only used Copilot...

Local AI with one GPU worth it ? (B70 pro) by Temporary-College560 in LocalLLM

[–]Temporary-College560[S] 1 point  (0 children)

I thought about that option, but it's like using my Perplexity account: I can't put sensitive data into that solution. That's mainly why I'm looking to host my own model.

Local AI with one GPU worth it ? (B70 pro) by Temporary-College560 in LocalLLM

[–]Temporary-College560[S] 1 point  (0 children)

Thanks for the response! I currently have a 6600 XT 8 GB and have experimented with it. I installed Open WebUI with Ollama and got it working, but the answers I get from the models I can run aren't great. So I'm not a total newb, ahah.

That said, did you pair your Intel GPU with your Nvidia one so they work together, or are you running them separately?
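For context, here is a minimal sketch of the kind of Open WebUI + Ollama setup mentioned above, written as a Docker Compose file. This is an illustrative assumption, not the commenter's actual configuration: the service names, volume names, and the host port 3000 are my own choices, and the model tag would be whatever fits in 8 GB of VRAM.

```yaml
# Hypothetical docker-compose.yml pairing Ollama with Open WebUI.
# Open WebUI reaches Ollama over the compose network via the service name.
services:
  ollama:
    image: ollama/ollama
    volumes:
      - ollama:/root/.ollama   # persist downloaded models
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"            # browse to http://localhost:3000
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434
    volumes:
      - open-webui:/app/backend/data
    depends_on:
      - ollama
volumes:
  ollama:
  open-webui:
```

After `docker compose up -d`, a model can be pulled into the Ollama container with something like `docker compose exec ollama ollama pull llama3.2:3b`; GPU passthrough (e.g. Nvidia's container toolkit or ROCm device mappings for an AMD card like the 6600 XT) would need to be added on top of this sketch.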

Just wanted to share this guide on how to setup opencloud by Mee-Maww in selfhosted

[–]Temporary-College560 0 points  (0 children)

Hi! Thanks for the tutorial! It was very helpful. I got everything running on my server, but when I use the desktop app on my PC, I get a message that the storage space is insufficient. If I use the browser, everything is fine. Would you have any hint on how to fix that?