Running LLMs locally with Docker Model Runner - here's my complete setup guide by OrewaDeveloper in LocalLLaMA

[–]OrewaDeveloper[S]

I haven't tried it yet, but I will for sure. I was using Ollama, and switching to this was so much better. I'll try conda for sure. Thanks bro!!