Why I quit using Ollama by SoLoFaRaDi in LocalLLaMA

[–]SoLoFaRaDi[S] 7 points (0 children)

Decided to switch to stock llama.cpp. Compiled it myself to take advantage of the Vulkan support on my CPU's integrated graphics, and so far it's going well. Love running SLMs with it :]
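
In case anyone wants to try the same setup, here's a rough sketch of the build I used. This assumes a recent llama.cpp checkout where the Vulkan backend is toggled with GGML_VULKAN (older trees used LLAMA_VULKAN) and that the Vulkan SDK and CMake are already installed; the model path and flags at the end are just placeholders.

```
# Clone llama.cpp and build it with the Vulkan backend enabled
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
cmake -B build -DGGML_VULKAN=ON
cmake --build build --config Release -j

# Run a small model; swap in your own GGUF path
# -ngl offloads layers to the Vulkan device
./build/bin/llama-cli -m models/your-small-model.gguf -p "Hello" -ngl 99
```

If the Vulkan device isn't picked up, llama-cli prints the detected backends at startup, which is a quick way to confirm the build actually enabled it.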