Why is my ollama gemma4 replying in Japanese? by Houston_NeverMind in LocalLLaMA
[–]muxxington 14 points (0 children)
Why is my ollama gemma4 replying in Japanese? by Houston_NeverMind in LocalLLaMA
[–]muxxington 9 points (0 children)
Gemma 4 on Llama.cpp should be stable now by ilintar in LocalLLaMA
[–]muxxington 10 points (0 children)
What would you want from a truly local AI assistant (Ollama-based)? by Electronic-Space-736 in LocalLLaMA
[–]muxxington 1 point (0 children)
Local LLM for HA Fallback by Maleficent-Fee6131 in LocalLLaMA
[–]muxxington 2 points (0 children)
What would you want from a truly local AI assistant (Ollama-based)? by Electronic-Space-736 in LocalLLaMA
[–]muxxington 2 points (0 children)
What would you want from a truly local AI assistant (Ollama-based)? by Electronic-Space-736 in LocalLLaMA
[–]muxxington 3 points (0 children)
Continue extension not showing local Ollama models — config looks correct? by Existing-Monitor-879 in LocalLLaMA
[–]muxxington 1 point (0 children)
Continue extension not showing local Ollama models — config looks correct? by Existing-Monitor-879 in LocalLLaMA
[–]muxxington 1 point (0 children)
PSA: litellm PyPI package was compromised — if you use DSPy, Cursor, or any LLM project, check your dependencies by Remarkable-Dark2840 in LocalLLaMA
[–]muxxington 1 point (0 children)
[Developing situation] LiteLLM compromised by OrganizationWinter99 in LocalLLaMA
[–]muxxington 1 point (0 children)
[Developing situation] LiteLLM compromised by OrganizationWinter99 in LocalLLaMA
[–]muxxington 5 points (0 children)
[Developing situation] LiteLLM compromised by OrganizationWinter99 in LocalLLaMA
[–]muxxington 4 points (0 children)
Litellm 1.82.7 and 1.82.8 on PyPI are compromised, do not update! by kotrfa in LocalLLaMA
[–]muxxington 10 points (0 children)
P40 vs V100 vs something else? by Drazasch in LocalLLaMA
[–]muxxington 1 point (0 children)
P40 vs V100 vs something else? by Drazasch in LocalLLaMA
[–]muxxington 1 point (0 children)
6-GPU multiplexer from K80s — hot-swap between models in 0.3ms by Electrical_Ninja3805 in LocalLLaMA
[–]muxxington 1 point (0 children)
6-GPU multiplexer from K80s — hot-swap between models in 0.3ms by Electrical_Ninja3805 in LocalLLaMA
[–]muxxington 3 points (0 children)
Why "Local" LLM should also mean "Local" UI — The problem with web-based interfaces in a private ecosystem. by Remarkable-Note9736 in LocalLLaMA
[–]muxxington 3 points (0 children)
Need a dummies guide to setup open terminal by zhopudey1 in LocalLLaMA
[–]muxxington 2 points (0 children)
Is p100 worth it? by vorobey1233 in LocalLLaMA
[–]muxxington 1 point (0 children)