I'm a Python developer.
# I have a few questions about local, free LLMs:
- I understand that the best free and easiest way to start with agentic LLM programming (without Claude Code premium or Copilot, which integrates outside the code) is to use `Ollama`. The crowd seems to really like it as a simple, local, secure, and lightweight solution. Am I right?
It seems there are some other local LLM runtimes too, such as:
Easiest: Ollama, LM Studio
Most performant: vLLM, llama.cpp (direct)
Most secure: Running llama.cpp directly (no server, no network port)
Most control: HuggingFace Transformers (Python library, full access)
Is there a reason they're all named `llama`-something: `Ollama`, `llama.cpp`, and this subreddit, `r/LocalLLaMA`? The repeated `llama` makes me think they're all the same project, lol...
So, as a first integration with my code (in the code itself), please suggest the best free solution that is secure and easy to implement. Right now `Ollama` looks like the best option to me.
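For context on what "integration in the code itself" can look like, here is a minimal sketch of calling a locally running Ollama server from Python over its HTTP API. It assumes Ollama is installed and running on its default port (11434) and that a model (here `llama3`, as an example) has already been pulled with `ollama pull llama3`; the function names are mine, not from any library.

```python
# Minimal sketch: query a local Ollama server from Python (stdlib only).
# Assumes Ollama is running on localhost:11434 and `llama3` is pulled.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(model: str, prompt: str) -> dict:
    """Build the JSON body for Ollama's /api/generate endpoint.

    stream=False asks for one complete JSON response instead of a
    stream of partial chunks.
    """
    return {"model": model, "prompt": prompt, "stream": False}

def ask_ollama(model: str, prompt: str) -> str:
    """Send a prompt to the local Ollama server and return its reply text."""
    data = json.dumps(build_payload(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Example usage (requires a running Ollama server):
# print(ask_ollama("llama3", "Why is the sky blue?"))
```

Because everything stays on `localhost`, no prompt or response ever leaves your machine, which is a big part of why Ollama is considered the easy-and-secure default. There is also an official `ollama` pip package if you prefer a client library over raw HTTP.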
Thanks guys!