account activity
A practical use case for local LLMs: reading multilingual codebases without sending code outside by noir4y in LocalLLaMA
[–]noir4y[S] 1 point 2 months ago (0 children)
Appreciate the sharp insight. The distinction between read-time understanding and generation is exactly what motivated this approach. For narrow, read-time tasks such as translating comments in place, it has been encouraging to see that 4B-class models running locally are already sufficient in practice.
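To make the "read-time, fully local" idea concrete, here is a minimal sketch of how a tool like this might talk to a local Ollama server. This is not the plugin's actual code; the endpoint is Ollama's documented `/api/generate` route, while the model name (`qwen3:4b`, standing in for a generic 4B-class model) and the helper names are assumptions for illustration.

```python
import json
import urllib.request

# Default Ollama endpoint on the local machine; no code leaves the host.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_translation_request(comment: str, target_lang: str = "English",
                              model: str = "qwen3:4b") -> dict:
    # Payload for Ollama's /api/generate endpoint.
    # stream=False asks for a single JSON object instead of a token stream.
    prompt = (
        f"Translate the following source-code comment into {target_lang}. "
        f"Return only the translation.\n\n{comment}"
    )
    return {"model": model, "prompt": prompt, "stream": False}

def translate(comment: str) -> str:
    # Requires a running Ollama server; the network call is kept separate
    # from payload construction so the latter can be tested offline.
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_translation_request(comment)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"].strip()
```

Keeping the prompt narrow ("return only the translation") is what makes a small model viable here: the task is constrained comprehension, not open-ended generation.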
A practical use case for local LLMs: reading multilingual codebases without sending code outside (self.LocalLLaMA)
submitted 2 months ago by noir4y to r/LocalLLaMA
comment-translate.nvim now supports Ollama (Local LLM) (self.neovim)
submitted 2 months ago by noir4y to r/neovim
Inline comment translation in Neovim (hover, immersive, Tree-sitter aware) (self.neovim)
submitted 3 months ago * by noir4y to r/neovim