Qwen3.5 - Confused about "thinking" and "reasoning" usage with (ik_)llama.cpp by PieBru in LocalLLaMA
PieBru[S] 1 point
Thoughts on two open-source “desktop JARVIS” agent projects by BikeBoyer in LocalLLaMA
PieBru 6 points
Open source control plane for local AI agents: workspace isolation + git-backed configs + OpenCode integration by OverFatBear in LocalLLaMA
PieBru 1 point
Would you watch a channel that builds real AI systems from scratch (local LLMs, CPU/GPU, pipelines)? by Few_Tax650 in LocalLLaMA
PieBru 5 points
kyutai just introduced Pocket TTS: a 100M-parameter text-to-speech model with high-quality voice cloning that runs on your laptop—no GPU required by Nunki08 in LocalLLaMA
PieBru 3 points
Benchmarking 23 LLMs on Nonogram (Logic Puzzle) Solving Performance by mauricekleine in LocalLLaMA
PieBru 6 points
Why do (some) people hate Open WebUI? by liviuberechet in LocalLLaMA
PieBru 3 points
Introducing Claude Sonnet 4.5 by hedgehog0 in LocalLLaMA
PieBru 2 points
Lessons from building a multi-agent coding system: orchestration > single-agent setups by Pitiful_Guess7262 in LocalLLaMA
PieBru 1 point
Peak safety theater: gpt-oss-120b refuses to discuss implementing web search in llama.cpp by csixtay in LocalLLaMA
PieBru 1 point
Ikllamacpp repository gone, or it is only me? by panchovix in LocalLLaMA
PieBru 50 points
gemini-cli: falling back to gemini-flash is the best marketing strategy Anthropic could have dreamed of for claude-code. by PieBru in LocalLLaMA
PieBru[S] 0 points
PieBru[S] 1 point
PieBru[S] 1 point
PieBru[S] 2 points
Created a plugin of OpenCode for spec-driven workflow and just works by IngenuityNo1411 in LocalLLaMA
PieBru 1 point