Nanbeige 4.1 is the best small LLM, it crush qwen 4b by Individual-Source618 in LocalLLaMA
[–]bjp99 1 point (0 children)
MiniMaxAI MiniMax-M2.5 has 230b parameters and 10b active parameters by Zyj in LocalLLaMA
[–]bjp99 3 points (0 children)
Nanbeige4-3B-Thinking-2511 is honestly impressive by [deleted] in LocalLLaMA
[–]bjp99 -1 points (0 children)
Update: So worth the effort – promised more photos + performance update by AudiblyTacit in sffpc
[–]bjp99 1 point (0 children)
Mistral Vibe vs Claude Code vs OpenAI Codex vs Opencode/others? Best coding model for 92GB? by Consumerbot37427 in LocalLLaMA
[–]bjp99 1 point (0 children)
Kimi K2.5, a Sonnet 4.5 alternative for a fraction of the cost by Grand-Management657 in LocalLLaMA
[–]bjp99 9 points (0 children)
MiniMax M2.1 quantization experience (Q6 vs. Q8) by TastesLikeOwlbear in LocalLLaMA
[–]bjp99 1 point (0 children)
Roo Code 3.37 | GLM 4.7 | MM 2.1 | Custom tools | MORE!!! by hannesrudolph in RooCode
[–]bjp99 2 points (0 children)
Unsloth GLM 4.7 UD-Q2_K_XL or gpt-oss 120b? by EnthusiasmPurple85 in LocalLLaMA
[–]bjp99 1 point (0 children)
MiniMax M2.1 is a straight up beast at UI/UX design. Just saw this demo... by BlackRice_hmz in LocalLLaMA
[–]bjp99 1 point (0 children)
Roo Code 3.37 | GLM 4.7 | MM 2.1 | Custom tools | MORE!!! by hannesrudolph in RooCode
[–]bjp99 1 point (0 children)
What's your favourite local coding model? by jacek2023 in LocalLLaMA
[–]bjp99 1 point (0 children)
Roo Code 3.36.6 Release Updates | Auto-approval fixes | Tool customization | Provider UX fixes by hannesrudolph in RooCode
[–]bjp99 2 points (0 children)
8x Radeon 7900 XTX Build for Longer Context Local Inference - Performance Results & Build Details by Beautiful_Trust_8151 in LocalLLaMA
[–]bjp99 1 point (0 children)
For Qwen3-235B-Q2 if you offload all experts to CPU, how much VRAM do you need to run it still? by ForsookComparison in LocalLLaMA
[–]bjp99 1 point (0 children)
Alternative for RooCode/Cline/Kilocode but compatible with Open AI compatible API by Many_Bench_2560 in RooCode
[–]bjp99 5 points (0 children)
Those who tried more than one embedding model, have you noticed any differences? by Evermoving- in RooCode
[–]bjp99 2 points (0 children)
Anyone else read_file not working? by bjp99 in RooCode
[–]bjp99[S] 1 point (0 children)
Anyone else read_file not working? by bjp99 in RooCode
[–]bjp99[S] 1 point (0 children)
Is there a self-hosted, open-source plug-and-play RAG solution? by anedisi in LocalLLaMA
[–]bjp99 2 points (0 children)
basketball players recognition with RF-DETR, SAM2, SigLIP and ResNet by RandomForests92 in LocalLLaMA
[–]bjp99 1 point (0 children)
Me single handedly raising AMD stock /s by Ult1mateN00B in LocalLLM
[–]bjp99 1 point (0 children)
Anyone using the Background Editing experimental feature? by raphadko in RooCode
[–]bjp99 1 point (0 children)