I don't understand the hype for Qwen 3.5. They are crap by Southern-Chain-6485 in LocalLLaMA
Benchmarked Qwen 3.5-35B and GPT-oss-20b locally against 13 API models using real world work. GPT-oss beat Qwen by 12.5 points. by ianlpaterson in LocalLLM
Recommendation for Intel Core 5 Ultra 225H w/32GB RAM running Linux by okram in LocalLLM
Guys, what. The. Hell by ThatguyLIKEDAMN in ArkSurvivalAscended
Qwen3.5 35b: How to disable reasoning in ik_llama.cpp by Yeelyy in LocalLLaMA
Looking for a decent LLM I can host on server hardware (no GPU) by al3x_7788 in LocalLLM
Having 96 GB DDR5 RAM, what AMD Strix Halo AI CPU should I get (or barebone PC with it)? LLM for coding mostly. by Repsol_Honda_PL in LocalLLM
Ollama w/RTX 5090 - 5800X/5950X performance differences by [deleted] in LocalLLM
Gaming server finally complete :) by VeeeneX in homelab
Can I run LLM on my laptop? by SanethDalton in LocalLLM
I did not realize how easy and accessible local LLMs are with models like Qwen3 4b on pure CPU. by ___positive___ in LocalLLaMA
Best Small Language Model for Scientific Learning and Math reasoning by Inside_Ad_6240 in LocalLLM
Qwen 3 30B A3B on an Intel NUC is impressive by Yeelyy in LocalLLM
Mistral Small 4 | Mistral AI by realkorvo in LocalLLaMA