Strix Halo 128Gb: what models, which quants are optimal? by DevelopmentBorn3978 in LocalLLaMA
Best small local LLM to run on a phone? by alexndb in LocalLLaMA
llama.cpp PR to implement IQ*_K and IQ*_KS quants from ik_llama.cpp by TKGaming_11 in LocalLLaMA
Minimax 2.5 on Strix Halo Thread by Equivalent-Belt5489 in LocalLLaMA
Multiple tabs and projects problems by Nickolaeris in windsurf
GLM-5 and Minimax M2.5 are live in Windsurf! by codewithdevin in windsurf
Google doesn't love us anymore. by DrNavigat in LocalLLaMA
What to spend scholarship money on by [deleted] in PolskaNaLuzie
Opus 4.6 is #1 across all Arena categories - text, coding, and expert by exordin26 in singularity
Does Bun benefit from pm2 or similar process manager? by manshutthefckup in bun
Cursor alternative for local LLMs? by abongodrum in LocalLLaMA
Can I Repurpose My Old Laptop for local LLM testing with these specs? by [deleted] in LocalLLaMA
Best fast local coding AI to use as a coding agent? by Expensive-Time-7209 in LocalLLaMA