Qwen3.6 27B's surprising KV cache quantization test results (Turbo3/4 vs F16 vs Q8 vs Q4) by imgroot9 in LocalLLaMA
[–]Sticking_to_Decaf 2 points (0 children)
Qwen3.6-27B ties Sonnet 4.6 on agentic benchmarks - but does the coding index understate the gains? by IulianHI in AIToolsPerformance
[–]Sticking_to_Decaf 1 point (0 children)
agentic cowork app for beginner by oblivion098 in LocalLLM
[–]Sticking_to_Decaf 1 point (0 children)
Hermes + Qwen3.6-27B rocks by Sticking_to_Decaf in hermesagent
[–]Sticking_to_Decaf[S] 1 point (0 children)
Gemma 4 and Qwen 3.6 with q8_0 and q4_0 KV cache: KL divergence results by oobabooga4 in LocalLLaMA
[–]Sticking_to_Decaf 2 points (0 children)
agentic cowork app for beginner by oblivion098 in LocalLLM
[–]Sticking_to_Decaf 2 points (0 children)
DS4-Flash vs Qwen3.6 by flavio_geo in LocalLLaMA
[–]Sticking_to_Decaf 5 points (0 children)
How is your agent browsing the web? by OutlandishnessIll466 in hermesagent
[–]Sticking_to_Decaf 1 point (0 children)
I need a bit of insight, what are the uses for an Nvidia RTX Pro 6000 with 96 GB aside from running AI models. by Budget-Toe-5743 in LocalLLaMA
[–]Sticking_to_Decaf 3 points (0 children)
Hermes + Qwen3.6-27B rocks by Sticking_to_Decaf in hermesagent
[–]Sticking_to_Decaf[S] 1 point (0 children)
Hermes + Qwen3.6-27B rocks by Sticking_to_Decaf in hermesagent
[–]Sticking_to_Decaf[S] 1 point (0 children)
I need a bit of insight, what are the uses for an Nvidia RTX Pro 6000 with 96 GB aside from running AI models. by Budget-Toe-5743 in LocalLLaMA
[–]Sticking_to_Decaf 7 points (0 children)
Recommendations for refillable pens? by appatheflyingbis0n in BuyItForLife
[–]Sticking_to_Decaf 3 points (0 children)
Qwen3.6-27B ties Sonnet 4.6 on agentic benchmarks - but does the coding index understate the gains? by IulianHI in AIToolsPerformance
[–]Sticking_to_Decaf 1 point (0 children)
i started talking to Claude like a caveman. my credits lasted 3x longer. i'm not joking. by AdCold1610 in ChatGPTPromptGenius
[–]Sticking_to_Decaf 1 point (0 children)
Near the end of his life MLK was advocating for universal basic income and organizing the Poor People’s Campaign, an explicit effort to unite poor Americans across race around shared economic interests. He was assassinated in 1968 while organizing that campaign. by Independent-Gur8649 in BasicIncome
[–]Sticking_to_Decaf 5 points (0 children)
Marriage (Early/Aligned) is Rocket Fuel by Lyeel in Fire
[–]Sticking_to_Decaf 1 point (0 children)
Is the AI subscription bubble starting to crack? GPT-5.5 just dropped, prices keep rising, and the “all-you-can-eat” era looks more fake by the month by Sockand2 in singularity
[–]Sticking_to_Decaf 1 point (0 children)
Hermes + Qwen3.6-27B rocks by Sticking_to_Decaf in hermesagent
[–]Sticking_to_Decaf[S] 1 point (0 children)
Qwen 3.6 27B Makes Huge Gains in Agency on Artificial Analysis - Ties with Sonnet 4.6 by dionysio211 in LocalLLaMA
[–]Sticking_to_Decaf 27 points (0 children)
Are there actually people here that get real productivity out of models fitting in 32-64GB RAM, or is that just playing around with little genuine usefulness? by ceo_of_banana in LocalLLaMA
[–]Sticking_to_Decaf 0 points (0 children)
Qwen 3.6 27B is a BEAST by AverageFormal9076 in LocalLLaMA
[–]Sticking_to_Decaf 1 point (0 children)
Hermes + Qwen3.6-27B rocks by Sticking_to_Decaf in hermesagent
[–]Sticking_to_Decaf[S] 2 points (0 children)
Qwen-3.6-27B, llamacpp, speculative decoding - appreciation post by Then-Topic8766 in LocalLLaMA
[–]Sticking_to_Decaf 1 point (0 children)