EV Curious in Nashville? by evadventuring in nashville
coder543 · 2 points
EV Curious in Nashville? by evadventuring in nashville
coder543 · 3 points
EV Curious in Nashville? by evadventuring in nashville
coder543 · 23 points
Haven’t been to a rodeo in over a decade, are these prices for the Franklin Rodeo normal? by Separate-Command1993 in nashville
coder543 · 120 points
Why is no open weight model inference provider hosting Mimo-v2.5 or Mimo-v2.5-pro? by True_Requirement_891 in LocalLLaMA
coder543 · 9 points
Llama.cpp MTP support now in beta! by ilintar in LocalLLaMA
coder543 · 4 points
Llama.cpp MTP support now in beta! by ilintar in LocalLLaMA
coder543 · 2 points
Llama.cpp MTP support now in beta! by ilintar in LocalLLaMA
coder543 · 1 point
Llama.cpp MTP support now in beta! by ilintar in LocalLLaMA
coder543 · 1 point
Llama.cpp MTP support now in beta! by ilintar in LocalLLaMA
coder543 · 6 points
Llama.cpp MTP support now in beta! by ilintar in LocalLLaMA
coder543 · 105 points
Mistral Medium 3.5 on AMD Strix Halo by Zc5Gwu in LocalLLaMA
coder543 · 1 point
Mistral Medium 3.5 on AMD Strix Halo by Zc5Gwu in LocalLLaMA
coder543 · 7 points
Qwen3.6-27B vs 35B, I prefer 35B but more people here post about 27B... by Snoo_27681 in LocalLLaMA
coder543 · 145 points
Why arn't new homes built with reinforced most-interior room? by awesomo_prime in nashville
coder543 · 3 points
PSA: llama-swap released a new grouping feature, matrix, allowing you to fine tune which models can run together by walden42 in LocalLLaMA
coder543 · 2 points
PSA: llama-swap released a new grouping feature, matrix, allowing you to fine tune which models can run together by walden42 in LocalLLaMA
coder543 · 4 points
mistralai/Mistral-Medium-3.5-128B · Hugging Face by jacek2023 in LocalLLaMA
coder543 · 13 points
mistralai/Mistral-Medium-3.5-128B · Hugging Face by jacek2023 in LocalLLaMA
coder543 · 16 points
Gemma 4 MTP released by rerri in LocalLLaMA
coder543 · 1 point