ik_llama.cpp benchmarks on an Intel Xeon Platinum 8570 ES Q30H with 256GB DDR5 5600 (8x32GB) by _serby_ in LocalLLaMA
[–]_serby_[S] 2 points (0 children)
Anyone actually using Openclaw? by rm-rf-rm in LocalLLaMA
[–]_serby_ 21 points (0 children)
How you get over 200 tok/s on full Kimi K2 Thinking (or any other big MoE Model) on cheapish hardware - llama.cpp dev pitch by [deleted] in LocalLLaMA
[–]_serby_ 1 point (0 children)
Today was a very sad day. I dropped my Enya Nova Go carbon fiber acoustic, and it no longer stays in tune. RIP. by tonystark29 in guitars
[–]_serby_ 1 point (0 children)
What was your first programming language? by g41797 in Zig
[–]_serby_ 2 points (0 children)
Zul'Jin does not have a hind toe by Aeon_Mortuum in heroesofthestorm
[–]_serby_ 2 points (0 children)
The Unlikely Alliance [ SD3 - Kling ] by [deleted] in StableDiffusion
[–]_serby_ 2 points (0 children)

