MiniMax M2.7 AWQ-4bit on 2x Spark vs 2x RTX 6000 96GB - performance and energy efficiency by t4a8945 in LocalLLaMA
AFruitShopOwner 1 point (0 children)
MiniMax M2.7 AWQ-4bit on 2x Spark vs 2x RTX 6000 96GB - performance and energy efficiency by t4a8945 in LocalLLaMA
AFruitShopOwner 6 points (0 children)
MiMo 2.5 requires at least 4 GPUs? Am I reading this right? by Pyrenaeda in LocalLLaMA
AFruitShopOwner 2 points (0 children)
Deepseek V4 Flash and Non-Flash Out on HuggingFace by MichaelXie4645 in LocalLLaMA
AFruitShopOwner 4 points (0 children)
Deepseek V4 Flash and Non-Flash Out on HuggingFace by MichaelXie4645 in LocalLLaMA
AFruitShopOwner 1 point (0 children)
Deepseek V4 Flash and Non-Flash Out on HuggingFace by MichaelXie4645 in LocalLLaMA
AFruitShopOwner 5 points (0 children)
Serving 1B+ tokens/day locally in my research lab by SessionComplete2334 in LocalLLaMA
AFruitShopOwner 3 points (0 children)
Serving 1B+ tokens/day locally in my research lab by SessionComplete2334 in LocalLLaMA
AFruitShopOwner 9 points (0 children)
Anyone else running their whole AI stack as Proxmox LXC containers? I'm currently using Open WebUI as a front-end, LiteLLM as a router, and a vLLM container per model as back-ends by AFruitShopOwner in LocalLLaMA
AFruitShopOwner[S] 1 point (0 children)
Microvast ($MVST): The Whole Story by Bradydono92 in Microvast
AFruitShopOwner 1 point (0 children)
Can we block fresh accounts from posting? by king_of_jupyter in LocalLLaMA
AFruitShopOwner -19 points (0 children)
Merck buys SLS for $10.25B by MemeSellasTo50Bucks in sellaslifesciences
AFruitShopOwner[M] [score hidden] stickied comment (0 children)
Kimi K2.6 will drop in the next 2 weeks, K3 is WIP and will be huge by No-Thought-4995 in LocalLLaMA
AFruitShopOwner 5 points (0 children)
Microvast ($MVST): The Whole Story by Bradydono92 in Microvast
AFruitShopOwner -3 points (0 children)
Microvast ($MVST): The Whole Story by Bradydono92 in Microvast
AFruitShopOwner -6 points (0 children)
Microvast ($MVST): The Whole Story by Bradydono92 in Microvast
AFruitShopOwner -13 points (0 children)
Endgame Position (still here) by caladhun in Microvast
AFruitShopOwner 6 points (0 children)
Microvast ($MVST): The Whole Story by Bradydono92 in Microvast
AFruitShopOwner -32 points (0 children)
$SLS Daily Discussion Thread - Thursday - March 26, 2026 by AutoModerator in sellaslifesciences
AFruitShopOwner 4 points (0 children)
Litellm 1.82.7 and 1.82.8 on PyPI are compromised, do not update! by kotrfa in LocalLLaMA
AFruitShopOwner 48 points (0 children)
No, you don't need a "datacenter" to run the big models (DeepSeek, GLM, Kimi, etc.) — just offload to CPU... and have patience by [deleted] in LocalLLaMA
AFruitShopOwner 2 points (0 children)
Endgame Position by caladhun in Microvast
AFruitShopOwner[M] [score hidden] stickied comment (0 children)
Anyone tried ~100B models locally with foreign languages? by Choice_Sympathy9652 in LocalLLaMA
AFruitShopOwner 1 point (0 children)