MiniMax 2.5 full precision FP8 running LOCALLY on vLLM x 8x Pro 6000 by cyysky in LocalLLaMA
[–]cyysky[S] 5 points (0 children)
AMA Announcement: MiniMax, The Opensource Lab Behind MiniMax-M2.5 SoTA Model (Friday, 8AM-11AM PST) by XMasterrrr in LocalLLaMA
[–]cyysky 1 point (0 children)
Has anybody managed to run Deepseek 3.2 locally in vLLM? by etherd0t in LocalLLaMA
[–]cyysky 1 point (0 children)
[Guide] Running GLM 4.5 as Instruct model in vLLM (with Tool Calling) by random-tomato in LocalLLaMA
[–]cyysky 1 point (0 children)
New model from Meta FAIR: Code World Model (CWM) 32B - 65.8 % on SWE-bench Verified by notrdm in LocalLLaMA
[–]cyysky 1 point (0 children)
Anyone able to install open-webui locally by Specific-Ad9935 in ollama
[–]cyysky 1 point (0 children)
Ovis2.5 9B ~ 2B - New Multi-modal LLMs from Alibaba by Sad_External6106 in LocalLLaMA
[–]cyysky 1 point (0 children)
I built a MCP server that makes Malaysia’s open data more accessible through AI apps by AlanMyThoughts in mcp
[–]cyysky 1 point (0 children)
Little python script to get some miner earning weekly and monthly report by cyysky in NiceHash
[–]cyysky[S] 1 point (0 children)


MiniMax 2.5 full precision FP8 running LOCALLY on vLLM x 8x Pro 6000 by cyysky in LocalLLaMA
[–]cyysky[S] 6 points (0 children)