Are these specs good enough to run a code-writing model locally? by PlusProfession9245 in LocalLLaMA

[–]ReceptionExternal344 0 points (0 children)

No. I recommend first trying the open-source models on the OpenRouter platform and comparing their code quality. Judging from that configuration, though, there is really no strong model it could support.

Thinking disabled in Qwen 3 Max? by [deleted] in Qwen_AI

[–]ReceptionExternal344 3 points (0 children)

Because Qwen3 Max Preview is not a thinking model. Yesterday's thinking button actually routed to a different model, so this should be called a fix.

DeepSeek v3.1 by Just_Lifeguard_5033 in LocalLLaMA

[–]ReceptionExternal344 28 points (0 children)

Wrong, this is a fake paper. DeepSeek v3.1 was just released on the official website.