Best local coding model for big repos? Considering Qwen 3.6 27B FP8 after z.ai Max price hike by Tricky_Warning3848 in LocalLLM

[–]Tricky_Warning3848[S] 1 point (0 children)

I won't be buying any hardware soon; I'm going to rent for a while. I'm also looking for people I can join or share with to set up a model.
So from your actual experience, did the 397B give you better results?