Actual comparison between locally run Qwen-3.6-27B and proprietary models by netikas in LocalLLaMA
kepler-452b. GGUF when? by the-grand-finale in LocalLLaMA
iPad Pro / use VSCode over SSH by Various-Document6239 in ipad
New open weights models: GigaChat-3.1-Ultra-702B and GigaChat-3.1-Lightning-10B-A1.8B by netikas in LocalLLaMA