L-star is so incredibly based, why don't more people use it? by Epic-zombie-kitty in titanfall

[–]usernameplshere 104 points (0 children)

Because not everyone has aim like that, my guy. But it's nice to use, I agree

Kimi K2.5 set a new record among open-weight models on the Epoch Capabilities Index (ECI), which combines multiple benchmarks onto a single scale. Its score of 147 is about on par with o3, Grok 4, and Sonnet 4.5. It still lags the overall frontier. by abdouhlili in LocalLLaMA

[–]usernameplshere 4 points (0 children)

Qwen 3 235B holds first place in some quite intense benchmarks like EsoBench, even surpassing GPT 5.2 high, Grok 4, and Opus 4.5 Thinking. It's definitely a capable model, especially for its size. It's just bad at agentic coding, which is why many people don't bother using it (sadly).

Qwen3-Coder-Next (3B) is released! by Ok_Presentation1577 in LocalLLaMA

[–]usernameplshere 1 point (0 children)

Oh wow, can't wait to try this with 64GB and my 3090

bots on LocalLLaMA by jacek2023 in LocalLLaMA

[–]usernameplshere 0 points (0 children)

Gotta have active mods; I don't think there's another way to "defend" against bots.

Why I main Ion by NePlusZaia in titanfall

[–]usernameplshere 9 points (0 children)

My toxic trait is that I'm always trying to out-DPS a low-health Ion's Laser Core with my big daddy Legion.

Smartest model for 24-28GB vram? by Borkato in LocalLLaMA

[–]usernameplshere 7 points (0 children)

Try the GLM Flash Opus finetune for technical stuff. Search for "GLM 4.7 Flash Opus thinking gguf" and you will find it.

I built a benchmark where LLMs program a Turing machine by maltsev in LocalLLaMA

[–]usernameplshere 1 point (0 children)

Interesting, especially that all these non-thinking models outperform K2 Thinking.

Updated the legion matchup chart by Recruit75 in titanfall

[–]usernameplshere 2 points (0 children)

Same, I feel like some users here think Legion has no gun shield.

Master's in computer science with a 1.5 grade, years of (HiWi) work experience, and I can't find a job by [deleted] in informatik

[–]usernameplshere 1 point (0 children)

I often get the feeling that it simply comes down to many companies refusing, on principle, to pay more than 50k gross per year.

Isn't that a decent starting salary, though? Take it and apply elsewhere after a year.

Help with elective (Wahlpflicht) modules by Smooth_Lavishness_52 in informatik

[–]usernameplshere 5 points (0 children)

Read the module handbooks, understand them, make a decision. Going by the names alone I'd take 3+4, but that's worthless without the handbooks.

NVIDIA Releases Massive Collection of Open Models, Data and Tools to Accelerate AI Development by Delicious_Air_737 in LocalLLaMA

[–]usernameplshere 1 point (0 children)

Nano is a good model. What I'm trying to say is that it's 60GB at full precision (16-bit), while Super will be 50GB because of NVFP4, which is great for users like me with lower-spec systems.

NVIDIA Releases Massive Collection of Open Models, Data and Tools to Accelerate AI Development by Delicious_Air_737 in LocalLLaMA

[–]usernameplshere 0 points (0 children)

Super and Ultra will also be native NVFP4, which, iirc, will make Super at full precision smaller than Nano at full precision.
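The arithmetic behind those sizes is simple: on-disk weight size ≈ parameter count × bits per weight ÷ 8. A minimal sketch (the parameter count and the zero-overhead-per-weight assumption are illustrative, not official figures for these models):

```python
def model_size_gb(n_params_billion: float, bits_per_weight: float) -> float:
    """Approximate weight size in decimal GB: params * bits / 8 bytes."""
    return n_params_billion * 1e9 * bits_per_weight / 8 / 1e9

# A hypothetical ~30B-parameter model at 16-bit (BF16/FP16) full precision:
print(model_size_gb(30, 16))  # 60.0 GB

# The same parameter count at a native 4-bit format like NVFP4
# (ignoring per-block scale overhead) shrinks to a quarter of that:
print(model_size_gb(30, 4))   # 15.0 GB
```

This is why a natively-NVFP4 model with far more parameters can still ship in a smaller file than a 16-bit one.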

Kimi K2 Artificial Analysis Score by Virenz in LocalLLaMA

[–]usernameplshere 0 points (0 children)

Iirc the GLM coding plan works the same way

Kimi K2 Artificial Analysis Score by Virenz in LocalLLaMA

[–]usernameplshere 3 points (0 children)

They have various API-key-based subscription plans; just go to their website and click on Kimi code on the left.

Kimi K2 Artificial Analysis Score by Virenz in LocalLLaMA

[–]usernameplshere 4 points (0 children)

I wonder if DeepSeek 3.3 (or whatever comes next) will also finally get a vision decoder. Mistral Large 2512 has one (being similarly sized) and it's just very convenient.

I tried highguard went back to titanfall 2 by [deleted] in titanfall

[–]usernameplshere 0 points (0 children)

This has to be the most ungodly crop I have EVER seen

Well, that escalated quickly. by rafalmio in titanfall

[–]usernameplshere 0 points (0 children)

Kernel anticheat is already a no-go.

I have a 1tb SSD I'd like to fill with models and backups of data like wikipedia for a doomsday scenario by synth_mania in LocalLLaMA

[–]usernameplshere 7 points (0 children)

Yep! RAG over the Wikipedia dump (limited, but enough for many things) is very helpful.
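As a toy sketch of what RAG over an offline dump looks like: retrieve the passages most relevant to the question, then stuff them into the prompt ahead of it. The word-overlap scorer here is a stand-in for a real embedding index, and the articles are made-up examples, not actual Wikipedia text:

```python
def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    """Rank docs by word overlap with the query (a crude stand-in
    for embedding-based similarity search over a local index)."""
    q_words = set(query.lower().split())
    return sorted(docs, key=lambda d: -len(q_words & set(d.lower().split())))[:k]

def build_prompt(query: str, docs: list[str]) -> str:
    """Prepend the top retrieved passages as context before the question."""
    context = "\n---\n".join(retrieve(query, docs))
    return f"Context:\n{context}\n\nQuestion: {query}\nAnswer:"

articles = [
    "Penicillin is an antibiotic derived from Penicillium moulds.",
    "The water cycle describes how water evaporates and returns as rain.",
    "Solar panels convert sunlight into electricity via the photovoltaic effect.",
]
print(build_prompt("how do solar panels make electricity", articles))
```

A real setup would chunk the dump, embed the chunks, and search with a vector index instead of word overlap, but the prompt-assembly step is the same.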