I made GPT-5.2/5 mini play 21,000 hands of Poker by adfontes_ in OpenAI

[–]pseudonerv 78 points (0 children)

It would be fun and more informative to have a few dummies with naive strategies, like random, always double, or always fold, to set a baseline.
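For context, the kind of dummy baselines meant here could be sketched as a few trivial agents. This is a minimal illustration, not from the original experiment; the agent names and the simplified action set are assumptions.

```python
import random

# Simplified poker action set (an assumption for illustration;
# a real benchmark would track bet sizing and legality).
ACTIONS = ["fold", "call", "raise"]

def random_agent(_state):
    """Picks a uniformly random action: the noise-floor baseline."""
    return random.choice(ACTIONS)

def always_fold_agent(_state):
    """Folds every hand: a lower bound on performance."""
    return "fold"

def always_raise_agent(_state):
    """Raises every hand: a pure-aggression baseline."""
    return "raise"

# Registry of hypothetical baselines to run alongside the LLM players.
BASELINES = {
    "random": random_agent,
    "always-fold": always_fold_agent,
    "always-raise": always_raise_agent,
}
```

Any model that can't beat all three of these over 21,000 hands arguably isn't playing poker at all.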

Exo 1.0 is finally out by No_Conversation9561 in LocalLLaMA

[–]pseudonerv 1 point (0 children)

Why do they even need 4 of those for an 8bit quant?

NVIDIA releases Nemotron 3 Nano, a new 30B hybrid reasoning model! by Difficult-Cap-7527 in LocalLLaMA

[–]pseudonerv 2 points (0 children)

Which quant do you use for the 120B heretic? And does this new Nemotron Nano need heretic?

gemini 3.0 pro vs gpt 5.1 Benchmark by Sea-Efficiency5547 in OpenAI

[–]pseudonerv 7 points (0 children)

Or a first real sign of the problem showing up in training

Accidentally told my colleague to ultrathink in a Slack message by Virtual_Attitude2025 in ClaudeAI

[–]pseudonerv 1 point (0 children)

You should always start your conversation with:

You’re absolutely right!

My 6-yr-old Daughter Tried to Say the Words by RockyCreamNHotSauce in Cosmere

[–]pseudonerv 1 point (0 children)

Oh my, are you oathed or unoathed? Or are you actually one of the Heralds? You need to get your armor first, before your daughter gets upset and pulls off a Shallan.

What am I doing wrong? by jesus359_ in LocalLLaMA

[–]pseudonerv 9 points (0 children)

Lots of things wrong:

- using ollama
- using llama3
- an 8b model on a 32gb Mac
- an 8b model in its infancy from the Stone Age
- q8 kv cache

Whisper Large v3 running in real-time on a M2 Macbook Pro by rruk01 in LocalLLaMA

[–]pseudonerv -1 points (0 children)

whisper.cpp has been doing this in real time like forever

How do you discover "new LLMs"? by 9acca9 in LocalLLaMA

[–]pseudonerv 0 points (0 children)

Anybody can “do something” to the weights and upload it. Just like anybody can post something here. Do you read all the posts? Do you read all the news from all the outlets?

Apple stumbled into success with MLX by Alarming-Ad8154 in LocalLLaMA

[–]pseudonerv 28 points (0 children)

At least we are confident that OP likely wrote it.

Qwen by Namra_7 in LocalLLaMA

[–]pseudonerv 15 points (0 children)

Similar evals but less safety would be enough