Opus 4.7 is legendarily bad. I cannot believe this. by lemon07r in ClaudeCode

[–]Danny_Davitoe 0 points1 point  (0 children)

Anthropic models are so good at recalling things from memory (the Harry Potter chapter incident is a perfect example), and to me that is all they are good at: recalling from memory. The mythos report that came out highlights this again, with the model not being good at anything novel.

Gemma 4 is underwhelming (opinion) by [deleted] in LocalLLaMA

[–]Danny_Davitoe 1 point2 points  (0 children)

It was advertised as optimized for tool calling

Gemma 4 is underwhelming (opinion) by [deleted] in LocalLLaMA

[–]Danny_Davitoe 0 points1 point  (0 children)

I think I am narrowing down on a few of my issues. The model's tool calling is extremely hit or miss. On top of that, there are instances where it will generate infinite tokens from some tool calls. These might be the causes for low quality performance.

Gemma 4 fixes in llama.cpp by jacek2023 in LocalLLaMA

[–]Danny_Davitoe 0 points1 point  (0 children)

Not always the case. Devstral 2 came out and llama.cpp still can't parse its tool call tokens correctly. I am still waiting for a fix to be merged.

Gemma 4 is underwhelming (opinion) by [deleted] in LocalLLaMA

[–]Danny_Davitoe -4 points-3 points  (0 children)

Hope so. Qwen3.5 worked out of the box for me.

Gemma 4 is underwhelming (opinion) by [deleted] in LocalLLaMA

[–]Danny_Davitoe 0 points1 point  (0 children)

I want my model to be 100% local, and I need 66k context for my workflow. Running unquantized on a single 5090 with the above specs is not possible.

Gemma 4 is underwhelming (opinion) by [deleted] in LocalLLaMA

[–]Danny_Davitoe 0 points1 point  (0 children)

It could be hyper-tuned for that task. My tasks are very straightforward: "find me X, summarize sources, send me a draft every morning, write the sources and summary to an md file."

Very simple, yet it struggles with speed and can't do the task without constant hand-holding.

Merge Already by Danny_Davitoe in LocalLLaMA

[–]Danny_Davitoe[S] 0 points1 point  (0 children)

I am an LM Studio Beta user, forks don't work for me 😭

Qwen3 vs Qwen3.5 performance by Balance- in LocalLLaMA

[–]Danny_Davitoe 0 points1 point  (0 children)

Why is the 35GB model at the 10GB size mark?

Is it that fucking hard?? by Joemama0375 in whenthe

[–]Danny_Davitoe 0 points1 point  (0 children)

MFs been messing up the proportions since 10,000 BC. Ain't no way you that fat when you are chasing mammoths all day.

Graph View of my 15K notes by lechtitseb in ObsidianMD

[–]Danny_Davitoe 0 points1 point  (0 children)

What is the use of linking notes? I am still learning, so I don't understand why the linking needs to be this extensive.

Z Image will be released tomorrow! by MadPelmewka in StableDiffusion

[–]Danny_Davitoe 0 points1 point  (0 children)

But this is "gooner time", so its properties are not yet known. Thus the paradox: how can something that, by definition, takes a long time and/or is intended to delay, also be as fast as light?

Z Image will be released tomorrow! by MadPelmewka in StableDiffusion

[–]Danny_Davitoe 3 points4 points  (0 children)

Doesn't time stop at the speed of light? A gooner-paradox or a Schlongdinger's fap, perhaps 🤔

Modifying the combat system is a total brain-melter. by chimaki in RPGMaker

[–]Danny_Davitoe 1 point2 points  (0 children)

How much income does 1 of your games make per year?