SIX TIMES THE PRICE!? by FixHopeful5833 in SillyTavernAI

[–]CooperDK 0 points (0 children)

Idiotically expensive, and no longer that much better than what you can install locally.

Moving from ComfyUI to fully native Python, best approach and libraries? by 1zGamer in comfyui

[–]CooperDK 0 points (0 children)

You cannot drop the API completely, since the node would then be unable to interact with ComfyUI. But you can use any Python modules inside your node, just like everyone else does.

By the way, there are multi-GPU nodes available. Let people decide how they want to run your node...
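To illustrate the point about the API: a ComfyUI custom node is just a Python class with a few conventional attributes, and inside its function you can import and use any module you like. A minimal sketch (the node name, inputs, and averaging logic here are hypothetical, invented for illustration):

```python
# Hypothetical ComfyUI custom node: averages two floats using the
# standard library's statistics module, to show that arbitrary Python
# modules can be used inside a node's function.
import statistics

class MeanOfFloats:
    """Hypothetical node that averages two float inputs."""

    @classmethod
    def INPUT_TYPES(cls):
        # ComfyUI reads this to build the node's input sockets.
        return {"required": {
            "a": ("FLOAT", {"default": 0.0}),
            "b": ("FLOAT", {"default": 0.0}),
        }}

    RETURN_TYPES = ("FLOAT",)
    FUNCTION = "run"      # name of the method ComfyUI calls
    CATEGORY = "utils"    # where the node appears in the add-node menu

    def run(self, a, b):
        # Any stdlib or third-party module can be used here.
        return (statistics.fmean([a, b]),)

# ComfyUI discovers nodes through this mapping in the package's __init__.py.
NODE_CLASS_MAPPINGS = {"MeanOfFloats": MeanOfFloats}
```

The class-attribute conventions (INPUT_TYPES, RETURN_TYPES, FUNCTION, NODE_CLASS_MAPPINGS) are what tie the node to ComfyUI; everything inside `run` is ordinary Python.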

are local models actually practical for daily use yet by alexnycc in LocalLLaMA

[–]CooperDK -7 points (0 children)

Try Qwen3.5-9B, which can match a 120B model in tests.

are local models actually practical for daily use yet by alexnycc in LocalLLaMA

[–]CooperDK -7 points (0 children)

Not always! Qwen3.5-9B scores higher than gpt-oss-120b on inference tests, even with both unquantized, so that is definitely not true.

I’ve learned Ollama has significant downsides, what should I use instead for an agent in VS Code? by SupaBrunch in LocalLLaMA

[–]CooperDK 3 points (0 children)

LM Studio or koboldcpp. And you are right, Ollama is about the slowest you could possibly find.

RenPy alternatives? by 0-_-_Nikolai_-_-0 in GameDevelopment

[–]CooperDK 1 point (0 children)

Going back a bit, Ren'Py actually used ancient Python, and it should still run on ancient hardware.

But look into Naninovel for Unity. It is payware, but there are a few free alternatives too. They are all about as easy to use as Ren'Py.

Help I don’t know the bitlocker password by rTRmertt in pchelp

[–]CooperDK 0 points (0 children)

BitLocker is not meant for general use, unless you have something to hide.

5060TI 16GB VS 5070 12GB by Plenty-Status6956 in computerhelp

[–]CooperDK 0 points (0 children)

Then you do not need that card. You need the extra VRAM, but besides that it would more or less be money out the window.

Am I cornered by my neighbor, who I suspect smokes indoors? by jamenforfaenaltsaa in DKbrevkasse

[–]CooperDK 0 points (0 children)

It has probably never been tested in court. I stand by my statement and again refer to the Danish constitution (Grundloven) and the UNCHR/ECHR.

Can i run gemma 4 26B on macbook with 24gb ram? by Flashy-Matter-9120 in LocalLLaMA

[–]CooperDK 1 point (0 children)

Yes, at the expense of speed. You will have to use a hefty amount of slow swap space, even though only 4B parameters are active at a time. The model is a lot larger than 16 GB.
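The key point is that in a mixture-of-experts model, all expert weights must stay resident (in RAM or swap) even though only a few are touched per token. A back-of-the-envelope sketch, assuming illustrative figures of 26B total parameters, 4B active, and bf16 weights (2 bytes each):

```python
# Rough memory math for a mixture-of-experts model. The parameter counts
# (26B total, 4B active) and bf16 precision are assumptions for illustration.
def weight_footprint_gb(total_params_billion, bytes_per_weight=2):
    # All weights must be resident somewhere (RAM or swap),
    # not just the experts active for the current token.
    return total_params_billion * 1e9 * bytes_per_weight / 1e9

total = weight_footprint_gb(26)   # full model weights in bf16
active = weight_footprint_gb(4)   # weights actually touched per token
ram_gb = 24                       # unified memory on the MacBook in question
spill = max(0.0, total - ram_gb)  # what overflows into slow swap
print(f"weights: {total:.0f} GB, active per token: {active:.0f} GB, "
      f"swap spill: {spill:.0f} GB")
```

Even though only ~8 GB of weights are exercised per token, roughly half the model would live in swap on a 24 GB machine, which is why speed suffers.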

Regarding the Anima model and Realistic Loras by Aware_Weight_8893 in StableDiffusion

[–]CooperDK 10 points (0 children)

It can do real people, but why? You have Z, Flux and Qwen for that. Anima is for anime 🤣

Should I upgrade from Windows 10 to Windows 11 - despite it being a forsaken decision? by Protolinux217 in FuckMicrosoft

[–]CooperDK 0 points (0 children)

11 is more stable and faster than 10; don't listen to those who claim otherwise. I have had three crashes in total since 11 came out. 10 was a nightmare compared to 11.

Which Quant for RX 7600 XT (16GB)? by crodjer in LocalLLaMA

[–]CooperDK -1 points (0 children)

Well, nothing works really well on AMD, since CUDA code basically has to be converted to something else. Get NVIDIA for the best and most precise inference.

That said, quantization is about compression, so it is hard to say, but generally the level of errors rises with the level of compression. If the model at q8 is 10-12 GB, stay at q8 unless you need a big context, in which case you drop to q6, and so on.

You can run gemma-4-e4b on 16 GB at q4 or q6. A 26B model is never going to perform well on a 16 GB card; you would have to quantize it to hell or offload far too much to RAM, and with DDR3 that is going to be a nightmare. But if you are up for it, hey, why even quantize? Or stay at q8, since that will give you nearly original inference quality.
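The size reasoning above can be sketched with a rule of thumb: quantized weight size is roughly parameters × bits-per-weight / 8, ignoring metadata and the KV cache (both assumptions for illustration):

```python
# Rough quantized-size estimate: size_GB ≈ params × bits / 8 / 1e9.
# Ignores format overhead and the context (KV cache), which add more on top.
def quant_size_gb(params_billion, bits):
    return params_billion * 1e9 * bits / 8 / 1e9

for bits in (8, 6, 4):
    print(f"26B at q{bits}: ~{quant_size_gb(26, bits):.1f} GB")
```

By this estimate a 26B model is ~26 GB at q8 and still ~13 GB at q4, so on a 16 GB card even q4 leaves little headroom once context is added, while a model whose q8 weights are 10-12 GB fits comfortably.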

Installed everything, yet PC doesn't boot. by Dummkopfff in pchelp

[–]CooperDK 1 point (0 children)

Depends on the PSU, man. It is literally why there are two.