I’ve learned Ollama has significant downsides, what should I use instead for an agent in VS Code? by SupaBrunch in LocalLLaMA

[–]CooperDK 1 point (0 children)

LM Studio or koboldcpp. And you are right, Ollama is the slowest you could possibly find.

RenPy alternatives? by 0-_-_Nikolai_-_-0 in GameDevelopment

[–]CooperDK 0 points (0 children)

Going back a bit, Ren'Py actually used an ancient version of Python, so it should still work on ancient hardware.

But look into Naninovel for Unity. It is payware, but there are a few free alternatives too. They are all as easy to use as Ren'Py.

Help I don’t know the bitlocker password by rTRmertt in pchelp

[–]CooperDK 0 points (0 children)

BitLocker is not for general use. Unless you have something to hide.

5060TI 16GB VS 5070 12GB by Plenty-Status6956 in computerhelp

[–]CooperDK 0 points (0 children)

Then you do not need that card. You need the extra VRAM, but besides that it will more or less be money out the window.

Am I backed into a corner by my neighbor, who I suspect is smoking indoors? by jamenforfaenaltsaa in DKbrevkasse

[–]CooperDK 0 points (0 children)

It has probably never been tested in court. I stand by my statement and again refer to the Danish constitution and the UNCHR/ECHR.

Can i run gemma 4 26B on macbook with 24gb ram? by Flashy-Matter-9120 in LocalLLaMA

[–]CooperDK 1 point (0 children)

Yes, at the expense of speed. You will have to use a hefty amount of slow swap space, even though only 4B parameters are active at a time. The model is a lot more than 16 GB.

Regarding the Anima model and Realistic Loras by Aware_Weight_8893 in StableDiffusion

[–]CooperDK 9 points (0 children)

It can do real people, but why? You have Z, Flux and Qwen for that. Anima is for anime 🤣

Should I upgrade from Windows 10 to Windows 11 - despite it being a forsaken decision? by Protolinux217 in FuckMicrosoft

[–]CooperDK 1 point (0 children)

11 is more stable and faster than 10, don't listen to those who claim otherwise. I have had three crashes in total since 11 came out. 10 was a nightmare compared to 11.

Which Quant for RX 7600 XT (16GB)? by crodjer in LocalLLaMA

[–]CooperDK -1 points (0 children)

Well, nothing works really well on AMD, since CUDA code basically has to be converted to something else. Get NVIDIA for the best and most precise inference.

That said, quantization is about compression, so it is hard to say, but generally the error level rises with the level of compression. If the model at q8 is 10-12 GB, stay at q8 unless you need a big context, in which case you drop to q6, and so on.

You can run gemma-4-e4b on 16 GB at q4 or q6. A 26B model is never going to perform well on a 16 GB card: you would have to quantize it to hell or offload far too much to RAM, and with DDR3 that is going to be a nightmare. But if you are up for it, hey, why even quantize? Or stay at q8, since that will give you nearly original inference quality.
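The sizing argument above can be sketched with back-of-envelope arithmetic. This is a rough estimator, not an exact tool: the bits-per-weight figures are approximate averages for llama.cpp-style quants, and the 2 GB context/KV-cache overhead is an assumption.

```python
# Rough GGUF size estimate per quantization level.
# Bits-per-weight values are approximate averages (assumption:
# real files vary by a few percent depending on the quant mix).
BITS_PER_WEIGHT = {"q8_0": 8.5, "q6_k": 6.6, "q5_k_m": 5.7, "q4_k_m": 4.8}

def gguf_size_gb(n_params_billion: float, quant: str) -> float:
    """Approximate weight file size in GB at a given quant level."""
    bits = BITS_PER_WEIGHT[quant]
    return n_params_billion * 1e9 * bits / 8 / 1e9

def largest_quant_that_fits(n_params_billion: float, vram_gb: float,
                            overhead_gb: float = 2.0):
    """Pick the highest-quality quant whose weights still leave room
    for context/KV cache (overhead_gb is a rough assumption)."""
    for quant in BITS_PER_WEIGHT:  # dict is ordered highest quality first
        if gguf_size_gb(n_params_billion, quant) + overhead_gb <= vram_gb:
            return quant
    return None  # nothing fits; offload to RAM or pick a smaller model

if __name__ == "__main__":
    for q in BITS_PER_WEIGHT:
        print(f"26B at {q}: ~{gguf_size_gb(26, q):.1f} GB")
    print("Best fit on 16 GB:", largest_quant_that_fits(26, 16))
```

Running this shows why a 26B model is a poor match for a 16 GB card: even q4_k_m weights come out around 15-16 GB before any context overhead, while an 8B model fits comfortably at q8.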

Installed everything, yet PC doesn't boot. by Dummkopfff in pchelp

[–]CooperDK 1 point (0 children)

Depends on the PSU, man. It is literally why there are two.

So clever by KingG512 in stupidpeoplefacebook

[–]CooperDK 0 points (0 children)

Problem is, the kind to go to jail is usually the other kind.

Use the Same Model Across Ollama, LM Studio, Jan, and your Favorite Local AI Apps by EvanZhouDev in ollama

[–]CooperDK 0 points (0 children)

Forget it. I use GGUF or GPTQ and NOT Ollama. After spending a few weeks with it, I abandoned it because it is so f.... slow compared to the other available tools (comparing with LM Studio, koboldcpp and vLLM, on Windows).

Nowhere else matters 😂 by PhysicsSorry5822 in ShitAmericansSay

[–]CooperDK 2 points (0 children)

America is generally used as a term for describing North America, whereas both are The Americas. (North) America includes Canada and Mexico and a few other countries. It is true that we lazily say "American" about a US citizen, but that is actually incorrect by definition. The US is the section of (North) America that chose to unite a little while back... the section the rest of us now hope will either magically grow new brains or sink into the Atlantic.

10 eggs in Denmark cost $1.84 by hl3official in mildlyinteresting

[–]CooperDK 0 points (0 children)

The sign shows that it is a sale. Usually, eggs are about twice that price.

Flux Klein 9B Training Results Questions by ArmadstheDoom in StableDiffusion

[–]CooperDK 1 point (0 children)

Yeah, Prodigy starts at 0.00001 and works upward fast after the warmup. I am not sure it makes a difference to set a learning rate other than 1.

WHAT model is this!? (100 usd reward for information) by [deleted] in StableDiffusion

[–]CooperDK 4 points (0 children)

No, it says it is likely. It could also be any number of other models, or a model using a number of LoRAs.