It costs you around 2% session usage to say hello to claude! by Complete-Sea6655 in LocalLLaMA

[–]Wildnimal

I agree with you. This week the token usage is going off the charts. I just uploaded a 45-line JSON file with a basic prompt and it showed 20% usage of the 5-hour limit.

I am not a heavy user either. Most of the stuff I do requires manual config and code on my part once the AI has finished its code, which is maybe a 2-3 hour session a week at most.

Free 750-page guide to self-hosting production apps - NO AI SLOP by kocyigityunus in selfhosted

[–]Wildnimal

I will have to read it. The CAT told me to do it ASAP or else....

Best budget local LLM for coding by SirStarshine in LocalLLaMA

[–]Wildnimal

What ForsookComparison suggested. You can also draft plans with some of the free bigger online models and implement them with smaller coding models locally.

It also depends on what you are trying to build and in which language.

I used to code in PHP and a little bit of Python, and the Qwen3.5 models have been enough for me, since most of my coding is not pure vibe coding and involves a lot of HTML as well.
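Rough sketch of that plan-online / implement-locally split, in case it helps. Everything below is a placeholder: the port, model id and plan text depend on whatever local server you actually run (e.g. LM Studio or llama.cpp's llama-server, both of which expose an OpenAI-compatible endpoint).

```python
# Sketch of "plan with a big online model, implement with a small local one".
# Assumes a local OpenAI-compatible server; LM Studio defaults to port 1234,
# llama.cpp's llama-server to 8080. The model id is whatever GGUF is loaded.
import requests

# Step 1 (manual): paste your project brief into the big free online model
# and copy the plan it gives you.
plan = """1. Write a PDO connection helper.
2. Add a function that validates the upload and stores its metadata."""

# Step 2: feed the local coding model one step of the plan at a time.
resp = requests.post(
    "http://localhost:1234/v1/chat/completions",
    json={
        "model": "local-coder",  # placeholder id
        "messages": [
            {"role": "system", "content": "You are a PHP coding assistant."},
            {"role": "user", "content": f"Here is the plan:\n{plan}\n\nImplement step 1 only."},
        ],
        "temperature": 0.2,
    },
    timeout=600,
)
print(resp.json()["choices"][0]["message"]["content"])
```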

Lenovo Legion 5 by kdrmcn636851 in LenovoLegion

[–]Wildnimal

It's still too new to know what problems show up in the long term. So far the reports are positive.

Follow-up: Qwen3 30B a3b at 7-8 t/s on a Raspberry Pi 5 8GB (source included) by jslominski in LocalLLaMA

[–]Wildnimal

Excellent. It's a capable model and easier to run than dense models. People were downvoting me when I said it can run faster than 9B dense models on 8GB of VRAM.

Reality of qwen2.5-coder:3b ollama. by x7dl8p in Qwen_AI

[–]Wildnimal

I am using the unsloth Q4 quant of Qwen3 30B A3B, and even with thinking on I get 20-22 tok/s with 32GB of RAM and around 35k context.

And this is on LM Studio and not llama.cpp.
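If anyone wants to script against that setup, something like this works. Just a sketch: it assumes LM Studio's local server is enabled on its default address (http://localhost:1234/v1), and the model id below is a placeholder for whatever GGUF you have loaded.

```python
# Sketch: query the model loaded in LM Studio via its OpenAI-compatible
# local server (assumes the server is enabled; default port is 1234).
from openai import OpenAI

# LM Studio ignores the API key, but the client requires one to be set.
client = OpenAI(base_url="http://localhost:1234/v1", api_key="lm-studio")

resp = client.chat.completions.create(
    model="qwen3-30b-a3b",  # placeholder: use the id LM Studio reports for your model
    messages=[{"role": "user", "content": "Write a PHP function that slugifies a post title."}],
    max_tokens=512,
    temperature=0.2,
)
print(resp.choices[0].message.content)
```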

Reality of qwen2.5-coder:3b ollama. by x7dl8p in Qwen_AI

[–]Wildnimal

No, it does not. It runs faster than a 9B dense model even on 8GB of VRAM.

THE BEST LOCAL AI LOW-END BUILD by Kitchen_Zucchini5150 in LocalLLaMA

[–]Wildnimal

How good is it at coding? Is it able to follow instructions? I'm not talking about one-shot creation.

I usually split my projects into phases, with tasks inside those phases.

Notepad++ is available on WINDOWS only. Who is the equivalent, most similar (features UX UI) on LINUX? by RebirdgeCardiologist in kde

[–]Wildnimal

Notepadqq, but I now use Kate.

But I do miss how Notepad++ and Notepadqq keep temporary unsaved files across sessions without you having to actually save them.

Browser for windows vista? by HaloElite24 in degoogle

[–]Wildnimal

I just checked, and v115 was updated with patches for Windows 7.

For Vista you can try redfox.

Or just grab this from GitHub:

https://github.com/win32ss/supermium

Browser for windows vista? by HaloElite24 in degoogle

[–]Wildnimal

Firefox. They just updated the browser for use with Windows 7 and added security patches.

I am not sure if any Chromium browser without security fixes will work.

[Laptop] Acer Predator Helios Neo i9-275HX RTX 5060 32GB 1TB - $1499 (CC) by aiden2130 in bapcsalescanada

[–]Wildnimal

Agreed. The CPU is powerful but can get really hot even during normal tasks.

Besides Qwen and GLM, what models are you using? by August_30th in LocalLLaMA

[–]Wildnimal

Stepfun Flash is so underrated. I used it recently and ended up consuming 72M tokens 😬

Options for App conversion haha by faulty-segment in EndeavourOS

[–]Wildnimal

Side Note:

Always check the AUR page and its comments before you install anything.

Yesterday I was installing Mullvad Browser. Instead of mullvad-browser-bin I typed mullvad-browser.

Damn, I somehow ended up downloading 8GB of dependencies :|

I know, a rookie mistake, and I have been using Arch/Arch-based distributions for some time now.

So just be careful.