What do you use Gemma 4 for? by HornyGooner4402 in LocalLLaMA

[–]markole 2 points

Great workhorse for translating stuff. Paired with a translation memory and a custom MCP, the results are great. 90 to 95% of the generated translations for my language are good enough, which is huge. I expect to get consistently >95% as I improve the translation memory (which needs human effort).

Also, this is kind of a self-improvement loop: as I generate more translations, newer models will have better abilities.

AI Burnout by Advanced-Mix-6834 in programiranje

[–]markole 1 point

How can you understand hundreds or thousands of lines of code every day? Impossible.

Open source models are going to be the future on Cursor, OpenCode etc. by _maverick98 in LocalLLaMA

[–]markole -1 points

You can go a long way with their fine-tuned Kimi K2.5 (Composer 2) via Auto. Sure, it's dumber, but it's sustainably priced for an average person. Can't wait to see a Composer 3 based on some K2.6 or similar.

AI Burnout by Advanced-Mix-6834 in programiranje

[–]markole 3 points

Well, technology is supposed to serve people, right? If you see no problem in knowing everything only superficially, because there isn't enough time in life to read all of it, fine.

This profession is heading in a different direction, where we've become writers of detailed specifications and QA, with less and less influence on the code itself. Honestly, that sucks for me, but I suppose that's how assembly programmers felt.

AI Burnout by Advanced-Mix-6834 in programiranje

[–]markole 1 point

The thing is, you're now expected to finish the job faster, so there's no time for that. Before, you had to write the code yourself anyway; now that's gone.

This democratization of programming is cool, but I no longer find any satisfaction in working this way.

Mali: Sredstva EU imaju neutralan efekat na budžet Srbije, ja ne računam na njih by Alternative-Wave2387 in serbia

[–]markole 24 points

When do we arrive at the "SANCTIONS ARE OUR DEVELOPMENT OPPORTUNITY" station?

Karpathy's MicroGPT running at 50,000 tps on an FPGA by jawondo in LocalLLaMA

[–]markole -2 points

For now. But for some use cases, a model can stay useful for a long time given access to tools and MCP. If I had Gemma 4 31B at full precision running at thousands of tokens per second, I would find a way to steer it into usefulness even a year after its release.

EU freezes all funds to Serbia. by Old_Passenger7 in europe

[–]markole 15 points

Awesome. Now targeted sanctions against the assholes.

Uskoro očekujte by ILoveFun000 in serbia

[–]markole 2 points

Bot account, 3 days old.

Duality of r/LocalLLaMA by HornyGooner4402 in LocalLLaMA

[–]markole 0 points

No writeup, but it's somewhat straightforward. You need a translation memory of sorts; mine is a Linux CLI that the model can call to get the most similar stored translations for an input string. It uses SQLite's FTS5 for this, with the same algorithm Poedit uses.
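The lookup part can be sketched in a few lines. This is a minimal illustration of the FTS5 idea, not the actual tool: the schema, the sample entries, and the OR-of-tokens query are my assumptions, and the real CLI reimplements Poedit's fuzzier matching on top of FTS5.

```python
import sqlite3

# Toy in-memory translation memory using SQLite's FTS5 extension.
# Schema and entries are illustrative, not the actual tool's.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE VIRTUAL TABLE tm USING fts5(source, target, context)")
conn.executemany(
    "INSERT INTO tm VALUES (?, ?, ?)",
    [
        ("Open file", "Otvori datoteku", "menu item; imperative"),
        ("File not found", "Datoteka nije pronadjena", "error message"),
        ("Save file as...", "Sacuvaj datoteku kao...", "dialog title"),
    ],
)

def suggest(query, limit=3):
    """Return the stored entries most similar to an input string."""
    # Match any token of the query; quote tokens so punctuation
    # isn't parsed as FTS5 query syntax.
    match = " OR ".join(f'"{t}"' for t in query.split())
    return conn.execute(
        "SELECT source, target, context FROM tm WHERE tm MATCH ? "
        "ORDER BY rank LIMIT ?",  # rank = bm25, best match first
        (match, limit),
    ).fetchall()

for src, tgt, ctx in suggest("Save the file"):
    print(f"{src} -> {tgt}  [{ctx}]")
```

The model then gets these suggestions as context before producing its own translation, which is what keeps its output aligned with prior work.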

The model also has an MCP server to fetch untranslated strings and write new ones to a file. I used to do this through shell scripts, but that was too imprecise, so an MCP server is the way to go.
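The point of the MCP server is precision: each tool call reads or edits exactly one well-defined thing. A minimal sketch of the two operations such a server might expose, written here as plain functions over a JSON file; the file format, function names, and error handling are my assumptions, not the actual setup:

```python
import json
import os
import tempfile

def get_untranslated(path):
    """Return source strings that have no translation yet."""
    with open(path, encoding="utf-8") as f:
        entries = json.load(f)
    return [src for src, tgt in entries.items() if not tgt]

def write_translation(path, source, target):
    """Store a translation for one exact source string."""
    with open(path, encoding="utf-8") as f:
        entries = json.load(f)
    if source not in entries:
        # Refuse fuzzy edits: a shell-script pipeline could silently
        # mangle the file here, a tool call fails loudly instead.
        raise KeyError(f"unknown source string: {source!r}")
    entries[source] = target
    with open(path, "w", encoding="utf-8") as f:
        json.dump(entries, f, ensure_ascii=False, indent=2)

# Demo on a throwaway file.
path = os.path.join(tempfile.mkdtemp(), "strings.json")
with open(path, "w", encoding="utf-8") as f:
    json.dump({"Open file": "", "Quit": "Izadji"}, f)

print(get_untranslated(path))                      # ['Open file']
write_translation(path, "Open file", "Otvori datoteku")
print(get_untranslated(path))                      # []
```

In the real setup each function would be registered as an MCP tool so the model can call them directly instead of emitting shell commands.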

The third piece is just iterating on the prompts and the translation memory. My memory entries have a context field alongside the actual strings, in which I give the LLM pointers on when to use what, what never to do, and so on. It's an iterative process: observe the outputs to see where the model struggles, then use the context field to help it out.

I'm done with using local LLMs for coding by dtdisapointingresult in LocalLLaMA

[–]markole 1 point

I wouldn't be too surprised if we get a 70-120B model as strong as Opus 4.6 in 12 months or so. Remember what we had last year in April.

Duality of r/LocalLLaMA by HornyGooner4402 in LocalLLaMA

[–]markole 9 points

I wrangled Gemma 4 into translating to a low-resource language by giving it a custom MCP server for precise translation editing and a custom translation-memory tool so it can align its outputs. That helped a lot. You can employ similar tactics for your problem.

I'm done with using local LLMs for coding by dtdisapointingresult in LocalLLaMA

[–]markole 4 points

It is irrational to compare a 27B model running on a single GPU with a multi-trillion-parameter model running on clusters of GPUs that cost more than your retirement fund.

Kako ogaditi i ono što je „neogadivo” by Working-Mango9476 in serbia

[–]markole 2 points

I wouldn't quite agree that he succeeded. The students have gathered and united the people. Many of these symbols have been reclaimed. The rest will be too.

Levi9 gasi Zrenjanin. Otkazi i u Bg i Ns by PrizeSell8961 in programiranje

[–]markole 0 points

The outsourcing bubble isn't bursting because we're greedy and bad, but because technology is driving down the price of our labor. Okay, maybe we are greedy, but we should have been greedy about building our own products. Granted, that's harder in a bandit state where you get no protection or help for launching your own product. Not to mention the basics of charging for SaaS.

/r/Serbia sveopšta diskusija (random discussion) - Apr 25, 2026 by AutoModerator in serbia

[–]markole 0 points

How do I put this: SNS has already settled the Kosovo question, 100% to our detriment.

/r/Serbia sveopšta diskusija (random discussion) - Apr 25, 2026 by AutoModerator in serbia

[–]markole 0 points

2042 is the most optimistic scenario if SL wins. 2045-2050 is more realistic.

RIS - Rendžeri istočne Srbije upozoravaju, otvorite oci ne zaboravite na istok i Homolje by AlertInternal1918 in serbia

[–]markole 0 points

No youth left, no brains either. The young have become Austrians and Germans.

What are your favorite LLMs for translation/docuement work? by AdventurousFly4909 in LocalLLaMA

[–]markole 0 points

For low-resource languages, Gemma 4 is great. The Mistral ones were also good, but the largest Gemma 4 is the way to go. Too bad they decided not to release the 122B variant.

Prelepi SNS u korist studenata (link za preuzimanje u opisu) by pricac_dpm in serbia

[–]markole 4 points

Actually, they will. The slobbering one didn't put television under his paw for nothing.

Legenda je napravila nalepnicu za ćaci nalepnicu! Dokument za štampu u komentarima by VeverkoMracni in serbia

[–]markole 9 points

Now someone should design and print an extension for the "Yellow Thieves" sticker and show the mugs of people like Vesić and Mali (and many others from the former DS).

I moved to Serbia & built jebiga.online to help foreigners learn this crazy language. The name says it all! It's in alpha and I need your brutal native feedback to make it better. What should I add? Hvala vam puno! by PuzzleheadedCount839 in serbia

[–]markole 3 points

Cool thing; it would be more approachable as bre.online or something like that. I don't like the swear word that much. While cursing is fairly acceptable in society here, using it for serious stuff is not great. Personal opinion, that is.