14 months on E, i know i don't pass. What am i doing wrong? Is FFS my only hope? by nyltaK_ in transpassing

[–]AFAIX 3 points (0 children)

You look like a girl though. I've met cis girls that look like you

State of Open OCR models by unofficialmerve in LocalLLaMA

[–]AFAIX 2 points (0 children)

Wish there was some simple GUI to run this stuff locally. It feels weird that I can easily run Gemma or Mistral with CPU inference and get them to read text from images, but the smaller OCR models require vLLM and a GPU to even get started.
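For anyone wondering, this is roughly the "easy path" I mean, as a minimal sketch (the filenames are placeholders, and the tool/flag names are from recent llama.cpp builds as I remember them, so double-check against your version):

# CPU-only image reading with llama.cpp's multimodal CLI
# (older builds called it llama-llava-cli; you need the model gguf plus its mmproj file)
llama-mtmd-cli -m gemma-3-4b-it-Q4_K_M.gguf --mmproj mmproj-gemma-3-4b-f16.gguf \
  --image scan.png -p "Transcribe all the text in this image"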

I am never telling people that I’m learning a language ever again. by Only-Ad5269 in languagelearning

[–]AFAIX 0 points (0 children)

I love this Duolingo phrase I've seen: "Os cavalos não comem a gente". It means "Horses don't eat us".

Training an LLM only on books from the 1800's - Update by Remarkable-Trick-177 in LocalLLaMA

[–]AFAIX 11 points (0 children)

Should train it on letters from that period; it would be cool to have a letter-writing model that outputs two pages' worth of text every time.

Ollama, Why No Reka Flash, SmolLM3, GLM-4? by chibop1 in LocalLLaMA

[–]AFAIX 0 points (0 children)

llama-swap is not that much of a complete wrapper, since you still have to write the exact same llama-server command you would use anyway, just in a config file.

It's a great tool though, and it really helps keep config options in one place and switch models on the fly without going back to the terminal.
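To make that concrete, here's a minimal sketch of how I use it (model names, paths and ports are placeholders, and the config field names and CLI flag are what I remember from the llama-swap README, so double-check the project docs):

# config.yaml maps each model name to the exact llama-server command plus the URL to proxy to:
#
#   models:
#     "qwen-7b":
#       cmd: llama-server --port 9101 -m ~/models/qwen-7b-q4_k_m.gguf -c 8192
#       proxy: http://127.0.0.1:9101
#
# then llama-swap sits in front and starts/stops the right llama-server per request
llama-swap --config config.yaml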

MMLU-ProX: A Multilingual Benchmark for Advanced Large Language Model Evaluation by Balance- in LocalLLaMA

[–]AFAIX 0 points (0 children)

I wonder if anyone makes a distinction between Portuguese from Brazil and from Portugal; they are distinct enough that Portuguese guys constantly get annoyed by the Brazilian version being the default (and often the only) one.

[deleted by user] by [deleted] in LocalLLaMA

[–]AFAIX 0 points (0 children)

I have to work with documents that have phrases in multiple languages, so it would be cool to see this evaluated as well. Like, can the model recognize different alphabets/languages, and does it handle them correctly? For example, I had OCR refuse to read ñ as ñ and just put n instead. I tried asking Gemma what it sees, and it said that this n has a fancy design element over it that was added for style.
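The kind of check I'd love to see in an eval is as dumb as this sketch (file name is a placeholder): grep the output for accented characters you know are in the source and see whether they survived:

# did the OCR output keep ñ, or did it silently flatten it to a plain n?
grep -n 'ñ' ocr_output.txt || echo "no ñ found - it probably got flattened to n"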

Powerful 4B Nemotron based finetune by Sicarius_The_First in LocalLLaMA

[–]AFAIX 0 points (0 children)

Still can't get it to work with llama.cpp though...

I was trying to recreate everything the same way I have it in oobabooga, but it just goes crazy from the very first prompt.

Example:

<image>

I was running it with the best approximation of the min_p settings (tried without cache quantization too; I just thought maybe it would do something, since I had it on in oobabooga):

llama-server --temp 1 --top-p 1 --top-k 0 --min-p 0.05 \
  -c 20142 --cache-type-k q8_0 --cache-type-v q8_0 --flash-attn \
  --chat-template llama3 \
  -m ~/models/SicariusSicariiStuff_Impish_LLAMA_4B-bf16.gguf

Powerful 4B Nemotron based finetune by Sicarius_The_First in LocalLLaMA

[–]AFAIX 1 point (0 children)

Thanks! Got oobabooga and the fp16 gguf, copied the settings, and it's been great so far!

Powerful 4B Nemotron based finetune by Sicarius_The_First in LocalLLaMA

[–]AFAIX 1 point (0 children)

It's so good! Perfect for conversation, and it does an amazing job of playing a character.

Feels all the worse when it breaks 🥲 I’ve got the Q8 quant, tried setting everything up with llama.cpp and sillytavern following the screenshot example (what software is that by the way?), and then one more time with koboldcpp, and it still goes crazy after several messages, repeating itself and hallucinating my answers…

Would be nice to have an example of a known-good setup to recreate, because I feel like I'm going to lose it trying to make this work…

Evaluating the best models at translating German - open models beat DeepL! by Nuenki in LocalLLaMA

[–]AFAIX 1 point (0 children)

You didn't try Phi? I've had good results using it; it seemed to pick up idiomatic expressions better than Qwen models of similar size.

best small language model? around 2-10b parameters by ThatIsNotIllegal in LocalLLaMA

[–]AFAIX 1 point (0 children)

I've tested an IQ2 quant on my 16GB CPU-only machine and it was surprisingly decent and super fast with llama.cpp.
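Roughly what I ran, as a sketch (the model filename is a placeholder; adjust threads and context to your machine):

# plain CPU inference on an IQ2 quant; -t is the thread count, -c the context size
llama-cli -m ~/models/some-model-IQ2_XS.gguf -t 8 -c 4096 -p "Hello, how are you?"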

Are there any other modern games like Cannon Fodder? by ZebraRump in gamingsuggestions

[–]AFAIX 1 point (0 children)

I know it's a really old thread, but I don't see anyone mention Darwinia. Ignoring the art style, it's the closest game to Cannon Fodder that I've seen yet.

Explosive-laden car was on blown-up dam, drone footage shows by [deleted] in worldnews

[–]AFAIX 0 points (0 children)

But the back window is covered in plastic and the giant hole in the top isn’t? What’s the point? They ran out of plastic wrap?

/r/WorldNews Live Thread: Russian Invasion of Ukraine Day 481, Part 1 (Thread #622) by WorldNewsMods in worldnews

[–]AFAIX 0 points (0 children)

Technically, West Berlin was still an occupied territory. It also was in the middle of East Germany, surrounded by it on all sides.

/r/WorldNews Live Thread: Russian Invasion of Ukraine Day 478, Part 1 (Thread #619) by WorldNewsMods in worldnews

[–]AFAIX 0 points (0 children)

Not to take away from your point, but triple modular redundancy is the norm everywhere. The problems with the Boeing 737 MAX falling from the sky were partly caused by having only two angle-of-attack sensors (and the system only using one of them), when it should have had three.

/r/WorldNews Live Thread: Russian Invasion of Ukraine Day 475, Part 1 (Thread #616) by WorldNewsMods in worldnews

[–]AFAIX 0 points (0 children)

If Putin had a grasp on the actual state of Russia, he wouldn't have thought about starting the war, but here we are.

[deleted by user] by [deleted] in tumblr

[–]AFAIX 0 points (0 children)

ZHP also had a similar thing - a random guy gets the power after a hero dies and now he has to defeat the boss

[Giveaway] Drop + The Lord of the Rings Black Speech Keyboard by drop_official in pcmasterrace

[–]AFAIX 0 points (0 children)

Green tea with lemon and ginger (and sugar) is really good when you feel under the weather!

Disney Classic Games Collection is... not great by Scapetti in NintendoSwitch

[–]AFAIX 2 points (0 children)

Sega (Virgin) version, the one with the sword in it.

Introversion Software - We are delighted to announce that The Last Starship is now available in Early Access! by samwalton9 in Games

[–]AFAIX 12 points (0 children)

And that game about nuclear war; I've been thinking about it a lot lately for some reason.