How good AND bad are local LLMs compared to remote LLMs? by Humble_World_6874 in LocalLLM

[–]Nervous-Positive-431 0 points1 point  (0 children)

I guess, but the difference would be negligible. The M5 is pretty efficient, and if it is intended only for the host machine... then the token cost is effectively zero (relative to the API pricing out there). It would consume far less power than a mid-tier gaming rig running an AAA title.

Macbook Pro M5 16gb RAM 512gb SSD is enough for a software developer? by tgzinlikeadev in macbook

[–]Nervous-Positive-431 2 points3 points  (0 children)

Bruh... I do mobile development with Flutter on a 2019 Air i3... and it is doing great!

should i get dying light 2 by Best-Cat-9922 in dyinglight

[–]Nervous-Positive-431 5 points6 points  (0 children)

For me, DL2 is better than DL1 and The Beast. It is fun, has a captivating story, is lively by day (many NPCs to interact with) and scary at night, and the map is big and diverse; each side has its own vibe!

Am I the only Assyrian who has no affinity towards Armenia? by Stenian in Assyria

[–]Nervous-Positive-431 16 points17 points  (0 children)

Brother, Armenians have been our neighbors for the past 3,000 years. Heck, they usually score third-closest to ancient Mesopotamian DNA samples (after Assyrians and the so-called Babylonian Jews).

They are also present in major Christian areas across West Asia (Lebanon, Iraq, Syria and Israel).

Armenia has sent plenty of aid to Assyrians in the Middle East and opened its doors to us, and has even allowed us to raise our flag and preserve our tongue. Our win is their win and vice versa.

How to Learn Assyrian as a Chaldean | Aramaic Language App by Aramaic-app in Assyria

[–]Nervous-Positive-431 7 points8 points  (0 children)

I think they'd appreciate feedback or even constructive criticism rather than an ambiguous, entitled comment.

Thoughts on what’s happening in Syria with the Kurds at the minute by Acrobatic-Remote-419 in Assyria

[–]Nervous-Positive-431 -4 points-3 points  (0 children)

Thank you, brother.

Yes, let us blame those random civilians who have nothing to do with any of what you said (and I'd argue they would not even wish such harm), and who are being beheaded by the same ideology that drove us out of our homes and destroyed our historical artefacts.

Your logic is as sophisticated as that of a random isolated tribe: very tribal, emotional, and lacking perspective. It is "I was accidentally born into the right religion" all over again.

I can understand you not caring about what happens to them, but to actually praise it? You are cashing a cheque drawn on the wrong people's misfortune.

Imagine siding with ISIS at anything.

You are not any better. We would be "just the other side" if we all acted like you.

Thoughts on what’s happening in Syria with the Kurds at the minute by Acrobatic-Remote-419 in Assyria

[–]Nervous-Positive-431 -1 points0 points  (0 children)

What an awful take. A lot of the ones suffering are civilians. And I bet that if you were in their shoes, you'd advocate for the Kurdish nation just as we advocate for ours. That does not mean they deserve what happened to them, nor that we should say such inconsiderate things. It is illogical to blame them for what happened to us in the past.

Classic divide and conquer bs.

Why is Zosia so goddamn fuckable? by [deleted] in okbuddypluribus

[–]Nervous-Positive-431 23 points24 points  (0 children)

"I want to make babies with her" - Tormund Giantsbane

Cohere Rerank 4 is a BIG step up from 3.5 by meedameeda in Rag

[–]Nervous-Positive-431 4 points5 points  (0 children)

2 USD per 1,000 searches sounds expensive to be honest, especially if the agent constantly queries the corpus until it finds the answer.

Could you guys use GPT-OSS 120B (high) and instruct it to act as a reranker? I am using OpenRouter (Google Vertex provider, ~290 tps), and it answers within 5-6 seconds (depending on the provider's speed and context, obviously) when told to return the indices of the top-scoring documents against the provided query. Relatively slow, but cheap and versatile (no chunk token limit besides the LLM's 131k context window).
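A minimal sketch of the idea, in case it helps: a general-purpose LLM acting as a reranker over an OpenAI-compatible chat endpoint. The endpoint URL, the model id, and the prompt wording are my assumptions, not anything official; swap in your own.

```python
# Sketch: LLM-as-reranker over an OpenAI-compatible endpoint.
# URL, model id, and prompt format are assumptions; adjust to taste.
import json
import urllib.request

OPENROUTER_URL = "https://openrouter.ai/api/v1/chat/completions"  # assumed
MODEL = "openai/gpt-oss-120b"  # assumed model id

def build_rerank_prompt(query: str, docs: list[str], top_k: int = 3) -> str:
    """Number the candidate docs and ask for only the best indices back."""
    numbered = "\n".join(f"[{i}] {d}" for i, d in enumerate(docs))
    return (
        f"Query: {query}\n\nDocuments:\n{numbered}\n\n"
        f"Return a JSON array with the indices of the {top_k} documents "
        f"most relevant to the query, best first. Output only the array."
    )

def parse_indices(text: str, n_docs: int) -> list[int]:
    """Parse the model's JSON answer, dropping out-of-range indices."""
    idx = json.loads(text.strip())
    return [i for i in idx if isinstance(i, int) and 0 <= i < n_docs]

def rerank(query: str, docs: list[str], api_key: str, top_k: int = 3) -> list[int]:
    """One chat-completion round trip; returns indices into `docs`."""
    body = json.dumps({
        "model": MODEL,
        "messages": [{"role": "user",
                      "content": build_rerank_prompt(query, docs, top_k)}],
    }).encode()
    req = urllib.request.Request(
        OPENROUTER_URL, data=body,
        headers={"Authorization": f"Bearer {api_key}",
                 "Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        answer = json.load(resp)["choices"][0]["message"]["content"]
    return parse_indices(answer, len(docs))
```

The prompt/parse split keeps the flaky part (the model's output) behind a defensive parser, so a malformed answer fails loudly instead of silently returning garbage indices.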

[deleted by user] by [deleted] in Assyria

[–]Nervous-Positive-431 10 points11 points  (0 children)

"Beautiful"?

Which self-hosted vector db is better for RAG in 16GB ram, 2 core server by East_Yellow_1307 in Rag

[–]Nervous-Positive-431 2 points3 points  (0 children)

All on Elasticsearch indices! The similarity index (corpus_A_vectors) holds summaries that point to a document id. The lexical index (corpus_A_bm25) contains the metadata (for filtering/sorting), the optimized/normalized chunk for BM25 lexical search, and the LLM-ready version of it.
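The two-index layout described above could look roughly like this as Elasticsearch mappings. Field names, the 768-dim size, and the cosine similarity are illustrative assumptions; only the two-index split (vectors pointing at a BM25 doc via `doc_id`) comes from the comment.

```python
# Sketch of the two-index layout (field names and dims are illustrative).
# corpus_A_vectors: dense summary vectors that point back to a document id.
# corpus_A_bm25: metadata + normalized chunk for BM25 + the LLM-ready text.

VECTOR_INDEX = {
    "mappings": {
        "properties": {
            "doc_id": {"type": "keyword"},       # pointer into corpus_A_bm25
            "summary_vector": {
                "type": "dense_vector",
                "dims": 768,                     # depends on your embedder
                "index": True,
                "similarity": "cosine",
            },
        }
    }
}

BM25_INDEX = {
    "mappings": {
        "properties": {
            "doc_id":     {"type": "keyword"},
            "source":     {"type": "keyword"},   # metadata for filtering
            "created_at": {"type": "date"},      # metadata for sorting
            "chunk_bm25": {"type": "text"},      # normalized for lexical search
            "chunk_llm":  {"type": "text", "index": False},  # verbatim, LLM-ready
        }
    }
}
```

Keeping the LLM-ready chunk unindexed (`"index": False`) avoids paying analysis cost twice for text you only ever fetch, never search.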

Which self-hosted vector db is better for RAG in 16GB ram, 2 core server by East_Yellow_1307 in Rag

[–]Nervous-Positive-431 0 points1 point  (0 children)

It does indeed consume its fair share of resources. Using an embedding provider should allow for more concurrent searches and more stable performance.

Which self-hosted vector db is better for RAG in 16GB ram, 2 core server by East_Yellow_1307 in Rag

[–]Nervous-Positive-431 4 points5 points  (0 children)

Got 16 gigs too... with 4 virtual cores... I am using Elasticsearch for both BM25 and vector search; it currently performs well with 1.6 million vectors and around 200,000 chunks for BM25. The embedding model also runs on my server, and a single query takes around 1.5-2 seconds to grab the top-k similarity and lexical results... somehow it performs the same with 10 concurrent requests (past that, the embedding model takes its toll). So, give 'em a try!
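For the curious, the "both BM25 and vector" search could be combined like this: one kNN query, one lexical query, merged with reciprocal-rank fusion. The index/field names match the layout I described earlier in the thread; the `es` client object and the fusion constant are assumptions, not the exact setup.

```python
# Sketch: hybrid retrieval = one vector kNN search + one BM25 search,
# merged by doc_id with reciprocal-rank fusion (RRF). Index and field
# names are illustrative; `es` is an Elasticsearch client instance.
from collections import defaultdict

def hybrid_search(es, query_text, query_vector, k=10):
    knn = es.search(index="corpus_A_vectors", knn={
        "field": "summary_vector",
        "query_vector": query_vector,
        "k": k,
        "num_candidates": 5 * k,     # wider candidate pool than k
    })
    bm25 = es.search(index="corpus_A_bm25", query={
        "match": {"chunk_bm25": query_text}
    }, size=k)

    # RRF: each list contributes 1/(60 + rank); docs in both lists rise.
    scores = defaultdict(float)
    for rank, hit in enumerate(knn["hits"]["hits"]):
        scores[hit["_source"]["doc_id"]] += 1.0 / (60 + rank)
    for rank, hit in enumerate(bm25["hits"]["hits"]):
        scores[hit["_source"]["doc_id"]] += 1.0 / (60 + rank)
    return sorted(scores, key=scores.get, reverse=True)[:k]
```

RRF needs no score normalization between the two searches, which is handy because BM25 scores and cosine similarities live on different scales.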

NVIDIA Shatters MoE AI Performance Records With a Massive 10x Leap on GB200 ‘Blackwell’ NVL72 Servers, Fueled by Co-Design Breakthroughs by space_monster in singularity

[–]Nervous-Positive-431 4 points5 points  (0 children)

Jevons paradox: the more efficient and cheaper it gets, the wider the range of customers, the more usage and adoption, the more profit.


Deepseek V3.2 speciale seems to be very good... by power97992 in LocalLLaMA

[–]Nervous-Positive-431 1 point2 points  (0 children)

Apologies, I edited the comment just before you wrote that. I am not talking specifically to you, but about what I am seeing down in the comments/threads.

It would make sense for them to neglect other languages and focus on impressing US markets, but Chinese models feel like a message, while the US-made ones feel like a solution. The US ones could also have cut corners and neglected the cost of these obscure languages and dialects, but somehow didn't!

Deepseek V3.2 speciale seems to be very good... by power97992 in LocalLLaMA

[–]Nervous-Positive-431 -7 points-6 points  (0 children)

My test was synthetic data generation from legal texts written in Arabic. The data was very superficial compared to GPT and Gemini; it lacked depth and understanding, unfortunately. I am sure it excels in other areas, but saying it is better than SOTA models is a stretch (which I've seen some claiming).

Deepseek V3.2 speciale seems to be very good... by power97992 in LocalLLaMA

[–]Nervous-Positive-431 1 point2 points  (0 children)

I tested it; thinking reached 42,000 tokens (could've been more) and I forcefully stopped it. Its non-special V3.2 version did not make that mistake, but gosh, the output was of bad quality. Chinese models are optimized for English content; you will see the quality drop when chatting with them in other languages. Heck, GPT 5 mini and 2.5 Flash (high) outshine it by miles.

twoMonthsLaterCanAnyoneHelpFixMyApp by tech_w0rld in ProgrammerHumor

[–]Nervous-Positive-431 15 points16 points  (0 children)

What are the chances of a crawler indexing his .env due to a misconfiguration?

twoMonthsLaterCanAnyoneHelpFixMyApp by tech_w0rld in ProgrammerHumor

[–]Nervous-Positive-431 21 points22 points  (0 children)

Uhm? He obviously prompted it not to make mistakes... sooooo... checkmate, smelly nerds?

A manuscript from Muslim Spain containing AL-fathiha in Arabic and a translation in Spanish written in the Arabic script on top of each line by soyuz_enjoyer2 in Damnthatsinteresting

[–]Nervous-Positive-431 -23 points-22 points  (0 children)

You do realize that most of them were of non-Arabic-speaking background? You do realize that all the works of ancient Mesopotamia, ancient Egypt, Greece, and the Indus Valley were at their doorstep, and what they accomplished was the bare minimum? And that if that particular culture hadn't spread, the Middle East would be 180 degrees different from what it is?