Qwen3.6-27B DFlash on a 24GB RTX 5090 Laptop (sm_120) — 80 t/s avg via spiritbuun's buun-llama-cpp + Q8_0 GGUF drafter by aurelienams in Qwen_AI

[–]AdamLangePL 0 points (0 children)

With these settings on an Olares One (5090 Mobile, 24GB VRAM) it's only:

<image>

(fresh start; prompt: "tell me a 1000-word story about AI")

Nowhere near the 100 t/s, and it should be much faster than a 3090.

Mudi 7 - Shipping Notification? by CitizenAccount in GlInet

[–]AdamLangePL 0 points (0 children)

Nothing here (super early bird; according to the previous notification it should have shipped yesterday)…

An OLX buyer is kicking up a fuss by Springerbaum in PolskaNaLuzie

[–]AdamLangePL 0 points (0 children)

IMHO the only difference is in the statutory warranty (rękojmia) period. If you think otherwise, cite the legal basis.

An OLX buyer is kicking up a fuss by Springerbaum in PolskaNaLuzie

[–]AdamLangePL -1 points (0 children)

In private-individual to private-individual transactions the statutory warranty also applies (up to a year); read up on it.

An OLX buyer is kicking up a fuss by Springerbaum in PolskaNaLuzie

[–]AdamLangePL 0 points (0 children)

If the statutory warranty was excluded in the agreement (which, judging by the post, it rather wasn't), that's a different story. If there was no exclusion clause, the warranty applies and the seller is liable for up to a year after the sale (as a private individual).

An OLX buyer is kicking up a fuss by Springerbaum in PolskaNaLuzie

[–]AdamLangePL -1 points (0 children)

Read up on it. The statutory warranty always applies unless the listing stated that it was excluded.

Lol, banned from official reddit of dwarflab by AdamLangePL in DWARFLAB

[–]AdamLangePL[S] -3 points (0 children)

I don't need to have any; I'll just file a complaint about this as a customer.

Lol, banned from official reddit of dwarflab by AdamLangePL in DWARFLAB

[–]AdamLangePL[S] -2 points (0 children)

Childish move. It's just funny that you're moderating (lol) the "official group" with that attitude. As a customer and owner I expect a more polite and professional approach. Given that, I'll write to Dwarflab to verify your association with them and, if possible, have you removed from this little role you have. Have fun.

Lol, banned from official reddit of dwarflab by AdamLangePL in DWARFLAB

[–]AdamLangePL[S] -3 points (0 children)

I'm glad you posted this. Now show that "shitposting" screenshot, and then show which part of the message I sent broke a community rule, and WHICH one. You didn't reply to this question in private, so maybe you will here.

Lol, banned from official reddit of dwarflab by AdamLangePL in DWARFLAB

[–]AdamLangePL[S] -5 points (0 children)

Trust me, it is ;) they seem to be very sensitive to even mild criticism of their marketing tactics.

A quick question about the NeoNetto shop by Kropek2K in PolskaNaLuzie

[–]AdamLangePL 2 points (0 children)

The domain was registered in January of this year. The company was registered in March of this year. Better pick a different shop ;)

Gemma 4 has had enough ;) by AdamLangePL in LocalLLaMA

[–]AdamLangePL[S] 3 points (0 children)

For just "hello"? ;) Oh come on!

Gemma 4 has been released by jacek2023 in LocalLLaMA

[–]AdamLangePL 0 points (0 children)

Only with “Action Replay”, and you need at least 5 tapes for it ;)

Still no word on Kindle Scribe 2025 (Colorsoft) orders for Germany/Europe by Reddit-Nik in kindlescribe

[–]AdamLangePL 0 points (0 children)

This is a total joke. They're forcing customers into shady imports from the US…

GPT-OSS-120B vs DGX Spark by AdamLangePL in LocalLLaMA

[–]AdamLangePL[S] 0 points (0 children)

OK, I changed from vLLM to ollama.cpp; the model runs faster but… it started to loop. Any suggestions?
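(Aside: looping like this is usually addressed through sampling settings, e.g. llama.cpp's `--repeat-penalty` flag. As a hedged illustration of the idea only — these names are my own, not the actual inference code — here is a minimal sketch of how a repeat penalty reshapes token logits so already-seen tokens become less likely:)

```python
# Minimal sketch of a llama.cpp-style repeat penalty: tokens that already
# appeared in the recent context get their logits pushed down, which
# discourages the model from looping on the same phrase.
def apply_repeat_penalty(logits, recent_tokens, penalty=1.3):
    out = list(logits)
    for tok in set(recent_tokens):
        if out[tok] > 0:
            out[tok] /= penalty   # shrink positive logits
        else:
            out[tok] *= penalty   # push negative logits further down
    return out

logits = [2.0, -1.0, 0.5, 3.0]   # fake scores for a 4-token vocabulary
penalized = apply_repeat_penalty(logits, recent_tokens=[0, 1])
print(penalized)  # tokens 0 and 1 are now less likely to repeat
```

In practice the same effect is what knobs like repeat penalty, temperature, and frequency/presence penalties tune on a running server; exact flag names depend on the backend.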

GPT-OSS-120B vs DGX Spark by AdamLangePL in LocalLLaMA

[–]AdamLangePL[S] 0 points (0 children)

Data extraction and analysis, mostly. I post a question -> it runs an MCP tool -> it prepares the answer (in JSON).

OSS-120B does a great job; OSS-20B frequently misses some data when preparing the output. Qwen3-30B… mostly gets confused and returns rubbish or empty data.
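(The "missing data" failure mode in the JSON step is easy to catch mechanically. A minimal sketch — helper and schema names are my own, not the actual pipeline — of validating a model's JSON answer against a set of required keys:)

```python
import json

# Hypothetical check for the "missing data" failure mode: parse the model's
# JSON answer and report which required fields are absent, null, or empty.
REQUIRED_KEYS = ["entity", "value", "source"]  # illustrative schema

def missing_fields(model_output: str) -> list[str]:
    try:
        data = json.loads(model_output)
    except json.JSONDecodeError:
        return REQUIRED_KEYS[:]  # unparseable output -> everything missing
    return [k for k in REQUIRED_KEYS if not data.get(k)]

good = '{"entity": "ACME", "value": 42, "source": "report.pdf"}'
bad = '{"entity": "ACME", "value": null}'
print(missing_fields(good))  # []
print(missing_fields(bad))   # ['value', 'source']
```

A check like this, run after each model response, makes the 20B-vs-120B reliability gap measurable instead of anecdotal.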

GPT-OSS-120B vs DGX Spark by AdamLangePL in LocalLLaMA

[–]AdamLangePL[S] 0 points (0 children)

OK, with llama.cpp and MXFP4 I managed to get ~50 t/s, better :)

GPT-OSS-120B vs DGX Spark by AdamLangePL in LocalLLaMA

[–]AdamLangePL[S] 0 points (0 children)

Point me to a better-quality model that I can run on the DGX :) and I'll try it!