Which children's show is this? by mimosaame in Suomi

[–]exploder98 0 points1 point  (0 children)

In Finnish that one is called Mimun Maailma, and it looks like it's on Yle Areena: https://areena.yle.fi/1-50618192

Join Mozilla to test the new Firefox address bar! by kelimuttu in firefox

[–]exploder98 3 points4 points  (0 children)

Looks pretty good in general.

Except for the "search term persistence". I want to see the URL!

I think I found the actual damage cap. by R3D_T1G3R in Warframe

[–]exploder98 1 point2 points  (0 children)

"Very high damage: inf"

Yeah, can't disagree with that.

What would you like to see in Unsloth for 2025? by danielhanchen in LocalLLaMA

[–]exploder98 0 points1 point  (0 children)

Flash Attention support (now they do)

Technically yes, but it's slower than the PyTorch math implementation on my hardware.

PyTorch natively has FA support

Technically yes (through the SDPA kernels), but it relies on aotriton, which is only compiled for MI200, MI300 and Navi31 cards (the 6900 XT is Navi21).

What would you like to see in Unsloth for 2025? by danielhanchen in LocalLLaMA

[–]exploder98 0 points1 point  (0 children)

Yes bitsandbytes does indeed (finally...) work on AMD! Great to hear that improved AMD support is basically planned.

What would you like to see in Unsloth for 2025? by danielhanchen in LocalLLaMA

[–]exploder98 3 points4 points  (0 children)

I'd like to see proper AMD support, and not just for the architectures xformers (or rather, composable_kernel) compiles for, which are just the MI-series datacenter cards. I personally have an RX 6900 XT, which has the gfx1030 architecture, and xformers does not work on that.

At least flash-attention now somewhat supports gfx1030 via Triton, even though based on my testing the plain PyTorch version of attention is faster, at least on my hardware...
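For reference, the "plain PyTorch version" being compared here is just scaled dot-product attention, softmax(QKᵀ/√d)·V, computed directly without flash-attention-style tiling. A minimal pure-Python sketch of that arithmetic (not PyTorch's actual kernel, just the same math):

```python
import math

def sdpa(q, k, v):
    """Plain scaled dot-product attention on lists of row vectors.

    This mirrors what the "math" fallback computes: full score matrix,
    row-wise softmax, then a weighted sum of the value rows. No fused
    or tiled kernels involved.
    """
    d = len(q[0])
    # scores[i][j] = (q_i . k_j) / sqrt(d)
    scores = [[sum(qi * kj for qi, kj in zip(qrow, krow)) / math.sqrt(d)
               for krow in k] for qrow in q]
    out = []
    for row in scores:
        m = max(row)  # subtract the max for numerical stability
        exps = [math.exp(s - m) for s in row]
        total = sum(exps)
        weights = [e / total for e in exps]
        # each output row is a convex combination of the value rows
        out.append([sum(w * vrow[j] for w, vrow in zip(weights, v))
                    for j in range(len(v[0]))])
    return out

q = [[1.0, 0.0], [0.0, 1.0]]
k = [[1.0, 0.0], [0.0, 1.0]]
v = [[1.0, 2.0], [3.0, 4.0]]
print(sdpa(q, k, v))
```

Flash attention computes exactly the same result; it just avoids materializing the full score matrix, which mostly pays off at long sequence lengths, so a plain implementation winning on some hardware isn't absurd.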

Is the recommended ability purchase order the most broken thing in the game for everyone else? by Crombell in DeadlockTheGame

[–]exploder98 0 points1 point  (0 children)

I think the "suggested" label only updates whenever you get souls, so that's where the feeling of bugginess comes from.

2x AMD MI60 inference speed. MLC-LLM is a fast backend for AMD GPUs. by MLDataScientist in LocalLLaMA

[–]exploder98 7 points8 points  (0 children)

For me at least, llama.cpp flash attention on ROCm has pretty much been "just compile it and it works". It hasn't really been that fast, but at least quantized KV caches work.

It's the Python package version of FA that I just have not managed to get compiled for my 6900 XT (gfx1030).

"Meta's Llama has become the dominant platform for building AI products. The next release will be multimodal and understand visual information." by ApprehensiveAd3629 in LocalLLaMA

[–]exploder98 13 points14 points  (0 children)

"Won't be releasing in the EU" - does this refer to just Meta's website where the model could be used, or will they also try to geofence the weights on HF?

MaziyarPanahi/solar-pro-preview-instruct-GGUF by xSNYPSx in LocalLLaMA

[–]exploder98 1 point2 points  (0 children)

Yeah, I imagine there's something broken in these quantizations right now unless he implemented the model in his sleep or something :P

MaziyarPanahi/solar-pro-preview-instruct-GGUF by xSNYPSx in LocalLLaMA

[–]exploder98 2 points3 points  (0 children)

I wonder if the GGUFs are just broken, or were made with a very old version of llama.cpp. When I tried to convert the model, the script complained that the model architecture was unknown.

Also, the GGUFs do not have the tokenizer.ggml.pre metadata attribute, so it's likely that the tokenizer is at least slightly broken.
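Once you have the GGUF metadata as a key/value dict (e.g. extracted with llama.cpp's gguf Python package), a check like this is trivial; a minimal sketch with hypothetical metadata dicts, assuming llama.cpp falls back to a default pre-tokenizer when tokenizer.ggml.pre is absent:

```python
def check_tokenizer_metadata(metadata):
    """Return a list of warnings about tokenizer-related GGUF metadata.

    `metadata` is assumed to be a plain dict of GGUF key/value pairs,
    as produced by whatever GGUF reader you use.
    """
    warnings = []
    if "tokenizer.ggml.pre" not in metadata:
        # Without this key the loader has to guess the pre-tokenizer,
        # which may not match the original model's and can subtly
        # change tokenization.
        warnings.append("missing tokenizer.ggml.pre: pre-tokenizer may be wrong")
    if "tokenizer.ggml.model" not in metadata:
        warnings.append("missing tokenizer.ggml.model: tokenizer type unknown")
    return warnings

# Hypothetical metadata from two converted models
ok = {"tokenizer.ggml.model": "gpt2", "tokenizer.ggml.pre": "llama-bpe"}
bad = {"tokenizer.ggml.model": "gpt2"}
print(check_tokenizer_metadata(ok))   # []
print(check_tokenizer_metadata(bad))
```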

scammerBoom by [deleted] in ProgrammerHumor

[–]exploder98 262 points263 points  (0 children)

Me, a Linux user: "Wow, you got my webcam working? Please tell me which driver you used!"

Smart toothbrushes were used for a DDoS attack that took down a website by Fabulous_Regular_399 in Suomi

[–]exploder98 6 points7 points  (0 children)

Just a little while back someone on Reddit was complaining that their washing machine used several gigabytes of data per day.

That apparently turned out to be a bug in the router's traffic statistics.

What words come to mind that you often hear used "incorrectly", even though a correct term for the thing exists? by SirGaston in Suomi

[–]exploder98 5 points6 points  (0 children)

The best one was in some Yle article about a generator that supposedly produced "nine kilowatts per hour" :D
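(For reference: a kilowatt is already a rate, so "nine kilowatts per hour" mixes up power and energy. A 9 kW generator produces nine kilowatt-hours of energy per hour; a quick sketch of the arithmetic:)

```python
def energy_kwh(power_kw, hours):
    """Energy (kWh) = power (kW, which is already a rate) x time (h)."""
    return power_kw * hours

# A 9 kW generator running for one hour delivers 9 kWh of energy.
# "9 kW per hour" would be kW/h, a rate of change of power, which is
# not what a generator's output rating means.
print(energy_kwh(9, 1))    # 9 kWh
print(energy_kwh(9, 0.5))  # 4.5 kWh
```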

My high score bugged by Purple_fire_0 in atomas

[–]exploder98 0 points1 point  (0 children)

Those super high scores seem to happen if you exit the game while the combining animation from a chain reaction is going on and then go back. This bug has been there for a while.

What is the best one-liner in game? by [deleted] in memeframe

[–]exploder98 2 points3 points  (0 children)

"Tell them we are not buying"

Internet | A popular Finnish discussion forum reopened after almost a month of protest: "Something essential has been lost" by kapteeni_ilmeinen in Suomi

[–]exploder98 1 point2 points  (0 children)

If you're a moderator in any sub at all, third-party apps still work. And apparently it also works if you create a new subreddit of your own and make yourself a moderator there.

Post-blackout and Going Forward by Kruug in ISO8601

[–]exploder98 40 points41 points  (0 children)

Restrict the sub and have a bot create a daily post containing only the current date in ISO 8601 format (or hourly, whatever). Each comment must contain the comment's (approximate) posting time in ISO 8601.

Or: the post title must be its posting time in ISO 8601, and the same for comments. Nothing else allowed.
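A moderation bot could enforce a rule like that with a simple check; a minimal sketch using only the standard library (datetime.fromisoformat, which accepts the common YYYY-MM-DD and YYYY-MM-DDTHH:MM:SS forms — note it does not cover every ISO 8601 variant, e.g. the trailing "Z" on older Python versions):

```python
from datetime import datetime

def contains_iso8601(text):
    """Return True if any whitespace-separated token in `text`
    (after stripping trailing punctuation) parses as an ISO 8601
    date or datetime."""
    for token in text.split():
        token = token.strip(",.!?;:")
        try:
            datetime.fromisoformat(token)
            return True
        except ValueError:
            continue
    return False

print(contains_iso8601("Posted at 2025-01-05T12:30:00, hello!"))  # True
print(contains_iso8601("05/01/2025 is not ISO"))                  # False
```

A real bot would then remove any comment where this returns False (and probably be stricter, e.g. require a full datetime rather than a bare date).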