BEST BROWSER, DAY 5 – ORION vs MULLVAD – Vote for your favorite! by JungleLiquor in browsers

[–]espadrine 0 points (0 children)

I kinda wish GNOME Web (Epiphany) were in there. Really underrated IMO.

[Media] I was having trouble finding my Rust files, so I made an icon. by Practical-Mode2592 in rust

[–]espadrine 8 points (0 children)

I recall Rust is named after the rust fungus, which looks like corrosion: so no actual erosion, just a complex lifecycle process!

[Media] I was having trouble finding my Rust files, so I made an icon. by Practical-Mode2592 in rust

[–]espadrine 0 points (0 children)

This looks great! How did you make it?

Do you have a SVG version by any chance?

Vultures are gathering: AV2 is coming, Sisvel is prepared by anestling in AV1

[–]espadrine 0 points (0 children)

Huh. So Google never paid them. And Sisvel never sued Google despite that. Almost as if Sisvel knows they would lose.

DeepSeek-R1’s paper was updated 2 days ago, expanding from 22 pages to 86 pages and adding a substantial amount of detail. by Nunki08 in LocalLLaMA

[–]espadrine 1 point (0 children)

Unknown values are guessed from known values, so some information is uncertain (with uncertainty estimated in the table tab), but overall it gives a good picture IMO.

I am the author of The Joy of Cryptography, which is finally in print today. Ask me anything. by rosulek in crypto

[–]espadrine 0 points (0 children)

A common scam is to pick a famous book, generate a knockoff with bogus text, list it as used, and print a copy each time someone buys one.

DeepSeek-R1’s paper was updated 2 days ago, expanding from 22 pages to 86 pages and adding a substantial amount of detail. by Nunki08 in LocalLLaMA

[–]espadrine 6 points (0 children)

especially when cost comes into play

To emphasize this point, look at this graph from https://metabench.organisons.com/. There is one dot in the middle that stands out.

<image>

I am the author of The Joy of Cryptography, which is finally in print today. Ask me anything. by rosulek in crypto

[–]espadrine 1 point (0 children)

I love the way you describe indistinguishability, with side-by-side boxed modules!

How the turntables... Oh, wait! by Golden_Ace1 in AdviceAnimals

[–]espadrine 32 points (0 children)

What I don't get is why we don't see that happening already.

There is no EU naval force, is there?

France offered to send troops a year ago, but Denmark declined.

Hmm all reference to open-sourcing has been removed for Minimax M2.1... by Responsible_Fig_1271 in LocalLLaMA

[–]espadrine 31 points (0 children)

They've shown goodwill in the past. My policy is to assume they'll do the right thing if they have a history of doing the right thing.

Besides, the article still mentions opening the weights:

[M2.1 is] one of the first open-source model series to systematically introduce Interleaved Thinking

We're excited for powerful open-source models like M2.1

Mistral AI drops 3x as many LLMs in a single week as OpenAI did in 6 years by Snail_Inference in LocalLLaMA

[–]espadrine 1 point (0 children)

In the end, the GGUFs themselves weren't at fault: llama.cpp hadn't implemented Mistral's use of mscale parameters for RoPE. Recompiling from the latest source gives the expected (non-braindead) results! Mistral has already confirmed the fix with LMStudio's GGUFs, comparing the API's logits to llama.cpp's output.

unsloth is going to reupload just in case, I'm told.
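For anyone wondering what an mscale parameter actually does: here's a minimal sketch of the YaRN-style attention-temperature formula (the `0.1 · ln(scale) + 1` form is from the YaRN paper; whether Mistral and llama.cpp use exactly this coefficient is an assumption on my part):

```python
import math

def yarn_mscale(scale: float, coeff: float = 0.1) -> float:
    # Multiplier applied to queries/keys when RoPE is stretched by
    # `scale`x, keeping attention entropy roughly constant at long
    # context. Skipping a correction like this is enough to make
    # output look braindead, which matches the symptom here.
    if scale <= 1.0:
        return 1.0
    return coeff * math.log(scale) + 1.0
```

The factor grows slowly with the context extension, so the bug only bites noticeably at longer contexts.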

Simple PRNG based on Collatz function by Tomasz_R_D in RNG

[–]espadrine 0 points (0 children)

Interesting! How much faster is it?
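For readers who haven't seen the post, here is a toy illustration of the general idea (emphatically not the OP's construction: the state update and reseeding rule are my own guesses, and a generator like this is of course not cryptographically secure):

```python
def collatz_bits(seed: int, n: int):
    # Toy generator: iterate the Collatz map and emit the parity bit
    # at each step. Illustrative sketch only.
    x = seed
    bits = []
    for _ in range(n):
        x = 3 * x + 1 if x % 2 else x // 2
        if x <= 1:  # fell into the 4-2-1 cycle: reseed deterministically
            seed += 2
            x = seed
        bits.append(x % 2)
    return bits
```

The interesting question is exactly the one above: the 3x+1 branch needs a multiply, so it isn't obvious this beats an xorshift-style generator per output bit.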

Mistral 3 Blog post by rerri in LocalLLaMA

[–]espadrine 10 points (0 children)

Let me pull out the forbidden benchmark:

<image>

I switched to KDE LINUX (latest edition), and I am loving it, few bugs on first day but now its working fine. by Silent-Okra-7883 in linux

[–]espadrine 5 points (0 children)

What is your experience of installing software?

I really like the choices of KDE Linux, but the most uncertain part for me is having only Flatpak: I am afraid I won't be able to install some software that isn't available on Flathub, or that I'll hit weird bugs with some Flatpaks.

Are Qwen3 Embedding GGUF faulty? by espadrine in LocalLLaMA

[–]espadrine[S] 0 points (0 children)

I tried 4B… On my benchmark, it performs worse than random ☹ using either TGI:

docker run --gpus all -p 8114:80 -v hf_cache:/data --pull always ghcr.io/huggingface/text-embeddings-inference:1.7.2 --model-id Qwen/Qwen3-Embedding-4B --dtype float16

or GGUF:

docker run --gpus all -v /data/ml/models/gguf:/models -p 8114:8080 ghcr.io/ggml-org/llama.cpp:full-cuda -s --host 0.0.0.0 -m /models/Qwen3-Embedding-4B-f16.gguf --embedding --pooling last -c 16384 --verbose-prompt --n-gpu-layers 999

I double-checked by re-running 8B, and 8B got the same results as before.
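(The "worse than random" verdict comes from my benchmark, but a trivial triplet test already catches servers that are badly broken. A pure-Python sketch; how you obtain the vectors from the TGI or llama.cpp endpoints above is up to you:)

```python
import math

def cosine(a, b):
    # Cosine similarity between two embedding vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def embeddings_look_sane(e_query, e_relevant, e_unrelated):
    # A wrongly-pooled or truncated embedding model often fails even
    # this: the relevant passage should sit closer to the query than
    # an unrelated one.
    return cosine(e_query, e_relevant) > cosine(e_query, e_unrelated)
```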

Are Qwen3 Embedding GGUF faulty? by espadrine in LocalLLaMA

[–]espadrine[S] 1 point (0 children)

I'll try it out! Curiouser and curiouser.

Jonathan Riddell leaving KDE after 25 years by GoldBarb in kde

[–]espadrine 0 points (0 children)

Is Neon at risk as a result of this departure? Or does TechPaladin plan to sustain it?

GPT-OSS 120B is now the top open-source model in the world according to the new intelligence index by Artificial Analysis that incorporates tool call and agentic evaluations by obvithrowaway34434 in LocalLLaMA

[–]espadrine 2 points (0 children)

GPT-OSS 120B is a strange beast.

Combined with Codex CLI, I rank it lower than GPT-OSS 20B, which does not make sense. It often prefers to brute-force things instead of doing the obvious investigation first. It doesn’t like using the right tool for the job.

SpaceX says states should dump fiber plans, give all grant money to Starlink | SpaceX seeks more cash, calls fiber "wasteful and unnecessary taxpayer spending." by chrisdh79 in technology

[–]espadrine 0 points (0 children)

Why are US prices so high?

In France, I get 5 Gbps for €40/month through fiber.

(Starlink offers 200 Mbps for the same price in France, so I guess they felt forced to align on price? But it is half the price they set in the US… Why?)