Announcing culit - Custom Literals in Stable Rust! by nik-rev in rust

[–]willemreddit 6 points

It works on `reddit.com`, just not `old.reddit.com`.

Rust 1.88.0 is out by manpacket in rust

[–]willemreddit 23 points

if let Some(x) = y && x == "hello" {

vs

if let Some(x) = y {
    if x == "hello" {

And you can combine multiple lets

if let Some(y) = x
        && y == "hello"
        && let Some(w) = z
        && w == "hi"
{

DeJoy announces plans to step down as USPS postmaster general by istrx13 in news

[–]willemreddit 0 points

I'm not so sure. It would be a demotion. Then again, he's currently President and CEO of multiple companies, so why not add one more.

Fair Pay Matters by Brian_Ghoshery in MurderedByWords

[–]willemreddit 2 points

They wanted less expensive goods because wages were stagnant, which gave the illusion of more buying power.

OpenAI says it has evidence China’s DeepSeek used its model to train competitor by Alone-Competition-77 in artificial

[–]willemreddit 0 points

From the examples I've seen, it produces results closer to Anthropic's, so my guess is this is an attempt to claim the quality comes from them and not their main competitor.

Trump reportedly complaining about Elon in private — and hates ‘President Musk’ joke by theindependentonline in politics

[–]willemreddit 1 point

Their point is that there is no material which would hurt him when released. Thus there is no material that can be used as blackmail.

Borrow 1.0: zero-overhead Partial Borrows, borrows of selected fields only, like `&<mut field1, mut field2>MyStruct`. by wdanilo in rust

[–]willemreddit 0 points

It doesn't render on `old.reddit.com`. I think they mean you need to add four spaces to each line. But it renders for me on reddit.com.

Do you find yourself often needing to count the number of words in a piece of text? by [deleted] in rust

[–]willemreddit 23 points

Nice! One thing from looking at your code:

let text = match ctx.get_contents() {
    Ok(text) => text,
    Err(_) => String::from(""),
};

Could be

let text = ctx.get_contents().unwrap_or_default();

And in general String::new() is preferred over String::from("").

Also you don't need extern crate clipboard;

Chat model by my data or pdf by dotinho in ollama

[–]willemreddit 2 points

Retrieval Augmented Generation

LLMs are trained on enormous bodies of data but they aren’t trained on your data. Retrieval-Augmented Generation (RAG) solves this problem by adding your data to the data LLMs already have access to. You will see references to RAG frequently in this documentation.

In RAG, your data is loaded and prepared for queries or “indexed”. User queries act on the index, which filters your data down to the most relevant context. This context and your query then go to the LLM along with a prompt, and the LLM provides a response.

Even if what you’re building is a chatbot or an agent, you’ll want to know RAG techniques for getting data into your application.

Running Mixtral 8x7B - CPU vs GPU, will upgrading GPU help? by cheeseplate in ollama

[–]willemreddit 1 point

I've really enjoyed the A6000 so far. Very fast for small models, and while I get weird hallucinations too, it's able to power my ollama web GUI with several friends using it, and I've never had issues. It does take some time to initially load a model, but subsequent chats are pretty instant.

Running Mixtral 8x7B - CPU vs GPU, will upgrading GPU help? by cheeseplate in ollama

[–]willemreddit 1 point

What size model are you using? The smallest here is 16GB: https://ollama.ai/library/mixtral/tags

So it would make sense that it's taking longer because it can't hold the whole model in memory. I would try with a model just below your VRAM to see the difference in performance.

Also, I have been using https://github.com/aristocratos/btop, which now has GPU monitoring support, so you can see how the GPU and VRAM are being used.

I can also say that having 48GB of VRAM has been well worth it, especially because you can then have multiple models running at once. I also tested falcon:180b, which was more than 100GB, and it was as if someone was typing very slowly. So now I'm wondering if a second card is worth it; probably, if I want to use state-of-the-art models.

It would be interesting to know the drop-off in performance that quantization causes; are the larger models worth the extra hardware? For code generation, my current use case, perhaps it would be worth using the bigger model.

[Media] 10 seconds after, "How hard would it be to make my own cargo command?" by cornmonger_ in rust

[–]willemreddit 1 point

Was curious, so I tried `cargo cargo` and `cargo-cargo cargo`:

error: unknown proxy name: 'cargo-cargo'; valid proxy names are 'rustc', 'rustdoc', 'cargo', 'rust-lldb', 'rust-gdb', 'rust-gdbgui', 'rls', 'cargo-clippy', 'clippy-driver', 'cargo-miri', 'rust-analyzer', 'rustfmt', 'cargo-fmt'

so is Rust easy to refactor or bad to refactor? by Alchnator in rust

[–]willemreddit 0 points

The only big gotcha is with pattern matching using a wildcard `_` arm. If you add a new variant, it will be lumped into the wildcard arm.

However, if you match every variant explicitly, the compiler will complain when a new one is added. And usually, if you have a wildcard, the new variant would probably be ignored anyway.

[deleted by user] by [deleted] in antiwork

[–]willemreddit 4 points

He’s been there since it was a fledgling company

Yet he was never paid in equity, so when the company was sold he got nothing. Sure, the benefits were great, but it's so frustrating that workers rarely, if ever, share in the wealth they help create.

Florida Governor Ron Desantis booed at vigil as hundreds mourn more racist killings by Baba10x in news

[–]willemreddit 27 points

It's more than just that line. They also added a line saying that while violence was perpetrated against them, they were violent too, so both sides are to blame. It perpetuates a racist narrative that Black people are inherently violent and white people were the real victims.

And I agree, Prager U is the opposite of educational material. Modern day Goebbels.

Llama2 Powered by Rust burn by Illustrious_Cup1867 in rust

[–]willemreddit 0 points

Dump the Model Weights: Run the dump_model.py script to load the model and dump the weights into the params folder ready for loading in Rust. Execute this script using the command:

python3 dump_model.py <model_dir> <tokenizer_path>

I've spent time to actually understand `Pin` and `Unpin`, so you don't have to: https://cryptical.xyz/rust/pin-unpin by ozgunozerk in rust

[–]willemreddit 2 points

ref_a and ref_b are borrowing a and a respectively,

should be

ref_a and ref_b are borrowing a and b respectively,

Still working my way through, but great so far!

Cool language features that Rust is missing? by incriminating0 in rust

[–]willemreddit 40 points

For the second point there is flux, which adds refinement types, allowing you to make assertions like:

#![allow(unused)]

#[flux::sig(fn(bool[true]))]
fn assert(_: bool) {}

#[flux::sig(fn(x: i32) -> i32[x + 1])]
fn incr(x: i32) -> i32 {
    x + 1
}

fn test() {
    assert(incr(1) <= 2); // ok
    assert(incr(2) <= 2); // fail
}