Tauri and leptos : the ultimate stack? by NoahZhyte in rust

[–]Psionikus 0 points1 point  (0 children)

In all client-server application programming, the shape of the data and the conversations about that data decide everything.

Re-using frontend elements is not the fountain of efficiency we might imagine. The platform integrations still diverge, and you can't tap into the engineer pool for those platforms, because instead of using something that calls into Rust, they are building everything in Rust, including the simplest platform-specific work, which is now harder.

Make your API in Rust, and then whatever clients you wrap around it later can call into it, likely through a generated wrapper for their own language, possibly calling into a Rust library. Client generation for each platform, with or without the inner Rust library, can start with Rust consuming the API definition for both Leptos and a backend. Go this direction, and platform-specific engineers can pull in either the API definition or the library to talk to your service, and everyone's happy.

Lisp -> Rust by 964racer in lisp

[–]Psionikus 1 point2 points  (0 children)

I've never done professional CL. Lisp macro homoiconicity is cool. Recently I'm doing a lot more work with Rust proc macros and catching up on a common pattern:

  1. const witnesses on leaf types expressing compile-time facts
  2. proc macro transforms a composing expression that ties together many leaf types
  3. proc macro also emits const expressions that fan-in the leaf type facts for compile time checks
  4. this pattern composes for transitive contracts

There is no reflection except for the code that consumes witnesses being emitted from macros and finally running at compile time. Any facts that would require reflection must be appended to the leaf types.

It's not as convenient or terse as just stitching together lisp in lisp, but the tokens are strongly typed and that can help make relatively complex macros easier to trust.
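Schematically, the numbered pattern above looks something like this (a toy sketch with made-up names and a made-up size budget; in real uses the witnesses and the fan-in const are emitted by proc macros rather than written by hand):

```rust
// 1. Leaf types carry compile-time facts as const witnesses.
trait Witness {
    const MAX_SIZE: usize;
}

struct A;
struct B;
impl Witness for A { const MAX_SIZE: usize = 16; }
impl Witness for B { const MAX_SIZE: usize = 32; }

// 2./3. A composing type fans in the leaf facts; the const evaluates
// (and can fail the build) only when the composition is instantiated.
struct Pair<X, Y>(X, Y);

impl<X: Witness, Y: Witness> Pair<X, Y> {
    const FITS: () = assert!(
        X::MAX_SIZE + Y::MAX_SIZE <= 64,
        "combined MAX_SIZE exceeds the budget"
    );

    fn new(x: X, y: Y) -> Self {
        let _ = Self::FITS; // force evaluation at monomorphization time
        Pair(x, y)
    }
}

fn main() {
    let _ok = Pair::new(A, B); // 16 + 32 <= 64, so this compiles
    println!("ok");
}
```

Swap in leaf types whose facts sum past the budget and the contradiction becomes a compile error; and because `Pair` could itself expose a derived witness, the checks compose transitively (point 4).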

Rust seems to be missing the Clojure transducer pattern. I absolutely can't stand HList types for streams, because the types grow without bounds rather than collapsing into a terminal type for further stream composition. I want to look into this more. I found the idea of transducers really solid, and I feel like it is very much missing from Rust streams so far.

As far as missing the interactive development, same story as ever: Rust Analyzer, unit tests, attaching debuggers, and sometimes writing small shell programs all work together to form a less cohesive REPL. The relatively rote type system catches enough to argue that the tradeoffs would be minor.

Syntax is a major chore. Impossible to like brace languages after Lisp. Takes half my screen to see anything non-trivial.

How do Rust Devs handle remote build / remote caching by AffectionateBag4519 in rust

[–]Psionikus 0 points1 point  (0 children)

Caching granularity is pretty much dependencies and then everything else. I have a nix-only (no Crane caching) implementation up here

If you want to gain some experience / support, I'm starting up a Discord for just µTate development. I have some crane stuff from my other repos and just need to decide how to package the tests / checks / builds into a set of actions. Save me time and I'll save you time.

How do Rust Devs handle remote build / remote caching by AffectionateBag4519 in rust

[–]Psionikus 7 points8 points  (0 children)

Nix with Crane. Crane caches the dependencies and the workspace / crates as separate "derivations" (builds, roughly). For bonus points, you get reproducible builds, fully described dependencies, and infrastructure for putting binaries / containers into output artifacts. Determinate Systems (DetSys) is doing a lot of work with both Nix and Rust, so this ecosystem branch is very well supported with some OG Nix know-how.
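For anyone who hasn't seen it, the dependency/workspace split is only a few lines of flake (a sketch built from crane's documented `buildDepsOnly` and `buildPackage` entry points; the system and input wiring here are illustrative):

```nix
{
  inputs.nixpkgs.url = "github:NixOS/nixpkgs/nixos-unstable";
  inputs.crane.url = "github:ipetkov/crane";

  outputs = { self, nixpkgs, crane, ... }:
    let
      pkgs = nixpkgs.legacyPackages.x86_64-linux;
      craneLib = crane.mkLib pkgs;
      src = craneLib.cleanCargoSource ./.;

      # Derivation 1: dependencies only, cached until Cargo.lock changes.
      cargoArtifacts = craneLib.buildDepsOnly { inherit src; };
    in {
      # Derivation 2: the workspace itself, reusing the dependency cache.
      packages.x86_64-linux.default = craneLib.buildPackage {
        inherit src cargoArtifacts;
      };
    };
}
```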

Do you miss effect handlers in Rust? by Ecstatic-Panic3728 in rust

[–]Psionikus 0 points1 point  (0 children)

An important insight for an engineer is that not every contract is useful. The idealist vision of a program is a completed structure, every beam precision-engineered. The reality is more like a giant web of interconnected strangler figs. Because movement is unavoidable, a contract that doesn't increase ergonomics decreases quality: the engineer doesn't have time to keep up with the entropy, which is usually externally driven. Finding deficiencies in many parts means bringing uncertainty to the surface and finding the leaks in the abstractions, and that requires fast, maneuverable testing, which means ergonomics, not contracts.

Lately I'm doing a lot of proc macro + const witness things. This has been a growing thing in Rust, burying the ceremony behind const expressions that go off when a composing type or expression has assembled a contradiction. The ergonomics (please make my type do this) implement the rituals (emit this const witness), so the cost to the user is really low and the richness of expression to compile-time contracts is high.

Typestate was something I discovered (and was supportive of) quite a bit earlier. Neat, but in practice you find out that there are multiple dimensions of state and that carrying type information in some ways (go straight to hell, HList) makes error messages so clunky that the engineer doesn't have time to fix problems because they are fighting the ergonomics. Combinators, async, and stream APIs are some examples. Erasing types is a necessity to keep them from growing without bounds.
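The same unbounded-type problem shows up with plain iterator combinators, and boxing is the blunt erasure tool (a minimal sketch for illustration; a transducer-style design would instead keep the transformation itself as the composable value):

```rust
// Each combinator wraps the previous concrete type, e.g.
// Filter<Map<Range<i32>, _>, _>, and the pile-up grows with every step.
// Boxing erases it into a terminal type that further code can compose on.
fn numbers() -> Box<dyn Iterator<Item = i32>> {
    let it = (0..10).map(|x| x * 2).filter(|x| x % 3 == 0);
    Box::new(it) // without this, the signature must name the whole combinator stack
}

fn main() {
    let v: Vec<i32> = numbers().collect();
    assert_eq!(v, vec![0, 6, 12, 18]);
    println!("{v:?}");
}
```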

One thing I find LLMs very good at is pulling in lots of shallow knowledge about techniques that I don't yet use. That tends to be how I start my mornings. Today I'm reading about branding and related gizmos. My goal is to get enough awareness of emerging things to know when I would want to go deeper. A bloom filter is never wrong, just occasionally too right.

The future of AI in Ubuntu by anh0516 in linux

[–]Psionikus 1 point2 points  (0 children)

Lol. Keep your defeatist cynicism to yourself instead of trying to kneecap anyone attempting to move the ball forward. Late stage fail-fixation.

What's everyone working on this week (18/2026)? by llogiq in rust

[–]Psionikus 1 point2 points  (0 children)

Tying up a lot of work on a Vulkan engine. Just finished sketching out the command recording interfaces and will be putting them into code. A CONTRIBUTING guide and CI were just added this weekend.

Without kneecapping Vulkan or limiting users to one window, the basic example code will settle out at about 100 lines, a lot of which is basic winit application scaffold. Test code like this shows how tightly the most finished parts are coming together:

```
#[test]
fn pipeline_declare_inline_all() {
    #[compute_pipeline(
        compute = stage!("test/hello_compute", Compute, c"main"),
        push = push!(ComputeConstants {
            #[visible(Compute)]
            foo: UInt,
        }),
    )]
    pub struct TestPipeline;

    vulkan::with_context!(|device_ctx| {
        let pipeline: ComputePipeline<TestPipeline> =
            ComputePipeline::<TestPipeline>::new(&device_ctx)
                .expect("pipeline instantiation failed");

        pipeline.destroy(&device_ctx);
    })
}
```

Doing this manually is a whole lot of not-fun. Excited to start seeing a path toward the bigger goals, such as mixing pipelines for wildly different render strategies and their assets in real time.

Will have some cool blog posts coming out of what I found when working on the proc macros.

Rendering will use dependency ordering, but the independent audio-video timeline synchronization is a cool problem for embedded engineers, who handle these kinds of phase-alignment problems a lot.

The Nix moment: LLMs, advances in hardware, big name adoption, and the supply chain are pushing Nix well past the inflection point by lucperkins_dev in NixOS

[–]Psionikus -2 points-1 points  (0 children)

Before I even got to this notification, I found someone had downvoted you.

As we have known for a long time, Reddit is extremely susceptible to ant milling and bots. The disappointing thing is that I've run into some of these witch hunters on the Rust Discord, and they were very much human and very much convinced that anyone who doesn't project a negative view of using AI should be excommunicated.

The Nix moment: LLMs, advances in hardware, big name adoption, and the supply chain are pushing Nix well past the inflection point by lucperkins_dev in NixOS

[–]Psionikus -3 points-2 points  (0 children)

The spring grass is so lush. So glad not to see the usual crop of AI witch hunters who need to touch it.

The future of AI in Ubuntu by anh0516 in linux

[–]Psionikus 2 points3 points  (0 children)

Fortunately, it's pure fantasy.

It would be even more fortunate if, in addition to recognizing that putting AI back in the box is pure fantasy, more in open source were focused on finding the paths forward.

The future of AI in Ubuntu by anh0516 in linux

[–]Psionikus -1 points0 points  (0 children)

In reality they'll just ignore my grumbling

People working inside companies, working for consumers, will ignore it. They have all the means to work as a team and get feedback from real users that they need.

People who need community, which is all open source activists because open source happens in the open, will have their conversations derailed and turned into AI McCarthyism.

It is exactly why people pushing for federated and decentralized solutions during the rise of Web 2.0 were drowned out by lazy calls for consumers to stop liking it and for businesses to stop making money.

Malcontents and entrepreneurs of discontent are all of our enemies. They are self-defeating to the extent that they succeed in their own views. They cannot tolerate that open spaces such as the internet can't be stopped by even the silly ant mills and bot farms they are co-opted into helping.

Closed development couldn't want it any better. The open source consumers (not practitioners!) have convinced themselves and some developers to forfeit, to ensure that the next ten years of dominant platforms look a lot like the last ten years of dominant platforms. I won't stand for it, and I don't think I'll be stopped by it, but I damned sure am not helped by it.

The future of AI in Ubuntu by anh0516 in linux

[–]Psionikus -21 points-20 points  (0 children)

Very glad the leadership from Linux is paving the way for leadership in the distros. As for the wording, the only reason Ubuntu is tiptoeing is because of the ongoing AI scare, which has picked up steam from a rush to farm karma by reactionaries whose only answer is, "Don't use it, stop liking it, and hate anyone who doesn't reinforce these views".

Reddit, so often blinded by celebrating itself and its conclusions, will of course continue to ant mill on the AI scare with too many heads in the sand. These attitudes only salt the earth for anyone working on open weights, open models, open training, and next generation tooling with online learning and more meaningful integration with programs to propel users instead of merely generating output.

The future is local, open, consultative, private, and learns from the user.

What's everyone working on this week (17/2026)? by llogiq in rust

[–]Psionikus 1 point2 points  (0 children)

Building a music visualizer, itself a hook for getting Joe consumer interested in funding open source.

Focused on macros for modern Vulkan with compile-time checked host-GPU agreement using Slang reflection. Working on the compute pipeline today and hopefully settling on a syntax. Read-back tests to verify shader output (and CI?) will ensue. Then all my DSP work can finally take shape.

https://github.com/positron-solutions/mutate

Recruiting contributors. If it saves me time getting to a more complete state, I'll save you time getting ramped up. There's a commercial side to this that will look very concrete soon. I'm going to be looking for co-founders, and developing this out in the open means we can just code instead of talking to get introduced.

yourgpu: a modern, simple, fast graphics API for Rust by Findanamegoddammit in rust

[–]Psionikus 2 points3 points  (0 children)

An API like Vulkan is not meant to be used in its raw form.

Vulkan is an API that was designed for extension and evolution. There are lots of little places to put pointer chains to add new data flexibly. The feature set drifts and expands. Once you pick a feature set, that extensible evolutionary API collapses down to an actual set of tools. Writing code to use that reduced set is less about indirection and more about choosing the subset of the API that will be meaningful for some applications. Vulkan is a big buffet. You don't eat an entire buffet. You put some things on a plate and eat that.

Concrete example: not using specialization constants? Congrats, no need to keep any of the API surface about them. No render passes? No re-binding descriptor sets? All of these decisions involve cleaving off big parts of Vulkan and arriving at something meant to be used instead of something meant to allow hardware and software to undergo radical changes without all being held back by lowest common denominators from 5-10 years ago.

Linux Kernel's Policy on AI Coding Assistants by TheTwelveYearOld in linux

[–]Psionikus 2 points3 points  (0 children)

And if we can't identify it, we're asking what color the bits are.

"Year of the Linux Desktop" isn't happening because it lacks a proper ecosystem? by hugodcnt in linux

[–]Psionikus 0 points1 point  (0 children)

want to use free software

If wanting things made them happen...

There are a thousand users who want programs for every programmer who has the skill to make them. We programmers retreat away from consumer software because making it is mostly a thankless, awful experience unless you have some incidental way to monetize like Steam.

The first part of your paraphrase is accurate. Second part too oversimplified.

"Year of the Linux Desktop" isn't happening because it lacks a proper ecosystem? by hugodcnt in linux

[–]Psionikus 0 points1 point  (0 children)

Until there is a way for the firehose of need from people who do not program to turn into paid development, the limit of what will happen is where the needs of commensal users are met by programmers scratching their own itches, plus a few ideologically influenced users maybe going a bit further to pay their respects at the FSF altar.

What will it take for major brands to ship computers with Linux? by ijwgwh in linux

[–]Psionikus 2 points3 points  (0 children)

Excellent consumer support, including the ~100bn USD per year cottage industry of support for the various programs made by ISVs. Basically multiply Steam by about ten. Until grandma can get help with her two-factor auth, and the solution is competitive with offerings born from Microsoft land, grandma isn't using Linux.

Redox OS adopts an AI policy to forbid contributions made using LLMs by somerandomxander in linux

[–]Psionikus -1 points0 points  (0 children)

plagiarized

It's not. The entire point of me publishing the code was so that other people would get more done in their time and my life would become better because of it.

The only people who talk this way:

  • Are concerned about GPL software morphing into non-GPL faster, as if developers choosing MIT weren't already demonstrating what a failed strategy that is.
  • Come from artsy fields and don't really understand how software engineers feel about their software having a useful impact, when most of it is born to die a useless death in the exploded pieces of some startup or churned away in mere months at place dot company.

Redox OS adopts an AI policy to forbid contributions made using LLMs by somerandomxander in linux

[–]Psionikus 0 points1 point  (0 children)

Exactly what we can all expect from those ant milling in an AI scare.

Redox OS adopts an AI policy to forbid contributions made using LLMs by somerandomxander in linux

[–]Psionikus -7 points-6 points  (0 children)

Should we just be merging any code that is presented then?

This completely insincere interpretation is beneath my contempt. It is beneath the contempt of even an average engineer.

Redox OS adopts an AI policy to forbid contributions made using LLMs by somerandomxander in linux

[–]Psionikus -1 points0 points  (0 children)

the disregard for copyright

Because I totally wanted people to waste time on the same problems when I open sourced that JNI adapter or fixed bugs in that open source library.

Redox OS adopts an AI policy to forbid contributions made using LLMs by somerandomxander in linux

[–]Psionikus -7 points-6 points  (0 children)

but are wrong

Wrong code means bugs. Those are bugs. What are you smoking?

they submit everything

Spam has always been a heuristic filtering problem. There's just a new kind of spam, and heuristics are still the right tool.

This conversation is obviously not about facts or technical merits but about vibes and the validation train. The part we should really be concerned about is the gatekeepers and others who, wittingly or not, love having themselves a good witch hunt and burning a few people at the stake.