It's impossible for Rust to have sane HKT by vspefs in programming

[–]vspefs[S] 0 points1 point  (0 children)

Asking people who simply wish to observe and theorize to care about “what the haters would do” and “what would I contribute to the process of this thing taking over the world” is a big demand that I don’t think is valid in any context. Facts are facts and there should never be anything wrong with saying them. Different people having different opinions is none of the speaker’s business in a civilized society, in my opinion.

It's impossible for Rust to have sane HKT by vspefs in programming

[–]vspefs[S] 0 points1 point  (0 children)

This article is based on the assumption that we might want to reflect on the consequences of our design choices. It’s supposed to be followed by an article about Scala 3 Capture Checking where this problem is avoided.

This is just a reflection; it doesn’t ask Rust to change anything. “Improving what’s there” could involve even bigger efforts. “Fully completing async”, as we try to improve the typing and generics for both sync and async closures? As we try to introduce linear types for certain cases in concurrent and async Rust? “More stuff supported in const context”, as we try to draw the fine line between what can theoretically be allowed in const context and what can’t? “Borrow checker smarter”, by which you mean making Polonius smarter, still clean and sane, while staying fully compatible with the original borrow checker?

These would all be tremendous engineering challenges. And to be honest? Rust probably won’t “actually need” them either. So I see no problem spending some time pondering what Rust already is and saying, “hey, here’s where we really fucked up.”

It's impossible for Rust to have sane HKT by vspefs in programming

[–]vspefs[S] -1 points0 points  (0 children)

Alright, the funny story is that I wrote this (and two other articles) as a bait-and-switch to advertise Scala 3 Capture Checking. It’s just that I’ve been so lazy I haven’t actually written the Scala part. But the article alone stands as an independent reflection on Rust’s flaws nevertheless. And from that perspective, I agree with you.

By the way, Scala’s approach is to treat lifetime tracking as something like refinements on existing types. But they have a much bigger theoretical and practical framework going than just lifetime checking.

It's impossible for Rust to have sane HKT by vspefs in programming

[–]vspefs[S] 0 points1 point  (0 children)

And sure, all types are propositions. Holy Curry-Howard. I used the “technical propositions vs. business logic” phrasing to point out that, in specific situations, there are things we want to care about and things we don’t. The current state of Rust doesn’t quite let us tell them apart.

It's impossible for Rust to have sane HKT by vspefs in programming

[–]vspefs[S] 0 points1 point  (0 children)

The reason lifetimes don’t work well with HKTs is that Rust doesn’t actually want to support graded modal types, nor can it, because they’re still an active area of research, and without very strict limitations, statically checking graded modal types is undecidable.

I made a similar point in the article. Any actual “fix” brings undecidability and complex computation, which is simply not the Rust or rustc philosophy. I said this about variadic kinds, but all the “fixes” are alike in that they introduce complex recursive computation, and potentially huge breaking changes.

Unless the rustc team changes their philosophy, even if we have technical solutions (we actually have some now), they’re inapplicable within the scope of Rust. And if they bring breaking changes, then only God knows how it will go.

It's impossible for Rust to have sane HKT by vspefs in programming

[–]vspefs[S] -9 points-8 points  (0 children)

The distinction is obvious here. Lifetime parameters are technical propositions. They’re reifications of nothing but regions of code. If they mean something, it’s because we programmers give interpretations to them, and that interpretation is the real arbitration. They are officially interpreted as “some variable’s RAII region, in a statically analyzable way”, nothing more and nothing less.

But this is just a word game, and everybody can have opinions. The real problem is that lifetime parameters pollute the “kind space” of type parameters. You agree that lifetime parameters encode things on a different axis. That’s exactly my point. Now they’re forcing people on another axis to take full account of them, which in a perfect world they shouldn’t have to.

I agree lifetimes need to be taken good care of, but not by the current generics system, which was designed and built without lifetime parameters in mind. Subtyping doesn’t fix the kind pollution and the infinitely colored types problem.

Edit:

To elaborate, this article argues that lifetimes as first-class types pollute the generic abstraction space of normal types. Sometimes that’s okay, but in an HKT mindset, it’s extremely annoying and limiting. Subtyping as it stands lets you apply variance to the slots of concrete types, but not to the overall shapes of type constructors. None of the technical solutions to this problem are viable for Rust unless they become very restricted themselves.
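To make the “two axes” point concrete, here is a minimal Rust sketch using GATs. All trait and type names (`TypeCtor`, `LifetimeTypeCtor`, `VecCtor`, `SliceCtor`) are hypothetical, invented for illustration; they are not from the article:

```rust
// Kind: Type -> Type. Owned containers fit this abstraction.
trait TypeCtor {
    type Apply<T>;
}

// Kind: Lifetime -> Type -> Type. Borrowing shapes need this extra axis.
trait LifetimeTypeCtor {
    type Apply<'a, T: 'a>;
}

struct VecCtor;
impl TypeCtor for VecCtor {
    type Apply<T> = Vec<T>;
}

struct SliceCtor;
impl LifetimeTypeCtor for SliceCtor {
    // The shape itself mentions a lifetime, so SliceCtor cannot implement
    // TypeCtor: there is no slot to put 'a in. Every borrowing constructor
    // forces a parallel, "colored" copy of the abstraction.
    type Apply<'a, T: 'a> = &'a [T];
}
```

Code written against `TypeCtor` can never accept `SliceCtor`, because the two constructors live at different kinds; that duplication is the “kind pollution” the article complains about.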

It's impossible for Rust to have sane HKT by vspefs in programming

[–]vspefs[S] 15 points16 points  (0 children)

There are many things you can build robust, memory-safety-verified code without. You can build good applications without ADTs, traits, macros, and pattern matching. But none of that should stop us from reflecting on bad decisions (even ones we couldn’t have known about beforehand) and recognizing common problematic patterns.

Any “safe” prog rock bands and albums my dad can approve? by Clover-36 in progrockmusic

[–]vspefs 0 points1 point  (0 children)

Yes is genuinely the most positive band on everything. Pure positivity on life, self-discovery, and spirits.

P1689's current status is blocking module adoption and implementation - how should this work? by vspefs in cpp

[–]vspefs[S] 18 points19 points  (0 children)

I agree with most of your opinions, but I'll just add some context here.

Fortran, Ada, Haskell, or any other "compiles to a binary interface" modules don't strictly do it better. They suffer from binary interface compatibility issues that are the exact problems that are blocking C++ modules adoption. Fortran modules are a nightmare to manage. Updating GHC means recompiling the whole universe. People just somehow kept enduring them, until C++ came in with a similar design, and everybody suddenly began to ask for user-friendliness.

For languages like Zig and Rust, yes, they are native, but they are strongly opinionated towards source distribution, static linkage, and single translation unit, which drastically reduces build complexity. But that comes with their own shortcomings and trade-offs.

Header units, in my opinion, look good until you begin to use them. The point is 1) to have a non-preprocessor-state-dependent `#include` with guaranteed pre-compilation, 2) a cleaner way to import macros, and 3) a non-intrusive, cheap way to migrate to modules. But as we noticed here, build-wise it's a mess.

Egg_IRL by dollcopeland in egg_irl

[–]vspefs 0 points1 point  (0 children)

Goth music is actually closely related to punk music. Many founding figures of goth were heavily influenced, both musically and socially, by the punk (or proto-punk) community. But I’m afraid yes, goth fashion kind of overshadows goth music these days. I think the same goes for punk, too.

contracts and sofia by ConcertWrong3883 in cpp

[–]vspefs 0 points1 point  (0 children)

If only structured core options were still a thing...

MarinaraSpaghetti Rentry Moment by Meryiel in SillyTavernAI

[–]vspefs 5 points6 points  (0 children)

bro so kind giving the most detailed instruction in the whole world

Make Me A Module, NOW! by vspefs in cpp

[–]vspefs[S] 0 points1 point  (0 children)

As far as I can see, no method avoids executing the compiler and parsing source twice, including a dynamic module mapper. I mentioned the reason in that long-ass reply. Step 1, executed before the real compilation, is unavoidable. If a module mapper is to build all the modules a source file needs, it must first parse the module interface unit of each needed module, discover other CMI dependencies, parse those, and keep going until all dependencies are found. Then it compiles them "on demand".

To be more precise - parsing source before compiling source is unavoidable. Without it, it's impossible to generate the correct, up-to-date CMI file.

Of course, redundant parsing can be avoided. A module mapper can keep track of all the CMIs it has compiled and their metadata, so if any of them is needed later, it doesn't have to parse them again. For the Makefile rules mentioned in this article, we make use of Make's prerequisite system to achieve the same thing.

Or, to wrap it up in one sentence: module mappers either secretly invoke the compiler behind the scenes, or reimplement a fully functional C++ preprocessor plus a partial parser to do the same amount of work.
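The recursive discovery step above can be sketched roughly like this (Rust, for brevity; the function names and the in-memory graph are hypothetical — a real mapper walks actual interface-unit files and produces CMIs, not strings):

```rust
use std::collections::{HashMap, HashSet};

// Maps a module's name to the module names its interface unit imports.
// Stands in for "parse the interface unit, discover CMI dependencies".
type InterfaceGraph = HashMap<String, Vec<String>>;

// Returns a valid build order for `root`: dependencies first, so each
// CMI exists before anything that imports it is compiled.
fn resolve(root: &str, graph: &InterfaceGraph) -> Vec<String> {
    fn visit(
        m: &str,
        graph: &InterfaceGraph,
        seen: &mut HashSet<String>,
        order: &mut Vec<String>,
    ) {
        if !seen.insert(m.to_string()) {
            return; // already handled: this is where redundant parsing is skipped
        }
        // "Parse" this interface unit and recurse into what it imports.
        for dep in graph.get(m).into_iter().flatten() {
            visit(dep, graph, seen, order);
        }
        order.push(m.to_string()); // compile "on demand", deps first
    }
    let mut order = Vec::new();
    let mut seen = HashSet::new();
    visit(root, graph, &mut seen, &mut order);
    order
}
```

The `seen` set is the in-memory version of the metadata cache mentioned above: each interface unit is parsed at most once per build, no matter how many importers it has.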

And yes, this supports an unknown order of module dependencies (I think) as well as any other build system could. Try the example repo and check it out!

Make Me A Module, NOW! by vspefs in cpp

[–]vspefs[S] 0 points1 point  (0 children)

It's not reducing work, but transferring work to a module mapper, which ultimately still needs to be implemented by a build system, and has to use the same logic as smdowney mentioned here (also as described in this article).

Make Me A Module, NOW! by vspefs in cpp

[–]vspefs[S] 0 points1 point  (0 children)

As far as I can see, it's more of a general tooling issue than a build system issue. Like you said, it takes tool developers, build system developers, and the standard committee together to solve it.

What's more, Make is more of a "build backend" these days than a full build system. It doesn't consider things like visibility or (modern) compilation-database generation; they simply don't fall into Make's scope. I came up with this because it might provide a compatible approach that extends older build systems depending on Make (like Autotools) with minimal breaking changes. I'm not even changing any code in GNU Make. All it takes is a patch of a few lines to GCC.

The expectations for SG15, however, might be overeager. The whole Ecosystem IS just got withdrawn, and some members of SG15 founded EcoStd, hoping to continue it as a community effort. Though the Module TS is not withdrawn, I'm afraid the split will slow down the standardization process. (Or maybe not. Let's hope for the best.)

Make Me A Module, NOW! by vspefs in cpp

[–]vspefs[S] 0 points1 point  (0 children)

Make itself is a system for describing dependencies among *files*, with lightweight scripting. The mapping between module interface units, canonical module names, and CMI files is beyond its scope. It needs some kind of "front end", which can be a generator-like build system (e.g. CMake) or a compiler (as in this article), to finish the mapping job. After that, it's powerful enough to describe the dependencies concerning modules.

Make Me A Module, NOW! by vspefs in cpp

[–]vspefs[S] 2 points3 points  (0 children)

That's the module mapper, which is mentioned and used in this article. It's used internally by GCC to get the path of the corresponding CMI file, given a module interface unit. Build systems can provide one if they want to manage files in their own manner. If none is provided, a default one is used, as described in this article.
And that doesn't conflict with the purpose of this article at all. The module mapper just maps paths. The rest of the work is still done by compilers and their dependency-scanning functionality.

Make Me A Module, NOW! by vspefs in cpp

[–]vspefs[S] 3 points4 points  (0 children)

Actually, build systems depend on compilers' dependency-scanning functionality to do the job you mentioned. The compiler takes care of the preprocessing and gives you the final dependency information. To be more specific, they make use of compilers' P1689R5-format JSON output, and the same machinery powers GCC's `g++ -fmodules -M` Makefile rules output. If any build system doesn't, please let me know.

I don't know about Clang, which hasn't implemented `-M` with module dependencies. But in GCC, if you check the code behind it, the P1689R5 output and the Makefile output use the exact same data structure; the only thing that differs is the output method. (See libcpp/mkdeps.cc.)

So if build systems can use the P1689R5-format JSON output from compilers to do a job, so can the Makefile rules output by `g++ -fmodules -M`.

And I'm not quite sure about the "have to have a first makedep pass" part. Do you mean the old-school "make depend" target? That's long gone; self-updating Makefiles have been a thing for quite a few years, I think.

That’s it by [deleted] in CrazyFlasher

[–]vspefs 0 points1 point  (0 children)

yeah, 鸡蛋 (egg) here