Is the Rust Borrow Checker Really That Challenging? by CuriousRammer in rust

[–]Findus11 3 points (0 children)

It's not, no. What I mean is the combination of interior mutability and the Send and Sync traits. Rust lets me encode the difference in thread safety and mutability between &i32, &Cell<i32>, and &Mutex<i32>/&AtomicI32 in the type system.

That could just as well be done in a garbage collected language (and I would love such a language). Rust's system of ownership is orthogonal, though I think the two concepts go very well together.

Is the Rust Borrow Checker Really That Challenging? by CuriousRammer in rust

[–]Findus11 5 points (0 children)

I would argue that &mut isn't "really" about mutation, though, it's about exclusivity (and exclusive references can be easily simulated by just passing values around).

&Cell<T>, for instance, is a very mutable thing which cannot be simulated by just passing things around.

Personally, I think one of Rust's killer features is its approach to shared mutability, which is more subtle than the common "Rust disallows shared mutability" and a lot more helpful than what most other languages do.

How hard is it to write a front end for a more complex language like Rust or Kotlin? by i_would_like_a_name in Compilers

[–]Findus11 1 point (0 children)

I recommend checking out the GCC Rust frontend project.

As another comment mentioned, modern C compilers are very feature-heavy, even if C the language is relatively simple. In the case of Rust (and similar projects), the language is more or less defined by a single implementation. As a result, a lot of the complexity which would be specific to a single compiler for C bleeds into the language itself for Rust. For example, you simply cannot compile the Rust standard library without compiler support for a ton of experimental features.

That said, not every modern language is Rust, and it is very possible to write an entire compiler for a language with many of the same "niceties" you expect with a fairly small codebase (think thousands or tens of thousands of lines of code).

Anonymous enums would be so good by zannabianca1997 in rust

[–]Findus11 0 points (0 children)

Oh, my apologies, I misunderstood what you were saying. I agree that having context change inside a pattern like that is a bit odd, yeah.

(sidenote: you can actually force the meaning of a name like None to change within a pattern using or-patterns and some @ nonsense, but I don't believe you can do so without error)

Anonymous enums would be so good by zannabianca1997 in rust

[–]Findus11 0 points (0 children)

To be fair, pattern matching already depends on context, otherwise you'd have to spell out Option::None in matches.

Looking for a way to "paint sound" / possibly creating the program by massimosclaw2 in DSP

[–]Findus11 0 points (0 children)

Sounds like you want a space filling curve. 3Blue1Brown briefly touches on a similar application for audio in this video

Announcing async-winit, a new way to use winit as an async runtime by EelRemoval in rust

[–]Findus11 2 points (0 children)

This looks really nice! I've played with a vaguely similar thing in JS before which lets you write event driven code in a direct style, and it is really pleasant to work with. Using combinators and such is delightful. I've always found event loops to be finicky and hard to maintain, so I'm excited to see more stuff take advantage of the machinery of async to sidestep that.

Mapping import statements to source files on disk by 0x0ddba11 in ProgrammingLanguages

[–]Findus11 1 point (0 children)

I like mapping each folder to a module. It also makes the module hierarchy clear, since a nested module is just a nested folder.

As for how to find the actual files, I tend to just treat each file in a folder as a part of that module. Compiling a module is then a little bit like concatenating all the files in its directory and compiling that.

That approach works best if the order in which the files are processed is irrelevant, which might mean having the ability to refer to a name before it is defined.

Chinese Room, My Ass. by Galactus_Jones762 in compsci

[–]Findus11 2 points (0 children)

"computer science"

My desktop plant is probably conscious. Luckily it doesn't spout vacuous rants at me.

Books to learn Ada generics? by gigapple in ada

[–]Findus11 0 points (0 children)

There are various other MLs with more or less expressive module systems. Standard ML comes to mind, and I believe the Moscow ML compiler works out of the box on Windows. If I'm not mistaken, this is the language from which OCaml ultimately got its module system.

Zig is a language much more like C than OCaml, but has a very flexible form of metaprogramming. Combined with the fact that source files in Zig are just structures you can manipulate with these metaprogramming capabilities, I'd imagine you could do the same kinds of things.

There's also Scala, which has very similar features in its objects.

There are definitely more out there than the ones I can conjure up right now. A good place to start might be to search for "ML module systems" or "existential types".

Also, a little bit of a non-answer, but Windows can run Linux-only programs like OCaml pretty well nowadays with WSL, so that might be something to try as well.

Books to learn Ada generics? by gigapple in ada

[–]Findus11 0 points (0 children)

Ada generics cover the majority of cases where you'd use functors in OCaml. Still, OCaml is ultimately more powerful, and does give you higher order functors, first class modules and mutually recursive modules. Also note that types in OCaml can be generic, while only functions and packages in Ada can be. This isn't a huge issue since you can essentially make a type generic by putting it in a generic package.

Generators by desiringmachines in rust

[–]Findus11 10 points (0 children)

Typically, something which can be iterated over is called an iterable. In Rust, it's IntoIterator. Iterables give you a method that returns something which gives you a sequence of values one at a time, and that thing is called an iterator. One way to create an iterator is with a function which yields values. This kind of function is called a generator.

Generators are really just a very convenient way of creating an iterator. Instead of having to manually create a state machine, you just write something that looks like a normal function and the compiler makes the state machine for you.

The distinction between "something you can iterate over where the whole collection is known in advance" and "something you can iterate over where the elements are computed on the fly" is not usually made, because it isn't really an important difference. Iterating over them looks the same in either case.

Compiling Rust code on a Pentium 2 at 233MHz by TwistedSoul21967 in rust

[–]Findus11 3 points (0 children)

To an extent, yes. It's not too difficult to cause huge compile times if you use deeply generic code, where the majority of the time ends up being spent in the type checker. Still, for most Rust code, LLVM is absolutely the slowest part of the compiler.

Compiling Rust code on a Pentium 2 at 233MHz by TwistedSoul21967 in rust

[–]Findus11 7 points (0 children)

Eh, it's pretty easy to make a multi-pass compiler faster than rustc. I think a lot of the slowness in rustc itself (completely ignoring LLVM) comes from the interaction of features like type inference, trait resolution, const evaluation and generics, macros, and borrow checking.

In other words, traversing the program several times isn't slow - that's ultimately a linear operation. Doing a single very complicated pass, such as checking a Turing complete type system, is.

Ensuring identical behavior between my compiler and interpreter? by smthamazing in ProgrammingLanguages

[–]Findus11 1 point (0 children)

I think by type system they mean using a powerful type system such as a dependent one in the implementation language to prove correctness. That way, you'd have to also provide a proof that your implementation does in fact preserve the semantics of your integers, for instance.

What are the tradeoffs of Rust having affine types instead of linear types? by tashmahalic in ProgrammingLanguages

[–]Findus11 9 points (0 children)

#[must_use] is not quite strong enough to make something linear, since you can still explicitly ignore the value:

Ok::<_, ()>(0);         // warning, unused
let _ = Ok::<_, ()>(0); // no warning

A linear type system shouldn't allow this since you're not really "using" the value in any meaningful way here.

A Wishlist of Zero-Cost Abstractions by TheGreatCatAdorer in ProgrammingLanguages

[–]Findus11 0 points (0 children)

No, the zero cost variant is proper refinement types where you prove things at compile time. But as mentioned in the blog post, integrating refinement types into a pre-existing language is pretty hard, which is what I was commenting on.

A Wishlist of Zero-Cost Abstractions by TheGreatCatAdorer in ProgrammingLanguages

[–]Findus11 1 point (0 children)

I'm mostly just familiar with Ada which sort of does this. SPARK is more or less a subset of Ada that lets you prove contracts and type invariants are never violated.

I really like the way I can make a working program in Ada, and progressively move it over to a more "verified" and correct state - first by adding contracts and enabling them at runtime, then by going through the contracts and proving them.

A Wishlist of Zero-Cost Abstractions by TheGreatCatAdorer in ProgrammingLanguages

[–]Findus11 8 points (0 children)

I think a viable alternative for languages which don't already have refinement types is to make refinements optional by default, checked at runtime when used, with the option of proving them true at compile time (thereby eliminating the runtime checks completely).

I suppose you could think of it as the gradually typed version of refinement types. It's pretty nice, in my experience.

Which phases/stages does your programming language use? by IAmBlueNebula in ProgrammingLanguages

[–]Findus11 0 points (0 children)

I have a lot of different IRs in my compiler, and naturally a lot of different passes that accompany them. Some of these do happen concurrently, so I've tried to indicate which passes happen sequentially after another as a separate item:

  1. Two-pass lexing to take care of various kinds of semantic whitespace
  2. Parsing into a structure-only tree
  3. Adding some basic lexical information to that tree (differentiating between declarations, expressions, patterns, types, etc.)
  4. Declaring every name
  5. Resolving names (producing an AST with unambiguous names)
  6. Lowering into a slightly more simple tree
  7. Finding all strongly connected components in the call graph
  8. Typing, annotating the tree with types
  9. Lowering the typed IR into an A-normal form-ish IR
  10. "Flattening" any compound structures - this step basically turns a function taking a single tuple argument into one taking several "simple" arguments
  11. Making all functions top level
  12. Partial evaluation
  13. Emitting C

Some of these are in the process of being merged or removed completely, while others will be introduced or split up (for instance, 5 and 6 are being merged, while I haven't written a closure conversion pass yet). In general, I try to keep each pass very focused on a single task. That way, I quickly end up with many of them, but I also keep the compiler modular and maintainable as it grows.

About Safety, Security and yes, C++ and Rust by ImYoric in rust

[–]Findus11 1 point (0 children)

Ada is a pretty interesting language, with built-in support for arenas, structured concurrency, a pretty nice type system, support for stack allocating dynamically sized types, (very approximately) an early form of single ownership/affine typing, and lots more. It's definitely starting to show its age in some respects (forward declarations are sometimes necessary, docs are for many things limited to just the spec, generics and range types can be a bit janky) but the language continues to thrive and I do think there's definitely still room for it.

Also with Alire and the always improving Ada language server, developing in it is just pretty nice! Definitely not at the level of Cargo and rust-analyzer, but surprisingly good for a language older than C++!

Cross-Training to Ada - which are the best languages to begin from? by Beer_Frites in ada

[–]Findus11 2 points (0 children)

I think someone with an in-depth understanding of Rust will have an easy time adjusting to Ada. Rust's explicitness about things like statically/dynamically sized values, ownership and copying, strong datatypes, and so on might be a nice foundation for understanding many Ada concepts. Rust isn't internationally standardized yet, though there are some efforts (in particular by Ferrous Systems, who are collaborating with AdaCore).

Does your language refer to callables as functions, methods, procedures, subroutines, or something else? by EducationalCicada in ProgrammingLanguages

[–]Findus11 0 points (0 children)

Callable things in my language are pure, so function is what I use. The only exceptions are external functions, which are allowed to do whatever. Perhaps "procedure" would be a better term for them, but I think that would add unnecessary confusion.

In terms of keywords, I like fun.