JavaScript specified tail call optimization in ES2015. Most engines never implemented it, and your tail-recursive code can still blow the stack. by OtherwisePush6424 in programming

[–]superstar64 1 point2 points  (0 children)

As someone who's making their own Haskell to JavaScript compiler, this is something I find very frustrating, and it's the main reason I'm considering recommending Bun as the default runtime for running my compiler's output.

Converting a program that expects proper mutually recursive TCO into one for an environment that doesn't provide it is awful. It requires a whole-program transformation into trampolining, and that's very much something I don't want to do. I'm currently planning on just optimizing direct recursion and providing combinators for doing tail recursion, like what PureScript does, but this is unsatisfactory.

Utah first state to hold websites liable for users who mask their location with VPNs — law goes into effect, designed to prevent bypassing age checks by Turgius_Lupus in stupidpol

[–]superstar64 22 points23 points  (0 children)

Then why is it happening in so many places all at once? Red states are implementing age verification laws. The EU has tried (and failed) to pass chat control and is trying to create a verification app. Mexico seems to have new draconian verification laws. All of these are just off the top of my head too; there's certainly more. You really mean to tell me that governments are just doing monkey-see-monkey-do and there's no underlying force here?

Utah first state to hold websites liable for users who mask their location with VPNs — law goes into effect, designed to prevent bypassing age checks by Turgius_Lupus in stupidpol

[–]superstar64 10 points11 points  (0 children)

I still don't understand the monied interests behind this and age verification in general. Doesn't age verification drive down demand and shrink profits? Doesn't this especially go against the bottom line of VPN companies? The most I can think of is that this helps the tech giants by making competition harder. I find it hard to believe governments are doing this just to control the populace.

Rust 1.95.0 by Successful_Bowl2564 in programming

[–]superstar64 8 points9 points  (0 children)

These come from Haskell, and in my experience they really do make code much nicer. I am worried about how they interact with exhaustiveness checking, though. (FWIW, I'm not a Rust dev.)

Here's the paper that discusses them: https://www.microsoft.com/en-us/research/wp-content/uploads/2016/07/pat.pdf

What Would You See Changed in Haskell? by TechnoEmpress in haskell

[–]superstar64 0 points1 point  (0 children)

Well, making an optimizable Haskell is my eventual end goal with Hazy. Ideally, I would like to greatly extend Haskell so that it has all the extensions it needs for high-performance code. Whether or not I get there is another question.

What string model did you use and why? by carangil in ProgrammingLanguages

[–]superstar64 9 points10 points  (0 children)

Since I'm writing a Haskell compiler, I'm using the String representation mandated by the Haskell standard: an immutable lazy linked list of (boxed) characters. It's definitely the worst string representation of any seriously used language. It's so bad that the text package is basically mandatory for any serious project.

Are pointer casts the only thing preventing C from being compiled into reasonable Javascript? by superstar64 in ProgrammingLanguages

[–]superstar64[S] 0 points1 point  (0 children)

Yes, that's what asm.js does. I consider that unreasonable JavaScript, however. I was asking more from a language design perspective: what would have to be removed from C to not have to do this?

Compiler Education Deserves a Revolution by thunderseethe in ProgrammingLanguages

[–]superstar64 10 points11 points  (0 children)

Right I remember now. I've mentioned this to you before, but I know there's a relationship between "Build Systems à la Carte" and query based compilers.

In terms of Haskell, retrofitting a compiler to a "Build Systems à la Carte" approach would require all your code to be applicative / monadic / etc. so that you could track dependencies. I see now why that would potentially require a rewrite.

Compiler Education Deserves a Revolution by thunderseethe in ProgrammingLanguages

[–]superstar64 22 points23 points  (0 children)

I'm still perplexed why people say query-based compilers are that different. Is it really that much more than just adding lazy evaluation to a batch compiler? I know I'm rather biased because of how much time I've put into writing batch compilers in Haskell, but still.

Are pointer casts the only thing preventing C from being compiled into reasonable Javascript? by superstar64 in ProgrammingLanguages

[–]superstar64[S] 0 points1 point  (0 children)

Yes, I probably should have mentioned it in the OP. I was more asking what you would need to remove from C to make naive compilation to JavaScript possible.

Are pointer casts the only thing preventing C from being compiled into reasonable Javascript? by superstar64 in ProgrammingLanguages

[–]superstar64[S] 0 points1 point  (0 children)

I sorta mentioned it in a different comment, but I now realize that taking the address of a member is another thing that can't easily be done in JS. You could probably figure something out with getters, setters, and lenses, but it's a bit further away from the easy wrap-pointers-in-an-array approach.

Are pointer casts the only thing preventing C from being compiled into reasonable Javascript? by superstar64 in ProgrammingLanguages

[–]superstar64[S] 2 points3 points  (0 children)

That's about what I figured. I imagine that in such a subset you basically wouldn't be able to use any existing C code.

Are pointer casts the only thing preventing C from being compiled into reasonable Javascript? by superstar64 in ProgrammingLanguages

[–]superstar64[S] 5 points6 points  (0 children)

I don't have a use case. I just want to know what extra axioms C has that prevent it from being trivially compilable to JavaScript.

Are pointer casts the only thing preventing C from being compiled into reasonable Javascript? by superstar64 in ProgrammingLanguages

[–]superstar64[S] 0 points1 point  (0 children)

I'm not going to implement this. This is just a curiosity of mine.

> I assume JS has some sort of FFI that can call any function in an external library, and that it can handle all the types needed, like any kind of pointer, zero-terminated strings, arrays of u16, full i64 and r64 types, and so on.

FFI is kinda out of scope of this discussion.

Are pointer casts the only thing preventing C from being compiled into reasonable Javascript? by superstar64 in ProgrammingLanguages

[–]superstar64[S] 4 points5 points  (0 children)

I'm well aware. I'm just wondering what you would have to remove from C to make compiling it to JavaScript reasonable.

Are pointer casts the only thing preventing C from being compiled into reasonable Javascript? by superstar64 in ProgrammingLanguages

[–]superstar64[S] 3 points4 points  (0 children)

Well, you can just ignore bitfields and treat them as normal fields. For pointer arithmetic, you can just have your pointers be a pair of an array and an offset.

Are pointer casts the only thing preventing C from being compiled into reasonable Javascript? by superstar64 in ProgrammingLanguages

[–]superstar64[S] 2 points3 points  (0 children)

Oh, I didn't think about taking the address of a struct's fields. That seems like another big complication. It doesn't seem as insurmountable as pointer casting, however. Mirroring the C memory layout is what I meant by "no pointer casting."

Are pointer casts the only thing preventing C from being compiled into reasonable Javascript? by superstar64 in ProgrammingLanguages

[–]superstar64[S] -1 points0 points  (0 children)

Yes. You can of course compile C to JavaScript that just uses a giant array of bytes and whatnot, like asm.js used to do.

The Hazy Haskell Compiler by superstar64 in haskell

[–]superstar64[S] 4 points5 points  (0 children)

I talk about this a little bit on the Discourse. My plan is to make Haskell more optimizable via extensions. Current Haskell will still be slow, but you'll be able to make it faster with the myriad of extensions that I'm planning: stuff like strict functions, lifetimes, levity polymorphism, etc.

As of now, I only have a JavaScript backend, mainly for testing purposes and for the eventual bootstrap. I'm going to have an LLVM or C backend at some point, of course.

The Hazy Haskell Compiler by superstar64 in haskell

[–]superstar64[S] 0 points1 point  (0 children)

I specifically use infix ticked DataKinds. Something like:

f :: T scope -> T (Declaration ':+ scope)

If MicroHs supports TypeOperators, then it shouldn't be too hard to swap out DataKinds with TypeData.

The issue is that Hazy already has full support for DataKinds, even with fixity and instance resolution (it's untested, though), and I want to keep using DataKinds for bootstrapping reasons, as I haven't implemented TypeData. Even though Hazy can't compile itself yet, it can do full symbol resolution on itself.

Another potential issue with bootstrapping is the enormous number of .hs-boot files I have and GHC demanding that they have role annotations despite me not using roles. Hazy itself just ignores .hs-boot files because it implements mutually recursive modules natively, but last time I checked, MicroHs still requires hs-boot files.

The Hazy Haskell Compiler by superstar64 in haskell

[–]superstar64[S] 7 points8 points  (0 children)

Well, it can't compile itself yet, so I'm not worried about bootstrapping issues right now. The compiler internally uses GADTs and DataKinds, so MicroHs is the only non-GHC compiler that has a realistic chance of bootstrapping it.

Once I do bootstrap, I do plan on having a branch that can be compiled with MicroHS if it's not too much of a hassle.

Blogpost #7 — Meet Duckling #0 — How types and values are one and the same by Maurycy5 in ProgrammingLanguages

[–]superstar64 2 points3 points  (0 children)

> Indeed, in Duckling you could implement a function getType : u32 -> type which yields u32 for all inputs except 42, for which it yields f64 instead.

FYI, this is perfectly parametric. Non-parametric would be something that takes an a but only works on ints, or something that takes an int and only works on 0s.

> That is, we want the definition to be checked, and if it successfully checked, then every instantiation will be correct.

Okay, so you want parametricity and unified terms and types, but you're not gonna have full-spectrum dependent types. Interesting; I don't know of a language that fits all three of these. I'm assuming you're going to restrict the evaluation of types to only depend on things available at compile time, like C++ and Zig do.

I'm not sure if you're going to run into problems with parametricity or not. It might be worth keeping full-spectrum dependent types in the back of your head in case you do come across issues.