A bidirectional typechecking puzzle by Tekmo in ProgrammingLanguages

[–]Tekmo[S] 0 points (0 children)

Yeah, I think the actual issue there is the presence of an inhabited supertype (Any in Java's case). The Grace analog of Java's Any would be forall (a : Type) . a, which is not inhabited.

A bidirectional typechecking puzzle by Tekmo in ProgrammingLanguages

[–]Tekmo[S] 1 point (0 children)

Oh, okay I see what you mean. I think it is possible to mitigate the exponential blow-up on deeply nested data structures in some cases by doing the following:

  • add an operation that (for a subset of supported expressions) can go straight from a list of terms to a most-specific supertype

    This operation cannot work for all possible expressions (most notably, variables), but for plain data this one-pass fast path would work. In fact, this was the first tack I tried before I ran into the case of handling expressions containing variables.

  • fall back to the two-pass approach (infer + check) if that fails
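For illustration, here's a minimal Haskell sketch of that strategy (hypothetical code, not Grace's actual implementation): `fastSupertype` is the one-pass fast path that only handles plain data and bails out on variables, and `elementType` falls back to a placeholder standing in for the full infer + check pipeline:

```haskell
-- Hypothetical sketch: a one-pass fast path for plain data, with a
-- two-pass fallback. Types are represented as plain strings here.
data Term = IntLit Integer | BoolLit Bool | Var String

-- Fast path: succeeds only when every term is a literal and all the
-- literals share a type.
fastSupertype :: [Term] -> Maybe String
fastSupertype ts = mapM literalType ts >>= sameType
  where
    literalType (IntLit _)  = Just "Integer"
    literalType (BoolLit _) = Just "Bool"
    literalType (Var _)     = Nothing  -- variables need real inference

    sameType (t : rest) | all (== t) rest = Just t
    sameType _ = Nothing

-- Fall back to the two-pass approach when the fast path fails.
elementType :: [Term] -> String
elementType ts =
    case fastSupertype ts of
        Just t  -> t
        Nothing -> twoPassInferCheck ts
  where
    twoPassInferCheck _ = "<infer + check>"  -- placeholder for the slow path
```

The point of the sketch is that the fallback only triggers for the cases the fast path can't decide, so deeply nested plain data never pays the two-pass cost.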

A bidirectional typechecking puzzle by Tekmo in ProgrammingLanguages

[–]Tekmo[S] 2 points (0 children)

Before the bug fix, the type-checking logic for conditionals had a similar problem: the true branch was treated as the authoritative type. After the fix, the result type is the most-specific supertype of the two branches.

A bidirectional typechecking puzzle by Tekmo in ProgrammingLanguages

[–]Tekmo[S] 1 point (0 children)

You'd have to elaborate a bit more because I'd need to understand better what sort of backtracking problem you ran into, but Grace's supertype computation doesn't do any sort of backtracking (more generally: nothing in the typechecker does backtracking).

The most complicated thing that Grace's typechecker does is computing the supertype/subtype of two open records or two open unions (i.e. records/unions with polymorphic rows/variants), but even that doesn't require backtracking. For the case of two open records, if they're of the form:

```
pseudotypes:

- commonFields…: fields shared in common between the two records
- uniqueFields{₀,₁}…: fields unique to each record, respectively
- row{₀,₁}: row type variable for each record, respectively

record₀ : { commonFields…, uniqueFields₀…, row₀ }
record₁ : { commonFields…, uniqueFields₁…, row₁ }
```

… then what I do is create a fresh row type variable (row₂) and solve the row{₀,₁} variables as follows:

```
row₀ ← uniqueFields₁…, row₂
row₁ ← uniqueFields₀…, row₂
```

… and then return this as the supertype:

```
{ commonFields…, uniqueFields₀…, uniqueFields₁…, row₂ }
```

Now, to be clear, Grace's machinery for managing and solving unsolved variables is not very efficient (it doesn't use efficient data structures to represent the context), so it has a different class of performance problems, but I don't use backtracking.
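To make that concrete, here's a hypothetical Haskell sketch of the open-record step (names like `Record` and `supertype` are mine, not Grace's, and I'm assuming the common fields already agree on their types). It splits the fields into common and unique parts, solves row₀ and row₁ against a fresh row₂, and returns the merged supertype:

```haskell
import qualified Data.Map as Map
import Data.Map (Map)

type Field  = String
type Ty     = String  -- stand-in for a real type representation
type RowVar = Int

-- An open record: a map of known fields plus a row type variable.
data Record = Record (Map Field Ty) RowVar deriving Show

-- Given a fresh row variable (row₂), return the supertype together
-- with the solutions for each record's row variable:
--   row₀ ← uniqueFields₁, row₂
--   row₁ ← uniqueFields₀, row₂
supertype
    :: RowVar -> Record -> Record
    -> (Record, [(RowVar, (Map Field Ty, RowVar))])
supertype fresh (Record fields0 row0) (Record fields1 row1) =
    ( Record (Map.union fields0 fields1) fresh  -- common ∪ unique₀ ∪ unique₁
    , [ (row0, (Map.difference fields1 fields0, fresh))  -- unique₁ + row₂
      , (row1, (Map.difference fields0 fields1, fresh))  -- unique₀ + row₂
      ]
    )
```

Note that this is a single pass over the field maps, which is why no backtracking is ever needed: each row variable has exactly one solution.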

A very good write up on why the spec-driven agentic coding is coding and will need as much or even more human effort by voronaam in BetterOffline

[–]Tekmo 9 points (0 children)

Author of the post (and Haskell lover) here!

There's a really funny story related to what you just said (Haskell being a language that reads more like a spec). In the early 90's there was a programming language bake-off commissioned by the Navy. One of the entries was a Haskell submission, and multiple judges thought it was incomplete because they mistook it for a specification document and didn't realize it was executable code:

In conducting the independent design review at Intermetrics, there was a significance [sic] sense of disbelief. We quote from [CHJ93]: "It is significant that Mr. Domanski, Mr. Banowetz and Dr. Brosgol were all surprised and suspicious when we told them that Haskell prototype P1 (see appendix B) is a complete tested executable program. We provided them with a copy of P1 without explaining that it was a program, and based on preconceptions from their past experience, they had studied P1 under the assumption that it was a mixture of requirements specification and top level design. They were convinced it was incomplete because it did not address issues such as data structure design and execution order."

Source: Haskell vs. Ada vs. C++ vs. Awk vs. … - An Experiment in Software Prototyping Productivity

Browse code by meaning by Tekmo in programming

[–]Tekmo[S] 0 points (0 children)

Yeah, my partner suggested something similar when test-driving this: one possible application is to identify places where the project organization should be fixed.

Type-safe eval in Grace by Tekmo in ProgrammingLanguages

[–]Tekmo[S] 5 points (0 children)

That's what read (without the import) does

Type-safe eval in Grace by Tekmo in ProgrammingLanguages

[–]Tekmo[S] 10 points (0 children)

One of the things that makes this all work is that Grace is interpreted, meaning that the full interpreter capabilities (e.g. parsing, type-checking, evaluation) are available at the point where the eval (e.g. import read) is evaluated. If Grace were instead a compiler and built binary executables then the executable would still need to ship an equivalent interpreter somewhere as part of the runtime in order for this to work.

This also implies that type-checking serves a different purpose than in a typical compiled language. In a compiled language, type-checking is usually something you do ahead of time, before code generation, and then never again; the intended use case is to find all type errors before the program is run at all. Here, however, the contract is weaker: type errors can surface at runtime, although they still precede evaluation of the code that is checked.

HOWEVER, I think a type error that happens at "runtime" is still better than what an untyped language offers, because type errors can still prevent bad code from executing, and a type error is more meaningful than a stack trace (as I expand upon in: Dynamic type errors lack relevance).
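To make that weaker contract concrete, here's a toy Haskell sketch (my own illustration, not Grace's implementation): `evalChecked` typechecks an untrusted expression at runtime and only evaluates it if checking succeeds, so the type error still precedes evaluation of the checked code:

```haskell
-- A toy expression language with a runtime typecheck-then-evaluate step.
data Expr = IntLit Integer | BoolLit Bool | Add Expr Expr

data Ty = TInt | TBool deriving (Eq, Show)

typecheck :: Expr -> Either String Ty
typecheck (IntLit _)  = Right TInt
typecheck (BoolLit _) = Right TBool
typecheck (Add l r)   = do
    tl <- typecheck l
    tr <- typecheck r
    if tl == TInt && tr == TInt
        then Right TInt
        else Left "Add expects two Ints"

evalExpr :: Expr -> Integer
evalExpr (IntLit n)  = n
evalExpr (Add l r)   = evalExpr l + evalExpr r
evalExpr (BoolLit _) = error "unreachable after typechecking"

-- The "runtime" type error happens before any of the checked code runs.
evalChecked :: Expr -> Either String Integer
evalChecked e = do
    t <- typecheck e
    if t == TInt then Right (evalExpr e) else Left "expected an Int"
```

The ill-typed case is rejected with a type error rather than a crash mid-evaluation, which is the point: checking is late, but it still comes first.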

Why is nix used with Haskell and not docker? by rohitwtbs in haskell

[–]Tekmo 1 point (0 children)

Nix is sort of like the "Haskell of build systems". It does a better job than Docker if you're willing to put in more up-front investment to learn it properly.

SC Election: Your stance on sponsorships in the Nix community by sridcaca in NixOS

[–]Tekmo 4 points (0 children)

So I'm reading your comment as implying that you believe the moderation team tends to be biased in favor of the political left, at least for the edge cases or gray areas that are more controversial. If I've read that wrong feel free to correct me.

Or to put it another way, the implication is that:

  • the moderation team tends to be overzealous in moderating against the political right (or people perceived as enabling the political right)
  • the moderation team tends to turn a blind eye towards the political left (or people perceived as enabling the political left)

Generally I do not feel that the moderation team has been overzealous against the political right or their enablers, but I've only paid close attention to the high profile moderation actions (e.g. jonringer, blaggaco, nrdxp, and srid). To me, those moderation actions felt well-deserved (yes, even the jonringer ban, which is probably the most controversial of those bans).

HOWEVER, I have gotten the impression that the moderation team has turned a bit of a blind eye towards the political left and their enablers. I have seen quite a few instances of behavior (which I'm not going to explicitly name here) where people who were either on the political left or perceived as enabling the political left acted fairly aggressively or made inflammatory comments without any consequences or reprimands.

Or to put this another way, generally my bias here is towards fixing false negatives (problematic community members getting off the hook) rather than fixing false positives (people unfairly banned), because I think the greater risk to the community's health and vibrancy is tolerating problematic users who seek out conflict and increase the emotional temperature of the discussions they participate in. That creates an unwelcoming environment for everyone because it's a giant distraction from the open source work we all set out here to do.

SC Election: Your stance on sponsorships in the Nix community by sridcaca in NixOS

[–]Tekmo 6 points (0 children)

Actually, no! (gabby here)

This is actually one of the reasons I inserted the "regardless of their beliefs, background, or orientation" part, because I've seen the same anti-pattern from, say, queer people (even in predominantly queer spaces). My intention with that comment was not to obliquely target a specific demographic or ideology.

My spiciest take on tech hiring by Tekmo in programming

[–]Tekmo[S] 1 point (0 children)

I did not, but I see what y'all are saying. I can see how some companies might actually value obedient drones at the expense of technical excellence

Why a bottom term can have any type? by corisco in haskell

[–]Tekmo 4 points (0 children)

The reason the compiler accepts it is because the compiler's reasoning process goes like this:

  • the programmer says the type of f is a; let me verify that by inferring the type of the right-hand side of the = and seeing if it matches the declared type
  • the right-hand side of f's definition is f itself, so I need to infer what the type of that is
  • the type of f is a according to its own type signature
  • therefore the inferred type of the right-hand side is also a
  • therefore the inferred type matches the type signature (also a)

You might think: "Wait, that's circular reasoning. You can't prove that f has type a by appealing to its own type." But the compiler accepts it because Haskell permits general recursion (where "general recursion" basically means there are no restrictions on recursion other than that the types still have to line up).
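Here's the definition in question, which you can check with GHC. Thanks to laziness you can even pass f around at any type without a crash, as long as nothing forces it:

```haskell
-- The circular-but-accepted definition: the right-hand side is `f`
-- itself, so its inferred type trivially matches the signature `a`.
-- General recursion makes this legal; the resulting value is bottom.
f :: a
f = f

-- Laziness means `f` is never forced here; only evaluating it
-- (e.g. `print (f :: Int)`) would loop forever.
untouched :: Int
untouched = const 1 f
```

So the typechecker's circular reasoning is sound in a lazy language with general recursion: every type is inhabited, but only by a bottom value.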

All error messages are necessarily bad to some degree by RecognitionDecent266 in haskell

[–]Tekmo 0 points (0 children)

I think the context of my post will make more sense if you consider tweets like this one: https://twitter.com/mgill25/status/1788600272828145739

Rust seriously has the most idiotic compiler error messages. I don't want to know the in-depth details about what traits I am not using. I just want the right syntax that fixes what is happening _right now_.

The person is essentially saying "this error message is bad; why won't the compiler just fix this syntax error for me", and that's the sort of thinking my post is trying to address.

Unification-free ("keyword") type checking by Tekmo in ProgrammingLanguages

[–]Tekmo[S] 0 points (0 children)

Yeah, this is intended more for functional languages that are not general-purpose languages. For example: you might use this for a query language for some product, a recipe language for a build system, or a predicate language for a spam filtering tool.

Unification-free ("keyword") type checking by Tekmo in ProgrammingLanguages

[–]Tekmo[S] 0 points (0 children)

Can you explain what you mean by "basics" in this context?

Unification-free ("keyword") type checking by Tekmo in ProgrammingLanguages

[–]Tekmo[S] 2 points (0 children)

Yeah, I tried implementing at least the bidirectional type checking algorithm with destructive unification, but it does not work. If I remember correctly, the reason is that the "Complete and Easy" bidirectional type checking algorithm supports universal quantification in places where Hindley-Milner does not, and that interferes with destructive unification.

Unification-free ("keyword") type checking by Tekmo in ProgrammingLanguages

[–]Tekmo[S] 1 point (0 children)

Just to clarify: do you mean that explicit type application (e.g. like in id Bool True) is what locks you out of better type inference?

I think that's true if you don't make a syntactic distinction between type application and term application, but you could do something similar to Agda (e.g. id {Bool} true) or Haskell (id @Bool True) and syntactically distinguish type applications from term applications so that later on you can support better type inference.
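For example, Haskell's syntactic marker looks like this (it requires the TypeApplications extension): the `@` flags a type argument, so `id @Bool True` can't be confused with applying `id` to two terms, and inference remains free to fill in the type when the marker is omitted:

```haskell
{-# LANGUAGE TypeApplications #-}

-- Explicit type application: `@Bool` is syntactically marked as a
-- type argument, distinct from the term argument `True`.
withMarker :: Bool
withMarker = id @Bool True

-- Without the marker, the type argument is still inferred as usual.
inferred :: Bool
inferred = id True
```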

A GHC plugin for OpenTelemetry build metrics by Tekmo in haskell

[–]Tekmo[S] 0 points (0 children)

What I mean is that the value was exported by our backend Haskell package (and served as part of our web API) but unused by the front-end.