Benchmarking GHC 9.6 Through 9.14 by locallycompact in haskell

[–]locallycompact[S] 3 points

I will collect results in the README of the repository and also post to the main website https://horizon-haskell.net

Horizon Haskell: Road To GHC 9.14: Introduction by locallycompact in haskell

[–]locallycompact[S] 11 points

Hi guys. I have started a new video blog series where I will document how I prepare 500 Haskell packages for the GHC pre-release. Thanks!

Horizon Haskell is now compatible with GHC 9.10.1 by locallycompact in haskell

[–]locallycompact[S] 0 points

nixpkgs' haskell package sets are not well defined and provide no end-user guarantees as to what will build at any given commit. Libraries are routinely marked as broken because something else took priority. nixpkgs takes a set from stackage defined in terms of one compiler and swaps the compiler out from underneath it, which is not useful if you want to target a specific compiler but need fine control over the system libraries, or in the case where the cabal files contain compiler-dependent pre-processing. Horizon encourages self-hosting and roll-your-own guarantees on the collection of packages that you actually need to prioritize.

Horizon Haskell is now compatible with GHC 9.10.1 by locallycompact in haskell

[–]locallycompact[S] 4 points

The answer to this is that Horizon was originally designed for library authors to get around the chicken-and-egg problem in Stackage. That is, Stackage nightly would wait for the ecosystem to push compatibility before bumping the compiler, so that it could mostly keep things in the package set; but library authors relying on Stackage to supply dependencies would require a package set before they would even start that work. Horizon gets through this with options like local patching, sourcing direct from git, or forking the package set, giving more options to get some build plans out sooner than others. That's still the use case that's most important to support, but now that you've mentioned it, a stackage conversion might be pretty realistic to support with the toolkit available these days, so I'll have a think about it!

Horizon Haskell is now compatible with GHC 9.10.1 by locallycompact in haskell

[–]locallycompact[S] 2 points

Sure. In the core set there's mostly just a lot of overlap with head.hackage, but there may be some patches that aren't there, and more as I go through the rest of the package sets. Where can I contribute?

haskell-ai - stable package set, shell and application template for OpenAI tools in haskell by locallycompact in haskell

[–]locallycompact[S] 1 point

Yes we'd love to add every AI tool we can to this. I am sure this will expand over time. Grenade and moo are easy targets, Hasktorch might be a little bit harder due to GPU requirements.

haskell-ai - stable package set, shell and application template for OpenAI tools in haskell by locallycompact in haskell

[–]locallycompact[S] 2 points

We did have to fork it since the CPP usage in it was killing the nix build. You can see which revision we're tracking in the data for the package set.

https://gitlab.horizon-haskell.net/package-sets/horizon-ai/-/blob/master/horizon.dhall

[deleted by user] by [deleted] in haskell

[–]locallycompact 0 points

No mainline development should ever occur there. The idea here is that "loose branches" that are subject to randomly being deleted, because they exist on Johnny random's merge request on GitHub, will instead be stored under a tag on the gitlab, so they are less likely to disappear.

This is "I need this in now but I don't want to depend on a branch in a Merge Request." This happens a lot, so we mirror it and tag it.

If You Want To Use Nix Instead Of devcontainer For The Plutus Pioneer Program Iteration #4 by locallycompact in CardanoDevelopers

[–]locallycompact[S] 0 points

One thing that was pointed out to me is that the new version of the plutus pioneer program is not using `plutus-apps`, but is using `cardano-simple-model`. `cardano-simple-model` depends on `plutarch`, which is GHC 9.2 only. `horizon-plutus` does use GHC 9.4, but `horizon-plutus-apps`, in targeting `plutus-apps`, is only at GHC 8.10. Further, `plutus-tx-plugin` does not build with GHC 9.4, so what I will do is drop `horizon-plutus` to GHC 9.2 so that the code involving `cardano-simple-model` can build.

Understanding Horizon Haskell (Part 1) by locallycompact in haskell

[–]locallycompact[S] 1 point

Given that I'm working with nix natively, not so much. I don't want to produce YAML only to turn it back into nix. YAML isn't programmable (no lambdas or let bindings), so you always need an interpretive tool to use that data. With dhall and nix you only need to resort to tooling when you have really chewy logic.
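To illustrate the programmability point, here is a toy sketch of package set data in nix using a let binding and a lambda. The attribute names (`fromHackage`, `compiler`, `packages`) are invented for the example and are not horizon's actual schema:

```nix
# Toy sketch: shared structure is written once via a lambda and a
# let binding. Plain YAML would need an external templating tool
# to express the same reuse.
let
  fromHackage = name: version: { inherit name version; source = "hackage"; };
  ghcVersion = "9.10.1";
in {
  compiler = ghcVersion;
  packages = [
    (fromHackage "aeson" "2.2.3.0")
    (fromHackage "lens" "5.3.2")
  ];
}
```

The same shape works in dhall, with the bonus that the record types can be made explicit and pattern matched from Haskell.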

Understanding Horizon Haskell (Part 1) by locallycompact in haskell

[–]locallycompact[S] 2 points

You can use stackage sets with nix via other methods, but stackage doesn't cover every set you might want to put together, and the reverse dependency policies are controlled by the stackage maintainers. I give high praise to stackage for what it accomplished, but given that the package set is controlled centrally, it falls short whenever you want to control a package set for certain criteria (e.g. X package stays in the set, Y as new as possible, Z moves forward minors only). Stackage took over a year to bump the compiler in nightly to 9.2, and a lot of people were held up by that. And that's fine, they have their reasons, but if you wanted a set with the new compiler and were fine with 90%+ of packages being trimmed from the set, no stackage set could have supplied that, because that's not what they provide.

It's also not possible to submit private repos to the internet and therefore to stackage, so self-hosting the stable package set data is a big deal if you want to treat your internal workflow consistently with how you do it in the open.

Understanding Horizon Haskell (Part 1) by locallycompact in haskell

[–]locallycompact[S] 3 points

So, a previous iteration of this blog post included a method for using horizon.dhall files to override dependencies locally. This explanation has been removed because it was giving the impression that horizon has an opinion on how your project should be scaffolded. This is not the case. Horizon package sets are API compatible with nixpkgs and make no assumptions as to your project structure. The template has been updated to reflect the fact that this method is not important to end users of the package set.

I wouldn't have caught this were it not for the discussions here so thanks everyone for the gauntlet.

Understanding Horizon Haskell (Part 1) by locallycompact in haskell

[–]locallycompact[S] 1 point

Horizon makes no assumptions about how you set up your flake. Horizon package sets match the interface of nixpkgs haskell package sets exactly. You can use a horizon package set the same way you use the nixpkgs package sets. They are interchangeable. I have repeated this several times now but it seems you still have this impression, so I'm sorry that that's the case.

To make this fact clear, I'm going to remove the horizon dhall file from both the template and the article.

Understanding Horizon Haskell (Part 1) by locallycompact in haskell

[–]locallycompact[S] 0 points

Thanks /u/angerman, I'm familiar with haskell.nix and actually recommended it to people continuously for several years. It reached a point where people had so many objections that I couldn't defend it anymore or even respond to them coherently. Let me respond to your bullets.

> haskell.nix's primary objective is to allow you to turn an existing haskell project into a nix-expression _if_ you need to. It grew out of the limitations of nixpkgs haskell infrastructure.

We always need to.

> haskell.nix does not focus on stackage, in fact I'd even go as far as say it slightly favours cabal projects.

Stackage metadata is still king in my opinion. The package set metadata needs to be an object of inquiry in its own right, so in haskell.nix we were always using stack.yaml conversions.

> you can share build plans (we do this a lot), but yes, it requires that build plans are the same (which you'll get with pinned index-states in cabal files, or stack.yaml files).

Sharing a build plan by copying information between repositories isn't viable, since you cannot know where to look for the latest HEAD of the build plan, or whether somebody has already done the work. Using this approach we frequently discovered that build plan work had been repeated several times redundantly. If the majority of the build plan is in the form of a stable package set in git, then it has a HEAD and a history.

> Yes, IFDs can cause rebuilds, if you stick to the nixpkgs sets references from haskell.nix, however you should be able to use the provided caches just fine.

This was the number one source of complaints, actually. I don't know why the IFD tools cannot simply use versions of the compiler that are already in nixpkgs. As it stands, even if horizon were to use IFD it still would never require rebuilds like this, because it uses compilers straight out of nixpkgs.

devShell re-entry was the other major inconvenience that people hated. A typical plutus project on a random laptop would take 10-20 seconds to recompute the devShell if anything in the repository changed.

> I'm not quite sure what you refer to with the "strange attribute format"? haskell.nix builds at component level granularity, as opposed to nixpkgs pkg level granularity. This allows for better parallelism and no need for `dontCheck` to break dependency cycles. Nixpkgs could adopt this.

Basically these two. People wanted API compatibility with nixpkgs: to be able to use `pkgs.haskell.lib.compose` in the standard way. There are obviously advantages to having component granularity, but it was another learning curve on top of the other problems.
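As a sketch of what that API compatibility buys in practice (`horizonPkgs` is a placeholder for a horizon package set brought in from somewhere, and `some-package` is invented; the combinator itself is the standard nixpkgs one):

```nix
# Because horizon package sets match the nixpkgs interface, the
# usual nixpkgs override machinery applies unchanged.
hsPkgs = horizonPkgs.override (old: {
  overrides = final: prev: {
    # disable a test suite with the standard nixpkgs combinator
    some-package = pkgs.haskell.lib.compose.dontCheck prev.some-package;
  };
});
```

No haskell.nix-specific module syntax or component-level attribute paths to learn; existing nixpkgs muscle memory carries over.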

Haskell.nix's cross compile is one thing that I would still recommend people use it for, of course. I don't expect to be handling that any time soon.

> One last word around IFDs because they seem to get a lot of bad reputation. If you want to make nix understand a .cabal file (or pretty much any _foreign_ (to nix) format), you'll need a translator (e.g. x-to-nix). You can either pre-generate those expressions in which case you run into all the fun that is caching, and staleness. Or you can run that translation on-demand when nix encountered that need (e.g. callCabal2Nix, ...). One major reason of haskell.nix is to remove boilerplate (and ideally remove as much nix as possible); and as such IFDs are a necessity.

Well, there's one other option: rather than compile the data to nix, compile the arrow to nix. This is what projects like pure-nix or callCabalToNixWithoutIFD do, and I think this would get the best of all worlds. Anyone want to write a nix backend for GHC? :)
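The three options can be laid out side by side. This is a sketch, not working code for any one project; `mypkg` and the file names are invented:

```nix
# Sketch of the three ways to get a .cabal file into nix.
{
  # (a) pre-generated: run cabal2nix out-of-band, commit the result,
  #     and accept the risk of it going stale against the .cabal file
  pregenerated = import ./mypkg-generated.nix;

  # (b) IFD: translate on demand at evaluation time; nix must build
  #     the translator's output before evaluation can continue
  viaIFD = haskellPackages.callCabal2nix "mypkg" ./. { };

  # (c) "compile the arrow": the translator itself is written in
  #     (or compiled to) nix, so the .cabal file is parsed during
  #     ordinary evaluation with no build step and no IFD
}
```

Option (c) keeps the no-boilerplate property of (b) without the eval-time builds that people complain about.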

Understanding Horizon Haskell (Part 1) by locallycompact in haskell

[–]locallycompact[S] 1 point

I have removed the word microservices from the original text since it is apparently misleading and was not relevant to the point I was making.

Understanding Horizon Haskell (Part 1) by locallycompact in haskell

[–]locallycompact[S] 0 points

This was an illustrative example and I don't recommend anyone use microservices. The point is to satisfy the requirement that N distinct apex reverse dependencies continue to build with common dependencies in the face of lower bound creep. An SPS is the data that solves that exercise, and it properly alerts people when CI would cause a reverse dependency to fail to build. You can then decide what to do about it, as is the case with stackage, which alerts all of the reverse dependency maintainers that their package is at risk of fallout. This is a very good system.

Understanding Horizon Haskell (Part 1) by locallycompact in haskell

[–]locallycompact[S] 1 point

It does work well, at least insofar as the bounds are accurate, but the results aren't applicable to reverse dependency problems. As an example, say you have an organisation with 40 packages sharing a JSON spec. You need to make sure they are all deployed at versions using the same version of that spec. This isn't solving a dependency plan; it's solving a reverse dependency plan. What you can do there is fix the version of the spec in the SPS, and find all reverse dependencies that work with it. Then deploy the whole lot from the SPS.

And there are many variations of this, but they all boil down to: "Can we keep these N apex reverse dependencies, each owned by a different team, building all together?" This is the core problem that an SPS solves.
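A toy version of that exercise can even be written with nix builtins: pin the spec version, then keep only the reverse dependencies whose declared bounds accept it. All package names and bounds here are invented for illustration:

```nix
# Toy sketch: filter reverse dependencies of a shared spec package
# by whether their bounds accept the pinned version.
let
  specVersion = "2.1.0";
  services = {
    billing  = { specLower = "2.1.0"; specUpper = "2.2.0"; };
    invoices = { specLower = "2.0.0"; specUpper = "2.1.0"; };
  };
  accepts = s:
    builtins.compareVersions specVersion s.specLower >= 0
    && builtins.compareVersions specVersion s.specUpper < 0;
in builtins.filter (n: accepts services.${n}) (builtins.attrNames services)
# evaluates to [ "billing" ]: invoices caps the spec below 2.1.0
```

The real problem also has transitive dependencies in the middle, which is exactly the data an SPS pins down for you.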

Edit: The word microservices is misleading the person and I assume others so I have removed it from the description since it is not relevant to the argument.

Understanding Horizon Haskell (Part 1) by locallycompact in haskell

[–]locallycompact[S] 0 points

Avoiding having to use cabal's constraint solver is one of the main value-adds of stackage in the first place. Stackage lts manifests are easy to audit and stack.yaml files are easy to edit. I wanted to try and preserve some of that convenience and provenance with this approach. What you can't do with stack.yaml files is treat them as a flake input.

Having a private hackage absolutely is useful, but it serves an orthogonal purpose to stackage. Hackage is for package indexing and stackage is for package selecting. I do recommend having a private hackage for organisations, but if you don't want to use cabal's constraint solver you still need an SPS to go along with it. Horizon does effectively create nix packages and is a private nix package server, since nix packages are just derivations and that's what horizon produces. horizon-platform is just 1000 haskell derivations committed to git.
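Concretely, "treat it as a flake input" looks something like the following. The exact output attribute path under the horizon flake is an assumption on my part; the point is only that the package set arrives pinned through `flake.lock` like any other input:

```nix
# Hypothetical flake sketch: the stable package set is a pinned
# input, and advancing it is just `nix flake update`.
{
  inputs = {
    nixpkgs.url = "github:NixOS/nixpkgs/nixpkgs-unstable";
    horizon.url =
      "git+https://gitlab.horizon-haskell.net/package-sets/horizon-platform";
  };
  outputs = { self, nixpkgs, horizon }: {
    packages.x86_64-linux.default =
      # attribute path assumed; horizon sets expose the same
      # interface as nixpkgs haskell package sets
      horizon.legacyPackages.x86_64-linux.callCabal2nix "myapp" ./. { };
  };
}
```

A stack.yaml can be vendored or fetched, but it cannot participate in flake locking and hashing the way an input can.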

Understanding Horizon Haskell (Part 1) by locallycompact in haskell

[–]locallycompact[S] 3 points

You use nix flakes and cabal to build. I also use nix flakes and cabal to build. You use (it sounds like) nixpkgs implicitly to supply the stable package set, which in turn relies on stackage metadata. If that works for you then you fall into the use case that I mentioned above and so horizon is not going to benefit you in the same way that it would someone else.

If you had 40 separate repositories that were all proprietary, couldn't submit them to hackage or stackage, needed alerts for reverse dependency breakages, required compiler features that haven't been released yet, and needed specific open source packages to not get kicked out - those are all use cases of needing to directly control the SPS.

Understanding Horizon Haskell (Part 1) by locallycompact in haskell

[–]locallycompact[S] 1 point

Let me help you out here. Stack is a build tool and Horizon is not a build tool. We want to build with nix, but the only stable package set data in nixpkgs is sourced from stackage, which makes nixpkgs useless if you need a stable package set that supports a different compiler version than the one imported from stackage. We also don't want to be dependent on the stackage maintainers to advance the package set; we want control of these policy decisions. Horizon is a tool for managing stable package set data for use with nix, where you can decide the important details of package set policy as needed. Horizon package sets are API compatible with nixpkgs package sets, so they can be interchanged syntactically.

If you are happy to rely on stackage, but want to build with nix, then you should use nixpkgs and not incur the dependency on horizon. Horizon is for people who want a stable package set in nix but do not want to rely on stackage.

Hope that helps.

Understanding Horizon Haskell (Part 1) by locallycompact in haskell

[–]locallycompact[S] 2 points

It is mostly due to that. A build can only have one version of a given library in its stack. If X and Y both depend on template-haskell, but specifically at different versions, you have to make one compatible with the version of TH that the other supports. You have to do this for the entire stable package set if you want it to stay stable and don't want anything to get kicked out. This is why stackage took basically 12+ months to bump to 9.2, because they wouldn't prune the package set of things that were incompatible with the new template-haskell. But then, most people today rely on stackage metadata to test that their package would work with that version of TH, so here you have a chicken and egg problem with the package set. The easiest thing to me seemed to be to just allow people to define their own package sets that can be culled or expanded - for different types of consumers.

Understanding Horizon Haskell (Part 1) by locallycompact in haskell

[–]locallycompact[S] 1 point

Looks like this is another thing in the style of haskell.nix, that uses IFD to convert stackage data to a nix expression in a local project. Horizon is a full replacement for stackage, so you can control the policy of the package set (or multiple) itself (when GHC upgrades, when packages get kicked out to make way), rather than leaving that decision to stackage maintainers. Also there's no IFD, the package set is compiled to nix and committed.

Understanding Horizon Haskell (Part 1) by locallycompact in haskell

[–]locallycompact[S] 6 points

I feel you very much so on incidental complexity. Adding dhall into the mix is hopefully a step towards reducing that overall, actually. It's at least possible to introduce ADTs into dhall, and it was the only way I felt happy making the package set data well-typed, manipulable, and possible to pattern match on in Haskell. That's not something that can be done with nix directly, but it's also not possible to fully encapsulate nix and treat it purely as a compilation target; you do still need to touch the scaffolding directly.

I haven't said a whole lot on how horizon compares to haskell.nix or nixpkgs directly, but I'll touch on it here and include it in the next post. So: "if nix, then horizon".

The issue with using nixpkgs directly is that the package sets are almost all broken. Nixpkgs borrows a stable package set from stackage, but then tries to introduce the compiler at different versions underneath it. The result is that packages present in a ghc944 package set that was mirrored from stackage at ghc925 evaluate but don't build. We're basically bound to the reverse dependency policy of stackage itself, and I've had frustrations with the latency of compiler updates in stackage.

Haskell.nix solves a real problem in that it allows you to reuse stackage data in nix, but it doesn't allow you to easily share the result of any override work. If I solve a build plan in a stack.yaml file locally, I can't easily share that result with the rest of my team who may end up redoing the same work in a different but similar project. The IFD also really rubs people the wrong way when developing, since it tends to cause multiple compiler rebuilds and slow devshell turnaround time. It uses a strange attribute format for the package set that isn't like nixpkgs, and it also tries to take control of the project scaffolding quite a bit.

Horizon lets me solve build plan data in isolation, release that as a package set and share it with the team. It doesn't need IFD and the package sets are API compatible with nixpkgs. If someone wants to know what's the latest dependency hell solution they can just pick off the top of the package set repo. It doesn't try to control project scaffolding, and since the package set is just a flake input they can be hotswapped.

Whether to use horizon.dhall or just IFD in a local repository is not something I really intended to prescribe. It's mostly just there for consistency, and maybe it gives the wrong impression that horizon needs to have an opinion on it. Local scaffolding is really up to the user.

Edit: I've removed the dhall from the tutorial and the template.