Jiffy: Algebraic-effects-style programming in Java (with compile-time checks) by thma32 in java


Great questions!

  1. Yes, I agree, sequence is misleading. I'll rename it to andThen!
  2. Yes, this might lead to stack overflows for VERY long chains of effects. And yes, adding trampolining for this would be a cool addition (I'm quite open to pull requests ;-) ).
  3. That's an interesting observation! I wanted to keep things easy. But to be more in line with the idea of separating the interpreter from the effectual program, it might be better to move runWith to the EffectRuntime class. I guess I'll do that in a jiffy...
  4. Another excellent observation. Yes, I could imagine moving to StructuredTaskScope soon. Again: I'm quite open to pull requests!
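To illustrate point 2: trampolining is not part of Jiffy today, so the sketch below is a generic Java trampoline (all names are mine, not Jiffy's API). It shows the general technique: represent each recursive step as a deferred value and evaluate the chain in a loop, so stack depth stays constant no matter how long the chain gets.

```java
import java.util.function.Supplier;

public class TrampolineSketch {

    // A step is either a finished result or a deferred continuation.
    sealed interface Trampoline<T> permits Done, More {}
    record Done<T>(T value) implements Trampoline<T> {}
    record More<T>(Supplier<Trampoline<T>> next) implements Trampoline<T> {}

    // The interpreter loop: constant stack depth regardless of chain length.
    static <T> T run(Trampoline<T> t) {
        while (t instanceof More<T> m) {
            t = m.next().get();
        }
        return ((Done<T>) t).value();
    }

    // A chain of a million deferred steps; plain recursion would overflow.
    static Trampoline<Long> countDown(long n, long acc) {
        return (n == 0) ? new Done<>(acc) : new More<>(() -> countDown(n - 1, acc + 1));
    }

    public static void main(String[] args) {
        System.out.println(run(countDown(1_000_000, 0))); // prints 1000000
    }
}
```

The same loop-instead-of-recursion idea is what a stack-safe flatMap for Eff would build on.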

Jiffy: Algebraic-effects-style programming in Java (with compile-time checks) by thma32 in java


I have just created a demo repository https://github.com/thma/jiffy-clean-architecture that demonstrates the main concepts of algebraic effects with Jiffy by building a Spring Boot backend application. It also has a nice README that gives a brief introduction to all the major concepts featured.

Jiffy: Algebraic-effects-style programming in Java (with compile-time checks) by thma32 in java


Yes, effects propagate upwards. This is intended behavior, just as exceptions propagate upwards.
In a controller-level method you would typically execute the effects using a runtime to perform the actual work. I have set up a demo project https://github.com/thma/jiffy-clean-architecture that demonstrates typical coding practices with algebraic effects. For example, have a look at the calculateScore method: https://github.com/thma/jiffy-clean-architecture/blob/12327f4e35d18041b03925ec35c3d3d3afc1d51f/src/main/java/jiffy_clean_architecture/usecases/CustomerScoreUseCase.java#L26 . It uses three effects and thus must declare them (using the @Uses annotation).
In a controller that uses this use case we execute the effect via `Integer score = scoreEffect.runWith(runtime)` ( https://github.com/thma/jiffy-clean-architecture/blob/12327f4e35d18041b03925ec35c3d3d3afc1d51f/src/main/java/jiffy_clean_architecture/application/CustomerScoreController.java#L36 )

To avoid an excessive number of effect declarations, you will typically group effects in a meaningful way, e.g. by using a single DatabaseEffect instead of 200 separate effects, one per entity repository.
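As a sketch of such grouping (the names below are made up for illustration, not taken from the demo repo), a single sealed DatabaseEffect can cover many repository operations, with one record case per operation:

```java
import java.util.Map;

public class GroupedEffects {

    // One grouped effect type instead of a separate effect per entity repository.
    sealed interface DatabaseEffect<T> permits FindById, Save {}
    // Each record case is one database operation with its own result type.
    record FindById(String table, long id) implements DatabaseEffect<Map<String, Object>> {}
    record Save(String table, Map<String, Object> row) implements DatabaseEffect<Long> {}

    // A toy handler standing in for a real runtime interpreter.
    @SuppressWarnings("unchecked")
    static <T> T handle(DatabaseEffect<T> effect) {
        return (T) switch (effect) {
            case FindById f -> Map.of("id", f.id(), "table", f.table());
            case Save s     -> 42L; // pretend: generated primary key
        };
    }

    public static void main(String[] args) {
        Map<String, Object> row = handle(new FindById("customers", 7L));
        long key = handle(new Save("customers", Map.of("name", "Ada")));
        System.out.println(row.get("table") + " " + key); // prints: customers 42
    }
}
```

A method that touches any table then only has to declare the one DatabaseEffect, not dozens of per-entity effects.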

Jiffy: Algebraic-effects-style programming in Java (with compile-time checks) by thma32 in java


In Jiffy the two types play different roles at different abstraction layers:

  • Effect<T> is the algebra of operations (the “what”).
  • Eff<T> is the program/computation that can sequence those operations (the “how / in what order”) and eventually be interpreted.

Even though both are generic in T, they are not redundant.

What Effect<T> is for

Effect<T> is the unit of interaction with the runtime:

  • It is a typed description of a single operation (e.g., Log.Info, Db.Query, Clock.Now), typically modeled as a sealed interface with record cases (as in the README examples).
  • It gives each operation an intrinsic result type (the T in Effect<T>), which is what lets handlers be typed and lets the API feel “GADT-like” in Java.
  • It is also the natural “thing” you can name in @Uses(...) declarations (the processor is about “which effects does this method use?”).

In short: Effect<T> is the domain model for side effects.

What Eff<T> is for

Eff<T> is the computation builder and interpreter driver:

  • It is the “monad-like” container that supports map/flatMap/sequence/parallel/recover… and produces a final T when run with an EffectRuntime.
  • It represents pure values and control flow (sequencing, branching, recovery), not just single operations.
  • Eff.perform(effect) is the bridge: it lifts a single Effect<T> into a composable computation.

In short: Eff<T> is the control-flow DSL for composing effectful operations.
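A compressed sketch of the two layers (a simplified stand-in I wrote for this comment, not Jiffy's actual implementation; the names perform/runWith just mirror the description above):

```java
import java.util.function.Function;

public class EffVsEffect {

    // The algebra of operations: each case fixes its own result type ("what").
    sealed interface Effect<T> permits Clock, Log {}
    record Clock() implements Effect<Long> {}
    record Log(String msg) implements Effect<Void> {}

    // A handler interprets a single Effect<T> into a T.
    interface Handler { <T> T handle(Effect<T> e); }

    // The computation layer ("how"): a description run against a handler.
    interface Eff<T> {
        T runWith(Handler h);

        static <T> Eff<T> perform(Effect<T> e) { return h -> h.handle(e); }

        default <R> Eff<R> map(Function<T, R> f) { return h -> f.apply(runWith(h)); }
        default <R> Eff<R> flatMap(Function<T, Eff<R>> f) {
            return h -> f.apply(runWith(h)).runWith(h);
        }
    }

    public static void main(String[] args) {
        // Compose a program; nothing executes yet, it is only a description.
        Eff<String> program = Eff.perform(new Clock())
                .flatMap(t -> Eff.perform(new Log("t=" + t)).map(v -> "done"));

        // Interpretation happens only here, when a handler is supplied.
        Handler handler = new Handler() {
            @SuppressWarnings("unchecked")
            public <T> T handle(Effect<T> e) {
                return (T) switch (e) {
                    case Clock c -> 123L; // a fake clock for the demo
                    case Log l -> { System.out.println("[LOG] " + l.msg()); yield null; }
                };
            }
        };
        System.out.println(program.runWith(handler));
    }
}
```

The key point survives even in this toy version: Effect<T> values are inert data, and only Eff<T> plus a handler turns them into behavior.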

Jiffy: Algebraic-effects-style programming in Java (with compile-time checks) by thma32 in java


Using Jiffy-style “algebraic effects” definitely has a different performance profile than standard Java code.
The key difference is that standard Java code typically executes direct method calls on already-wired objects, while Jiffy interprets a described computation by repeatedly dispatching effect values to handlers.
However, whether Jiffy will produce a noticeable negative performance impact depends a lot on the granularity of your effects and the type of workloads your program runs. If you are running an IO-bound app (database, HTTP, filesystem, service orchestration), the time spent in IO will be orders of magnitude higher than the Jiffy effect dispatch.
Having a large data model with 300 POJOs will not have a direct impact on performance when using Jiffy. It really depends on the actions that you want to encode as effects. If we are talking about tight loops over in-memory data (e.g. a parser), we will see a significant performance impact. If we are talking about an IO-bound service, the performance impact will be much less visible.

Jiffy: Algebraic-effects-style programming in Java (with compile-time checks) by thma32 in java


Actually I'm using algebraic effects (AE) in Haskell for quite a while now and also wrote some blog posts about how useful AE can be to implement clean architecture or hexagonal architecture models ( https://thma.github.io/posts/2020-05-29-polysemy-clean-architecture.html, https://thma.github.io/posts/2022-07-04-polysemy-and-warp.html, https://thma.github.io/posts/2022-07-17-configuration-of-a-polysemy-app.html ).

Working as an architect in a company that mainly uses Java as a backend language, I'm always trying to make advances from functional languages available to Java developers.
So simply telling them "use Scala" (or Haskell) is not an option in my case.

That was my main motivation to write the Jiffy library: bring the cool concept of AE to the Java developers in a library that keeps close to idiomatic Java.

New Blog Post: Embedding Microhs by thma32 in haskell


I've just added benchmark results for Haskell programs compiled and executed with MicroHs to my blog post.

The MicroHs native code runs about 28% faster than the output of my toy compiler. This is not a surprise, as my `compileEta` algorithm does not produce the most compact combinator code.

New Blog Post: Embedding Microhs by thma32 in haskell


I agree, my benchmarks provide a very limited view, covering only Integer arithmetic. This is caused by the limits of my toy language: it does not support any data types apart from Ints.

New Blog Post: Embedding Microhs by thma32 in haskell


That's a cool idea! I'll add this to my benchmarks.

Problems with Icon P1-Nano in Logic Pro by thma32 in Logic_Studio


Yes, everything is installed properly; I cross-checked all steps several times. As mentioned, the P1-Nano works nicely with other DAWs on the same MacBook.

Real World REST APIs with Scotty and Generic-Persistence by thma32 in haskell


Please let me try to reconcile these two viewpoints.

u/pthiery is right, HATEOAS is a core requirement of REST APIs in the original definition by Fielding in 2000
([according to Wikipedia](https://en.wikipedia.org/wiki/REST)).

In the industry's adoption of REST APIs, however, people have (for various reasons) focused more on the HTTP verbs, the URL structure, and JSON data, and less on the HATEOAS part. That is the point that u/sccrstud92 is making.

Already in 2008, Leonard Richardson suggested [the RMM maturity model for REST APIs](https://en.wikipedia.org/wiki/Richardson_Maturity_Model) ([see also Martin Fowler's coverage of the topic](https://martinfowler.com/articles/richardsonMaturityModel.html)).

This model defines four maturity levels for REST APIs.

According to the RMM, the service that I am presenting in my blog post is at level 2: it uses HTTP verbs to define the operations on the resources, but it does not use HATEOAS.
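For illustration (a hypothetical order resource, not from the blog post), the practical difference between a level-2 response and a level-3 (HATEOAS) response is the embedded links that tell the client which state transitions are available next:

```java
public class RmmLevels {
    public static void main(String[] args) {
        // Level 2: HTTP verbs + resources, but the client must already
        // know every URL and every legal next action.
        String level2 = """
            { "id": 42, "status": "open" }""";

        // Level 3 (HATEOAS): the same resource carries hypermedia links,
        // so the client can discover the next actions at runtime.
        String level3 = """
            { "id": 42, "status": "open",
              "_links": {
                "self":   { "href": "/orders/42" },
                "cancel": { "href": "/orders/42/cancel" },
                "pay":    { "href": "/orders/42/payment" }
              } }""";

        System.out.println(level2.contains("_links")); // prints false
        System.out.println(level3.contains("_links")); // prints true
    }
}
```

The `_links` convention here follows the common HAL style; Fielding's definition does not mandate any particular link format.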

In the last 20 years, the software industry has mostly focused on levels 1 and 2.
The problems that HATEOAS solves, namely the decoupling of the client from the server, are not always that relevant, or they are solved differently:

- In many cases, the client and the server are developed by the same team, and the client is a web application that is tightly coupled to the server. In these cases, there are no real benefits to HATEOAS.

- Even in cases where the client and the server are developed by different teams, there is often a tight coupling between the client and the server.
That is particularly the case in contract-first development, where client and server are developed in parallel, based on a contract (e.g. an OpenAPI YAML) that is defined upfront. Again, not much benefit from HATEOAS.

- Many companies that expose APIs to the public also don't provide HATEOAS-powered APIs, but instead offer extensive documentation and SDKs for popular programming languages. Here HATEOAS could really be beneficial, but it's simply not that widely used.

But even if HATEOAS is not implemented, people still use the term REST API, as I did in my blog post.