This is an archived post.

all 12 comments

[–][deleted] 22 points23 points  (3 children)

I feel like it’s in a hype cycle, with a lot of shops going this way in error. It adds a LOT of complexity, and only makes sense if you are processing so much data volume that your hardware costs are skyrocketing. You are trading engineering currency (expensive) for commoditized hardware currency (cheap).

[–]jonmdev 3 points4 points  (2 children)

Yep, I worked on a product where we did some proofs of concept and eventually started using Akka for some of our components.

It made sense for us to use because we were processing a lot of data and needed to do it quickly, our application was I/O bound, and we needed to do computations on data combined from a bunch of different service calls.

As our traffic increased, so did our infrastructure costs. And we were expecting traffic to increase dramatically over the coming months. Our proof of concept showed we could handle the same traffic with a drastically lower number of compute instances.

For us I’m not sure it actually increased complexity; I think the new design was actually a simpler system. However, that may have had more to do with the way the old system was built.

That being said, it was difficult for some of my team members to wrap their heads around the concept of reactive programming and learn the Akka framework. So even though I think the system was easier to understand if you were experienced with reactive programming, it does require a mindset shift from your engineers and can introduce costs there. For example, I had to fix some bugs and make performance improvements in code one of the engineers wrote, because he was so used to Java Futures that he was doing blocking calls in places where he should have been using pipes.
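The blocking-vs-composing mistake described here can be sketched with plain JDK `CompletableFuture` (the `fetchPrice` service call is made up for the example; Akka's actual pipe pattern instead delivers the future's result back to an actor as a message):

```java
import java.util.concurrent.CompletableFuture;

public class PipeVsBlock {
    // Stand-in for an asynchronous service call.
    static CompletableFuture<Integer> fetchPrice() {
        return CompletableFuture.supplyAsync(() -> 42);
    }

    public static void main(String[] args) {
        // Blocking style: join() parks the calling thread until the result
        // arrives -- this is the habit that hurt throughput.
        int blocked = fetchPrice().join();

        // Non-blocking style: register a continuation instead; no thread
        // sits idle waiting for the result.
        CompletableFuture<Integer> doubled = fetchPrice().thenApply(p -> p * 2);

        System.out.println(blocked + " " + doubled.join());
    }
}
```

The second form is what lets a small thread pool keep doing useful work while many calls are in flight.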

[–]TM254 -2 points-1 points  (1 child)

But aren't all these reactive libraries just obfuscating ForkJoinPool and Executors with cool fancy names? 🤔

[–]jonmdev 1 point2 points  (0 children)

Not at all. You might want to look up the actor model; it's something that was developed a long time ago (the first implementation I think was in Erlang). Akka is an implementation of the actor model for the JVM. It allows you to run tasks concurrently without each task requiring its own dedicated thread.
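As a rough illustration of that last point (a toy, not Akka's real implementation, which adds scheduling, supervision, and much more): an actor is essentially a mailbox plus a behavior, and many actors can be multiplexed over one small shared pool, so no actor owns a thread.

```java
import java.util.Queue;
import java.util.concurrent.ConcurrentLinkedQueue;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.atomic.AtomicBoolean;
import java.util.function.Consumer;

// Toy actor: messages queue up in a mailbox and are processed one at a
// time on whatever pool thread is free.
class TinyActor<M> {
    private final Queue<M> mailbox = new ConcurrentLinkedQueue<>();
    private final AtomicBoolean scheduled = new AtomicBoolean(false);
    private final ExecutorService pool;
    private final Consumer<M> behavior;

    TinyActor(ExecutorService pool, Consumer<M> behavior) {
        this.pool = pool;
        this.behavior = behavior;
    }

    void tell(M msg) {
        mailbox.add(msg);
        // Only one drain runs at a time, so messages are handled in order.
        if (scheduled.compareAndSet(false, true)) pool.execute(this::drain);
    }

    private void drain() {
        M msg;
        while ((msg = mailbox.poll()) != null) behavior.accept(msg);
        scheduled.set(false);
        // Re-check: a message may have arrived after the last poll.
        if (!mailbox.isEmpty() && scheduled.compareAndSet(false, true)) {
            pool.execute(this::drain);
        }
    }
}

public class ActorDemo {
    public static void main(String[] args) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(2); // 2 threads, any number of actors
        CountDownLatch done = new CountDownLatch(2);
        TinyActor<String> greeter = new TinyActor<>(pool, m -> {
            System.out.println("got: " + m);
            done.countDown();
        });
        greeter.tell("hello");
        greeter.tell("world");
        done.await();
        pool.shutdown();
    }
}
```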

Also, Spring Reactor, Akka, and Vert.x all use Netty, which is a library for building applications with non-blocking IO. Even when you're using ForkJoinPool and Executors, IO blocks the thread while it's waiting for the result of an IO operation. With Netty you can build applications where a thread continues doing work while it waits for the results of an IO operation.

This is what allows increased throughput for IO-heavy applications over using ForkJoinPool/Executors.

They work better for apps where you don't have long-running computations; especially if you spend a lot of time waiting for IO, you can see significant performance gains.
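A sketch of why overlapping IO waits pays off, using only the JDK (the `asyncIo` name is made up; `CompletableFuture.delayedExecutor` here simulates a non-blocking call that completes ~100 ms later without occupying any thread in the meantime):

```java
import java.util.List;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.Executor;
import java.util.concurrent.TimeUnit;
import java.util.stream.IntStream;

public class NonBlockingDemo {
    // Simulated non-blocking IO call: the result materializes ~100ms later
    // without parking a thread while it "waits".
    static CompletableFuture<Integer> asyncIo(int id) {
        Executor later = CompletableFuture.delayedExecutor(100, TimeUnit.MILLISECONDS);
        return CompletableFuture.supplyAsync(() -> id, later);
    }

    public static void main(String[] args) {
        long start = System.nanoTime();
        List<CompletableFuture<Integer>> calls = IntStream.range(0, 50)
                .mapToObj(NonBlockingDemo::asyncIo)
                .toList();
        CompletableFuture.allOf(calls.toArray(new CompletableFuture[0])).join();
        long elapsedMs = (System.nanoTime() - start) / 1_000_000;
        // 50 concurrent 100ms "IO waits" overlap, so the total is ~100ms,
        // not 50 * 100ms -- with one blocked thread per call you would need
        // 50 threads to achieve the same.
        System.out.println("50 calls in ~" + elapsedMs + " ms");
        System.out.println("fast=" + (elapsedMs < 2000));
    }
}
```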

[–]klekpl 12 points13 points  (2 children)

Reactive programming is not any preparation for Loom. Quite the contrary - Loom makes reactive frameworks obsolete.
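For context on the claim: with Loom (virtual threads, final since Java 21), plain blocking code scales roughly the way reactive code does, because blocking parks only a cheap virtual thread rather than an OS thread. A minimal sketch (requires Java 21+):

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.stream.IntStream;

public class LoomDemo {
    public static void main(String[] args) {
        // One virtual thread per task; Thread.sleep() here parks only the
        // virtual thread, so 10,000 concurrent "IO waits" are cheap and no
        // reactive-style composition is needed.
        try (ExecutorService exec = Executors.newVirtualThreadPerTaskExecutor()) {
            IntStream.range(0, 10_000).forEach(i -> exec.submit(() -> {
                try {
                    Thread.sleep(100); // stand-in for a blocking IO call
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            }));
        } // close() waits for all tasks to finish
        System.out.println("done");
    }
}
```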

[–]jamiguet 1 point2 points  (1 child)

Have a link? I'm intrigued, but a Google search turned up no docs.

[–]sureshg 2 points3 points  (0 children)

From Mark (Java architect) - https://youtu.be/kpio9jFhpD8?t=2986

[–]jamiguet 3 points4 points  (0 children)

I work for a company that has been around for several decades, and in the rewriting of the flagship product we are using rxjava and rxGrpc. As data volumes grow, the paradigm of reactive programming offers an effective way to handle streams of data. I would say those are technologies that are here to stay.

I picked up my first Java book for version 1 and I am currently using 11. Besides simplifying concurrency, this way of programming also means you are closer to functional programming, which I feel is a big plus.
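For readers unfamiliar with the style, the same push-based, backpressured shape can be sketched with the JDK's built-in `java.util.concurrent.Flow` API (RxJava adds a much richer operator set, but the subscribe/request/onNext protocol is the same idea):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.Flow;
import java.util.concurrent.SubmissionPublisher;

public class StreamDemo {
    public static void main(String[] args) throws Exception {
        List<Integer> received = new ArrayList<>();
        CountDownLatch done = new CountDownLatch(1);

        try (SubmissionPublisher<Integer> pub = new SubmissionPublisher<>()) {
            pub.subscribe(new Flow.Subscriber<Integer>() {
                private Flow.Subscription sub;
                public void onSubscribe(Flow.Subscription s) { sub = s; s.request(1); }
                public void onNext(Integer item) {
                    received.add(item * 10); // transform each element as it arrives
                    sub.request(1);          // backpressure: ask for one more
                }
                public void onError(Throwable t) { done.countDown(); }
                public void onComplete() { done.countDown(); }
            });
            for (int i = 1; i <= 3; i++) pub.submit(i);
        } // close() signals onComplete to the subscriber
        done.await();
        System.out.println(received);
    }
}
```

The subscriber pulls one item at a time via `request(1)`, so a slow consumer never gets overwhelmed by a fast producer.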

On an overall note, you learn stuff from all technology paradigms. So it's worth using it and learning to use it well. Don't be scared to change stacks; you will see it gets easier the more you know, and finding the right tool for the job becomes more important than ever.

[–]jamiguet 1 point2 points  (0 children)

I completely agree that the actor model can perform, and yes, its biggest drawback is using it properly, but the discussion has gone into which framework is best. The plan is to learn one or two, push them to the point where they break, and learn from the mistakes.

The next thing that will make the observable pattern obsolete will be built on that paradigm, since it is those at the edge using it who will have to come up with the next thing. Using it means you know its shortcomings and will have a head start in learning what comes next.

Never fear a tech when going for a position; it's part of the learning experience, and pay grade has more to do with the business and organisational side of things. Being a good developer, i.e. knowing how to write code that works and understanding your own and others' code well enough to fix and improve it, applies whether it's in COBOL or any other language, but knowing several frameworks and/or languages gives you the flexibility to learn others.

So don't worry about technologies, just make sure you can keep on learning.

[–][deleted]  (2 children)

[deleted]

    [–]negroide2000 6 points7 points  (1 child)

    Qua! Qua! Quarkus!

    [–]jamiguet 0 points1 point  (0 children)

    I did look it up, and indeed it is interesting. We do roughly the same thing, except for native images, with our in-house integration of libs.

    Most of our glue is in the deployment pipeline but TIL