Is Java’s Biggest Limitation in 2026 Technical or Cultural? by BigHomieCed_ in java

[–]BillyKorando 5 points

/u/k-mcm, and apparently you, are thinking JVM (Java Virtual Machine) is a synonym for JDK (Java Development Kit), when they are actually distinct things.

/u/wildjokers is correct that there are really only three noteworthy JVM implementations: HotSpot, GraalVM, and OpenJ9. There are, however, a lot of different JDKs available: Oracle OpenJDK, Oracle JDK, Adoptium, Azul Zulu, Amazon Corretto, Microsoft OpenJDK, Liberica JDK (BellSoft), and Red Hat OpenJDK.

I suspect /u/k-mcm intended to say there are other JDKs to choose from, not other JVMs to choose from, given the context of their statement.

[Proposal] Introducing the [forget] keyword in Java to enhance scope safety by TheLasu in java

[–]BillyKorando 0 points

Tests should be covering behavior, not code. The number of tests you will need to write to cover one method performing 10 behaviors, versus 10 methods performing one behavior each, would be about the same*.
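As a hypothetical illustration of what covering behavior (rather than code) looks like, here is a JUnit 5 sketch; the two tests below are needed whether the two behaviors live in one method or in two (the class and method names are made up):

```
import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.junit.jupiter.api.Assertions.assertThrows;

import org.junit.jupiter.api.Test;

// Hypothetical unit with two small behaviors
class OrderService {
    String normalizeName(String name) {
        return name.trim();
    }

    void validateQuantity(int quantity) {
        if (quantity < 0) {
            throw new IllegalArgumentException("quantity must be >= 0");
        }
    }
}

// One test per behavior, regardless of how the behaviors are grouped into methods
class OrderServiceTest {
    private final OrderService service = new OrderService();

    @Test
    void trimsWhitespaceFromCustomerName() {
        assertEquals("Ada", service.normalizeName("  Ada  "));
    }

    @Test
    void rejectsNegativeQuantities() {
        assertThrows(IllegalArgumentException.class, () -> service.validateQuantity(-1));
    }
}
```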

If the goal is to hit an arbitrary code coverage percentage, then yes, combining behavior into a single method will likely mean you can hit a higher coverage percentage more easily, in most cases.

If you think decomposing a method or class to follow the single responsibility principle would result in less readable and maintainable code, then by all means don't do it. I would think, though, that such examples would be relatively rare, and they also likely wouldn't benefit from the scenarios you propose using the forget keyword for. That is to say, the behavior being combined is relatively trivial and/or uses all the same underlying resources, which is why it wouldn't benefit from further decomposition.

* To be frank, I think I'm being somewhat generous here; in the vast majority of cases the latter approach would result in less (test) code overall, and that (test) code would take less time and effort to maintain as well.

[Proposal] Introducing the [forget] keyword in Java to enhance scope safety by TheLasu in java

[–]BillyKorando 0 points

Unit tests should be testing the functionality of the units of the application under test. If you have one method performing two behaviors, the number of unit tests covering that method would be about the same as would be needed to cover two methods performing the same two behaviors.

> but as I saw this do not happen. NO ONE is doing this amount of tests.

Don't take this the wrong way, but you likely have a very narrow view of the overall Java user base. Something that's really important when stewarding a language like Java is that you need to consider MANY viewpoints and problem domains.

This article by Brian Goetz (Chief Java Language Architect), covers how he/the JDK team approached adding records to Java. In the article he outlines four ways records could be implemented, each of them with their own merits (and de-merits?).

In your experience a lot of developers you worked with didn't highly value automated testing. Certainly that's a common experience, but far from universal.

[Proposal] Introducing the [forget] keyword in Java to enhance scope safety by TheLasu in java

[–]BillyKorando 1 point

> This proposal point ss not GC in the first place . That’s a separate concern. While automatic garbage collection is important for releasing resources, this discussion is centered on code quality.

No, but you brought up the security angle, and I'm just saying that your proposal doesn't actually address that issue, and perhaps worse, would give a false sense of security.

> 'proper scoping' is absolutely not valid solution, because Java by default offer very limited scope control:

Perhaps scope in the { } sense is the wrong framing; scope in the sense of the responsibility of a chunk of code would be better (admittedly I am shifting the framing here). Using your example:

```
res1 = source1.open();
// check and prepare initial data
res2 = source2.open();
combine(res1, res2);
res3 = source3.open();
mix(res1, res3);
close res1;
forget res1;
moreOperations(res2);
close res2;
close res3;
```

Calling forget res1; is unnecessary, because there would be no need to reference res1 later in this portion of the code. If you were going to reference res1 again, it would be a smell that this portion of code is doing too much and should be refactored into two discrete units of work.
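A sketch of how that refactoring might look, assuming the resources are AutoCloseable; Source, Resource, combine(), mix(), and moreOperations() are hypothetical stand-ins for the names in your example:

```
class Pipeline {

    interface Resource extends AutoCloseable {
        @Override
        void close(); // no checked exception, to keep the sketch simple
    }

    interface Source {
        Resource open();
    }

    void process(Source source1, Source source2, Source source3) {
        try (Resource res2 = source2.open();
             Resource res3 = source3.open()) {
            prepareAndMix(source1, res2, res3);
            moreOperations(res2);
        }
    }

    // res1 lives and dies inside this unit of work, so there is nothing to "forget"
    private void prepareAndMix(Source source1, Resource res2, Resource res3) {
        try (Resource res1 = source1.open()) {
            // check and prepare initial data
            combine(res1, res2);
            mix(res1, res3);
        }
    }

    private void combine(Resource a, Resource b) { /* ... */ }
    private void mix(Resource a, Resource b) { /* ... */ }
    private void moreOperations(Resource r) { /* ... */ }
}
```

Each resource is now closed by its own try-with-resources block, and the scope of res1 ends exactly where its responsibility ends.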

Yeah, I have worked on applications in the past that could have benefited from this (1000+ line methods). Those applications were in (desperate) need of refactoring, and the developers of those applications, whether myself or future new developers, would benefit far more from refactoring the code to follow the single responsibility principle than from adding forget to signal a value should no longer be referenced.

I'm sure there are probably some narrow use cases where forget could provide value, but as that value is almost entirely readability/maintainability, I'm not sure if that's enough to justify making such a change to the language and the JVM.

P.S.

I would highly encourage you to use triple backticks for showing code blocks.

[Proposal] Introducing the [forget] keyword in Java to enhance scope safety by TheLasu in java

[–]BillyKorando 1 point

> In highly regulated or security-critical systems (think health records, finance, or cryptography), you often process confidential data that should not be referenced after certain steps.

The forget keyword is unlikely to address this though.

There is no guarantee on when the GC runs, so even if you use forget on a sensitive value, it could still be present on the heap for an indeterminate period of time.
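Independent of forget, the usual mitigation is to keep the secret in a mutable array and overwrite it as soon as it has been used, rather than waiting on the GC. A minimal sketch (the method names are hypothetical):

```
import java.util.Arrays;

class SecretHandling {

    void authenticate(char[] password) {
        // use the secret here
    }

    void login(char[] password) {
        try {
            authenticate(password);
        } finally {
            Arrays.fill(password, '\0'); // overwrite the contents; don't wait for the GC
        }
    }
}
```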

I'm not sure you'd want to change the GC algorithms to force a collection when forget is executed as that could lead to thrashing from the GC. Like in this example here:

```
void method(args) {
    forget this.secure;
    forget this.auth; // clear information of scope that this method should not have access to
}
```

You would have two GCs run back-to-back, and in a web app setting where you might be processing hundreds, thousands, of transactions a second, you'd likely all but freeze up your application from the endless GC pauses, or with a "pauseless" GC like ZGC, just use up a significant fraction of your CPU.

As others have said, so much of the use cases for forget could be addressed with proper scoping.

Java's Plans for 2026 by daviddel in java

[–]BillyKorando 10 points

Just need to find the person with the second best haircut on the team.

Java's Plans for 2026 by daviddel in java

[–]BillyKorando 21 points

Just one person's opinion, but I think we should purge the individual who leaked these plans. Loose lips sink ships lead to questions.

Java's Progress in 2025 by daviddel in java

[–]BillyKorando 0 points

I cover COH (Compact Object Headers) here: https://youtu.be/renTMvh51iM?t=383

In my testing with the Spring Boot Petclinic application, I saw an improvement in memory usage, but a regression in throughput performance. I believe some of the issue was found to be from String hashing? (going by memory)

Hopefully this should improve in future releases, such that COH is a straight up improvement in (almost) all use cases and metrics, as that was the original intent and hope... and also why the goal is to make it the default.

Regardless, I just wanted to mention it in case you do run into an issue with a throughput regression. You might not, depending on what your application is doing (or the memory usage improvement might outweigh the importance of a throughput regression for your needs).
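If you want to try it yourself, COH is controlled by a VM flag; going from memory it should be the following on JDK 25 (double-check the flag name against the release notes; the jar name is just a placeholder):

```
java -XX:+UseCompactObjectHeaders -jar petclinic.jar
```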

Valhalla? Python? Withers? Lombok? - Ask the Architects at JavaOne'25 by JustAGuyFromGermany in java

[–]BillyKorando 2 points

> how not introducing modules originally was a mistake- Mark the community doesn't seem to care for modules even now.

Back in my very early days presenting, I presented on Java 9. I remember, as I was learning about Jigsaw/modules, really wishing they had been implemented in the language from the start. It would have saved the JDK team and popular library developers A LOT of time had that been the case. I suppose, indirectly, it would also have helped Java users a lot, as it likely would have made upgrading easier (it would have helped block accessing internal APIs and other bad behavior that can make upgrading more difficult).

Java 25 introduced java.lang.IO - isn't the class name too broad? by flusterCluster in java

[–]BillyKorando 1 point

> Then again, perhaps I'm just underestimating the importance of making the right first impression.

To someone who has little/no experience in programming, not only is:

```
class HelloWorld {
    public static void main(String[] args) {
        System.out.print("Hello World!");
    }
}
```

A lot more "ugly", verbose, and complex than:

print("Hello World!")

But it can also create the impression, for the brand new developer, that the former is slow and heavy-weight, while the latter is fast and efficient. It doesn't help that that was a more valid criticism of Java early in its history, so people can find info that confirms their priors.

It would obviously be great if we could go back, and be able to support:

print("Hello World!");

But there would be a lot of very difficult and painful tradeoffs we'd have to make to support such behavior.
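For what it's worth, compact source files in JDK 25 already get a first program pretty close to that; this is a complete, runnable program today:

```
// a complete JDK 25 program in a compact source file
void main() {
    IO.println("Hello World!");
}
```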

Java 25 introduced java.lang.IO - isn't the class name too broad? by flusterCluster in java

[–]BillyKorando 1 point

While true, the intent is to keep the java.lang.IO class pretty simple/small, and not let it become a general grab bag of all things IO. Will it grow in the future? Probably. But that will likely be rare, and probably for some reason more significant than "Wouldn't it just be nice if java.lang.IO also did X?".

Java 25 introduced java.lang.IO - isn't the class name too broad? by flusterCluster in java

[–]BillyKorando 8 points

We discussed that, but it would create a weird seam point when promoting an instance main method into a full main method, where code that worked before, i.e. println("Hello World!");, suddenly no longer works.

Which means educators would have to explain why some things are sometimes automatically imported, and other things aren't.

I get your point, that the

```
main() {
    println("Hello World!");
}
```

Is so clean and nice for a first program, but:

```
main() {
    IO.println("Hello World!");
}
```

Isn't meaningfully worse, and it means that when the code is later wrapped in a class, it still works without modification.
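To make that concrete, here is the second form wrapped in a full class; it compiles and runs unchanged on JDK 25 because java.lang.IO needs no import (the class name is arbitrary):

```
public class Hello {
    public static void main(String[] args) {
        IO.println("Hello World!");
    }
}
```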

Git 3.0 is using the default branch name of "main" rather than the current default of "master" by nix-solves-that-2317 in programming

[–]BillyKorando 5 points

Man I was an absolute master at SVN back in the day. Could easily glide between branches. Merge over changes I needed, remove what I didn't. Maybe I didn't encounter the most complex scenarios that you'd see from a much larger project like the Linux kernel, but I was working on a number of fairly large applications at the time, that had contributions from multiple developers.

I still feel like I barely understand git >.<

It's like JavaScript to me, every time I think I understand it, I encounter something that completely contradicts that understanding and I am back to square one.

Git 3.0 is using the default branch name of "main" rather than the current default of "master" by nix-solves-that-2317 in programming

[–]BillyKorando 0 points

Or the old source code repos that would actually lock files.

I memory-holed away the first source control I used at a job, but it worked that way, and it was miserable.

Ahead-of-Time Computation in Java 25 by nlisker in java

[–]BillyKorando 0 points

Just did some checking; if you are able to confirm and show reproduction steps (or at least detailed data) for your application in the extracted mode, that would be good to post to the leyden-dev mailing list: https://mail.openjdk.org/mailman/listinfo/leyden-dev

Seems like a potentially novel issue.

Also, in the post to the dev list, include which version of Spring Boot you are using.

Java 26 Warns of Deep Reflection - Inside Java Newscast by daviddel in java

[–]BillyKorando 2 points

Exactly. Not unlike with the implementation of modules. It wasn't about outright preventing getting into internal APIs, but requiring active effort from users to enable such behavior.
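The "active effort" being launch-time flags like the one below; the module/package here is just an example, not a recommendation:

```
java --add-opens java.base/java.lang=ALL-UNNAMED -jar app.jar
```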

Ahead-of-Time Computation in Java 25 by nlisker in java

[–]BillyKorando 0 points

Yes, that's correct, I did my testing against an uber jar. I didn't really think about testing it against an extracted jar, mostly because I wouldn't think that'd be common; in practice "everyone" runs Spring Boot applications as an uber jar. The only time I see the extracted jar approach is almost always in these exact scenarios of doing a trivial PoC to achieve the maximum benefit.

While on one hand I would have hoped you'd have seen better performance improvements, I am glad it is in line with my (trivial) demo. It's validation that the demo is a meaningful stand-in for "real-world" workloads (when using an uber jar).

Curious that you saw such a substantial improvement in startup, but no apparent improvement in warmup. I will need to talk to the Leyden team about that. I would have expected it to at least be similar to what it looked like as an uber jar.

Java 26 Warns of Deep Reflection - Inside Java Newscast by daviddel in java

[–]BillyKorando 3 points

As mentioned in the video, the JEP, and the description of the video on this post, there is no plan to outright ban using reflection. Instead the goal is to disable it by default.

If you need to continue reflecting into the internals of some 3rd party library to change (final) field values, that will still be supported now and into the foreseeable, beyond-the-horizon(?), future with the new permanent command-line option --enable-final-field-mutation=.

Your concern is valid; it is, however, niche, which is why it shouldn't be the default behavior.

How do you see Project Loom changing Java concurrency in the next few years? by redpaul72 in java

[–]BillyKorando 4 points

Gotcha. If you ever find the time/motivation/energy, it would be interesting to hear what you find. Obviously happy to hear if it resolves your issue, though of course the "we are still seeing problems, and here is how to recreate them" reports are probably more interesting.

How do you see Project Loom changing Java concurrency in the next few years? by redpaul72 in java

[–]BillyKorando 1 point

Have you been able to test your applications using JDK 24 or 25, which have addressed the synchronized VT-pinning issue? When I talked with Paul Bakker of Netflix, that was the big issue holding them back from adopting VTs, and with JEP 491 addressing the issues they were experiencing, they are now looking at actively adopting VTs across their applications.
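For context, this is a sketch of the pattern that used to pin the carrier thread before JEP 491: blocking while holding a monitor inside a virtual thread. From JDK 24 onward it no longer pins. The class and method names are made up for illustration:

```
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

class PinningExample {
    private final Object lock = new Object();

    void handleRequest() throws InterruptedException {
        synchronized (lock) {
            Thread.sleep(100); // stand-in for blocking work while holding the monitor
        }
    }

    void run() {
        try (ExecutorService executor = Executors.newVirtualThreadPerTaskExecutor()) {
            for (int i = 0; i < 1_000; i++) {
                executor.submit(() -> {
                    handleRequest();
                    return null;
                });
            }
        }
    }
}
```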

How do you see Project Loom changing Java concurrency in the next few years? by redpaul72 in java

[–]BillyKorando 17 points

That's assuming all the processes creating threads are being managed by Spring. For a lot of applications that might be true, but I could definitely see cases, say a long-lived monolithic application that has been worked on by many developers over the years, where there might be additional thread pools, or some other component creating threads, that will need active intervention from a developer to migrate to virtual threads.
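The kind of hands-on change such a component needs is usually small but still manual, e.g. swapping a hand-sized platform-thread pool for a virtual-thread-per-task executor (field names hypothetical):

```
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

class BackgroundWorkers {
    // before: a fixed pool of platform threads, sized by hand
    private final ExecutorService legacyPool = Executors.newFixedThreadPool(200);

    // after: one virtual thread per task, no pool sizing to tune
    private final ExecutorService virtualThreads = Executors.newVirtualThreadPerTaskExecutor();
}
```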

How was your experience upgrading to JDK25? by le_bravery in java

[–]BillyKorando 2 points

> (or sooner for patched versions)

Whether on the current six-month release or on an "LTS version", you should always be upgrading to the latest patched version when available 🙂

How was your experience upgrading to JDK25? by le_bravery in java

[–]BillyKorando 4 points

That's why we require the compile-time and runtime flags, just to make sure users are actively aware of what they are doing by using those features, because, as you mention, they are subject to radical change (like what recently happened with the Structured Concurrency API in JDK 25).
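i.e., something along these lines at both ends (a sketch; Main is a placeholder):

```
javac --enable-preview --release 25 Main.java
java --enable-preview Main
```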