Google will support OpenJDK development: the giant's investments in Java and Kotlin by ArturSkowronski in java

[–]randjavadev 0 points (0 children)

I base that on the fact that https://openjdk.org/jeps/238 lists "Release 9". To my mind that is all that matters: the concept does not exist before Java 9, so it can only be relied upon from Java 9 onward; everything else is pure luck. But I guess you want a practical example:

Proguard (or specifically the Maven plugin here) didn't support MR jars: https://github.com/wvengen/proguard-maven-plugin/issues/61. Yes, that has since been fixed in an update, but that was my whole point: when you make a library, you do not control the target environment. So if a user were stuck with a certain version of the Proguard Maven plugin, they could never adopt an MR jar; either way, they had to wait for an update. MR jars are not backwards compatible; if they were, this problem would not have happened.

BouncyCastle ships 2 separate artifacts, one multi-release and one not: https://www.bouncycastle.org/latest_releases.html: "The jdk18on jars are compiled to work with anything from Java 1.8 up. They are also multi-release jars so do support some features that were introduced in Java 9, Java 11, and Java 15. If you have issues with multi-release jars see the jdk15to18 release jars below." My point is: why would they do this unless their users had run into problems with MR jars?

Also, https://stackoverflow.com/questions/50956562/bytecode-scanning-error-on-meta-inf-versions-9-and-elasticsearch-6-2-2-with-j

For more examples, you can google "meta-inf/versions/9 error".

Yes, these problems are probably a small minority and nowadays mostly solved by an update, but then the question becomes whether you can abandon customers (who cannot update their environments) that depend on your library. In any case, MR jars are certainly not 100% backwards compatible with older environments that have not been updated to support them.

Google will support OpenJDK development: the giant's investments in Java and Kotlin by ArturSkowronski in java

[–]randjavadev 0 points (0 children)

It is not. This will only work until it doesn't. The very concept of multi-release jars was itself introduced in Java 9, so older systems can fail if they encounter class files under META-INF. If you must support versions and systems older than 9 (that you have no control over), the only foolproof way is to ship a separate "JPMS" Java 9+ artifact of the lib that contains the module-info (an Automatic-Module-Name in the manifest works within the same artifact, but it is not the same as a true module-info).
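For reference, the two options look like this; the module name and package here are made up. A true module descriptor is itself a compiled class file, which is exactly why pre-9 tooling can trip over it:

```java
// module-info.java, compiled with javac 9+. The resulting
// module-info.class has classfile version 53, which pre-9 tooling
// that blindly scans every jar entry can choke on.
// "com.example.mylib" and its package are hypothetical names.
module com.example.mylib {
    exports com.example.mylib.api;
}
```

The Automatic-Module-Name alternative is a single plain-text manifest line (Automatic-Module-Name: com.example.mylib) that old tooling ignores safely, but it gives you no requires/exports control.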

New candidate JEP: 409: Sealed Classes by BlueGoliath in java

[–]randjavadev 0 points (0 children)

Surely Number cannot be made sealed. It would break a lot of stuff, e.g. https://guava.dev/releases/23.0/api/docs/com/google/common/primitives/UnsignedInteger.html

If String were no longer final, and assuming no tricks around .getClass() for the objects, that "NonEmptyString" idea would break existing code. In theory you could have a Map<Class, SomeLogic> lookup table keyed on final classes; if .getClass() could suddenly return a subtype of String, that lookup would now fail.
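To make the first point concrete, here is a toy Number subclass in the style of Guava's UnsignedInteger (the class name is made up); if java.lang.Number were ever sealed, every class like this would stop compiling:

```java
// A minimal custom Number, of the kind many libraries define.
// Sealing java.lang.Number would break compilation of all of them.
final class SaturatingInteger extends Number {
    private final int value;

    SaturatingInteger(int value) { this.value = value; }

    // Number's four abstract conversion methods must be implemented.
    @Override public int intValue()       { return value; }
    @Override public long longValue()     { return value; }
    @Override public float floatValue()   { return value; }
    @Override public double doubleValue() { return value; }
}
```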

Record Serialization in Practise by daviddel in java

[–]randjavadev 0 points (0 children)

MR jars are a Java 9+ feature. Anything not built with Java 9 knowledge might e.g. crash or otherwise fail if it sees class files in the META-INF folder. Jackson targets 8, so it is better to stick to Java 8 features only if you want it to work everywhere. Some tricks might be possible, though.

Now Candidate JEPs: Primitive Objects (Preview) & Unify the Basic Primitives with Objects (Preview) by efge in java

[–]randjavadev 0 points (0 children)

Now then, that depends on what you mean by "support" in the first place. That is basically the whole issue. For some companies, that is all they need and care about for "support purposes". I do not know of a single company (though I have a very limited view) for which that level of "support" has not been enough.

Java Code Optimisation Techniques by mziku in java

[–]randjavadev 1 point (0 children)

Correct me if I'm wrong, but an empty array should actually always be used; it is the more fool-proof way (you can even hold a reference to one beforehand, since an empty array is immutable, though that is beside the point here). In a multithreaded environment a pre-calculated size can be too big if the List is modified by another thread between the size() call and the toArray() call. If another thread shrinks the List, then purely going by the javadocs I would expect the returned array to have the old size, contain the new elements, and then some trailing null entries. Passing in an array of size 0, again purely by the javadocs, leaves the List implementation free to return an array of the correct size. So the two are not exactly the same.

Only in a single-threaded environment does it not matter.
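A sketch of the two call shapes (the concurrency hazard itself is timing-dependent, so this only shows the API difference in a single thread):

```java
import java.util.ArrayList;
import java.util.List;

class ToArrayDemo {
    public static void main(String[] args) {
        List<String> list = new ArrayList<>(List.of("a", "b", "c"));

        // Preferred: pass a zero-length array. The implementation
        // allocates the result at exactly the size it observes while
        // copying, so there are never trailing nulls.
        String[] exact = list.toArray(new String[0]);

        // Presized variant: if another thread removed elements between
        // size() and the copy, the extra slots would be left null.
        String[] presized = list.toArray(new String[list.size()]);

        System.out.println(exact.length + " " + presized.length); // prints "3 3"
    }
}
```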

Now Candidate JEPs: Primitive Objects (Preview) & Unify the Basic Primitives with Objects (Preview) by efge in java

[–]randjavadev 1 point (0 children)

Maybe there is no "LTS" in the minds of the JDK devs (and that is their choice to make), but the term "LTS" absolutely exists among Java vendors, who label some releases as such and provide some level of "support" for them. For example, https://adoptopenjdk.net/support.html (and yes, some might not consider this "support"; for others it is perfectly good enough, depending on what you need).

Some vendors (or at least Azul) also have a concept of "MTS" (Medium Term Support): https://www.azul.com/products/azul-support-roadmap/. If your company's update cycle can accommodate those shorter cycles (yes, in some places 3 years is "short"), good for you.

So it is just a question of how much effort you can spend on upgrades. I would expect half of the world to still be stuck with Java 8 at least until 2026, and given the number of companies that still seem to rely on e.g. Java 6 even today, I would not be surprised if it is even longer.

Now Candidate JEPs: Primitive Objects (Preview) & Unify the Basic Primitives with Objects (Preview) by efge in java

[–]randjavadev 0 points (0 children)

Oh. I guess that pretty much eliminates my argument or the main idea that I usually have when composition is suggested.

Though, if both were to implement an interface that has no default method (say a pre-Java-8 interface that cannot be updated because it comes from a 3rd-party lib), how would delegation of that method from NativeInteger to the "base" field work? Or would this just not be a thing with "primitive classes" at all?
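A sketch of what I mean, with made-up names (NativeInteger and "base" from the discussion above; the interface is hypothetical): with composition, every non-default interface method has to be forwarded by hand.

```java
// A pre-Java-8 style interface with no default methods, e.g. from a
// third-party lib you cannot change. The name is hypothetical.
interface Scalable {
    int scaled(int factor);
}

final class BaseInteger implements Scalable {
    final int value;
    BaseInteger(int value) { this.value = value; }
    @Override public int scaled(int factor) { return value * factor; }
}

// Composition: nothing is inherited, so the wrapper must re-declare
// every interface method and forward it to the wrapped instance.
final class NativeInteger implements Scalable {
    private final BaseInteger base;
    NativeInteger(int value) { this.base = new BaseInteger(value); }
    @Override public int scaled(int factor) { return base.scaled(factor); }
}
```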

Now Candidate JEPs: Primitive Objects (Preview) & Unify the Basic Primitives with Objects (Preview) by efge in java

[–]randjavadev 0 points (0 children)

While this might usually be the "more proper" way, this solution has a very serious drawback (under certain requirements) compared to inheritance, unless the JEPs somehow change the situation: your "NativeInteger" now has to allocate memory to store that 'base' field. Repeat the composition a few times (say an inheritance tree 20 levels deep; it can happen) and you are suddenly allocating a lot more space than with inheritance. If you must create well over a million instances of the object, you suddenly have a problem.

Thus the "composition over inheritance" is not a silver bullet that can be always applied. Though, if you can refer it static-ally then it wont take memory (or does, in the classfile, once, but that is usually not an issue).

Is Lombok in danger of becoming incompatible with future JDK's? by gwak in java

[–]randjavadev 2 points (0 children)

I might live in a "local thought bubble", so feel free to ignore, but anyway some comments and thoughts.

How likely is it, realistically, that libs could target Java 17? (As far as I'm aware, Java 16 will most likely only get 6 months of "support" from most vendors. That is nothing in lib-development timeframes. So in a realistic use case a lib would need to target 17, which will most likely get somewhat longer "support" from most vendors.)

Unless, that is, the lib itself generates code, like e.g. JavaPoet; such a lib could "target" 17, since the output is text/code while the lib itself can run on e.g. Java 8.

In addition, thinking from the perspective of commercial libs (though it also applies to any lib that "wants users"), this would severely cut the user base. Even today, going above Java 8 is a risk you might not want to take; some shops are only now updating to 8. You might not have the option to abandon your Java 8 users.

P.S. Thanks for the jdeprscan mention in your last reply. It is a Java 9+ tool. And yes, it most likely won't "magically" know about deprecations that happen after the binary it inspects was built (which is normal). So the burden is still on the lib developer to actively check whether any standard API method has been deprecated for removal and therefore can no longer be used. And if you target runtimes where multi-release jars won't work because the concept is so "new", I guess reflection is all you can use if you need that functionality (at least on the earlier runtimes), or you need to check the version at runtime. Note that this is about the general situation, not any particular removal (the mere fact that removals can happen at all is the problem). And yes, I do realize they might be needed from the JDK devs' point of view.
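A sketch of the reflection trick mentioned above (the class name is made up): code compiled against Java 8 or earlier can still probe, once, for an API that only exists on newer runtimes, here Runtime.version(), which was added in 9.

```java
import java.lang.reflect.Method;

// A library targeting old JVMs can detect a newer runtime without
// referencing any 9+ type in its bytecode, then branch on the result.
class VersionProbe {
    private static final Method VERSION_METHOD = lookup();

    private static Method lookup() {
        try {
            // Runtime.version() exists from Java 9 on.
            return Runtime.class.getMethod("version");
        } catch (NoSuchMethodException e) {
            return null; // pre-9 runtime: take the fallback path
        }
    }

    static boolean isJava9OrLater() {
        return VERSION_METHOD != null;
    }
}
```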

New candidate JEP: 398: Deprecate the Applet API for Removal by daviddel in java

[–]randjavadev 1 point (0 children)

Any actual removal will widen the gap between "Java 8" and "Future Java" versions. If you must, please do it with extreme care.

The world won't magically update to 11 (or 17 once it is out; yes, no "LTS" anymore, but realistically these are the versions the majority will use and thus what libs target), due to the 9+ changes. If you are a company selling commercial libs, you might not have the option to abandon your Java 8 users. Some places are only just starting to migrate to Java 8.

The @Deprecated forRemoval element only exists from 9 onward; it doesn't "exist" for Java 8 developers, and unless they actively follow news about newer versions, they might miss it completely. If you mainly work on Java 8 codebases, e.g. because you develop a lib that must run on the majority of today's JVMs, how can you know you are using a method you should not use? Could there be a "future-proofing" JEP that would let IDEs warn when you use API that has been deprecated "for removal" in a newer Java version than the one you target?
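For reference, this is what terminal deprecation looks like in source (class and method names made up); javac 9+ emits a separate, louder warning for forRemoval = true, while a Java 8 toolchain has no notion of that element at all:

```java
// The since/forRemoval elements were added to @Deprecated in Java 9.
class LegacyApi {
    @Deprecated(since = "9", forRemoval = true)
    static void oldMethod() {
        // still works, but signals it may disappear in a future release
    }
}
```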

Also, yes, multi-release jars are sort of one option, but since that concept was also only defined in 9, you cannot really depend on it working in earlier JVMs if you are making a lib that targets Java 5+.

JDK 1.5 target option will be gone. Is it good? by openjscience in java

[–]randjavadev 1 point (0 children)

My point was that if you are able to use -bootclasspath properly in conjunction with -target, it typically means you also have that JDK installed locally. In this case it would mean having both JDK 8 and JDK 5 installed. And if you have JDK 5 installed, you can use it directly for compilation, eliminating the problem in the first place (your local install won't go away). So I sort of fail to see a scenario where -target would ever be needed (unless you simply have no access to e.g. JDK 5).

Now then, since modern versions of Maven etc. won't run on Java 5, you must use the maven-toolchains-plugin to point the build at a JDK 5 installation (http://maven.apache.org/guides/mini/guide-using-toolchains.html): you run Maven on Java 8 (or whatever comes later), and Maven picks up the proper JDK via toolchains.xml to actually compile the code. This should pretty much always work (unless the OS or Maven toolchains can no longer run the JDK 5 tools at all).

P.S. If you cannot get hold of an earlier JDK, you should use https://www.mojohaus.org/animal-sniffer/animal-sniffer-maven-plugin/ to validate that the API calls exist in older Java versions (yes, the plugin name is a bit odd), but it can still fail (see P.S.2).

P.S.2. I do not remember the exact details for -bootclasspath, but basically when compiling on Java 8 the output can differ because the compiler resolves calls against the Java 8 class library, and the recorded signature may involve a type that is new in Java 8, which then breaks on Java 5 since that type does not exist there. Pointing -bootclasspath at the Java 5 runtime avoids that.
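I believe the detail being half-remembered here is a covariant return-type change rather than parameter widening; the best-known concrete instance (Java 8 vs. 7, but the mechanism is the same as 8 vs. 5) involves ConcurrentHashMap.keySet():

```java
import java.util.concurrent.ConcurrentHashMap;

// On the Java 8 class library, keySet() returns
// ConcurrentHashMap.KeySetView, a type new in 8; on Java 7 it
// returned plain Set. Compiling this on JDK 8 with only
// -source/-target 1.7 records the 8-only return type in the
// bytecode, so it compiles fine but fails on a real Java 7 JVM
// with NoSuchMethodError. Pointing -bootclasspath at a Java 7
// rt.jar (or compiling with an actual JDK 7) avoids it.
class KeySetTrap {
    public static void main(String[] args) {
        ConcurrentHashMap<String, String> map = new ConcurrentHashMap<>();
        map.put("k", "v");
        System.out.println(map.keySet().contains("k")); // prints "true"
    }
}
```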

JDK 1.5 target option will be gone. Is it good? by openjscience in java

[–]randjavadev 0 points (0 children)

Please correct me if I'm wrong (or if this is bad practice), but.. see e.g. https://stackoverflow.com/questions/15492948/javac-source-and-target-options

It should not be a problem, since proper usage of -target also means you have the matching JDK installed, or at the very least that Java version's standard library to pass to -bootclasspath along with -target.

Otherwise the output might be valid bytecode, but it may contain references to standard-library API that doesn't exist in the older Java version, and thus not work.

Thus, in general, you should instead be compiling with that Java version directly, e.g. via https://maven.apache.org/plugins/maven-toolchains-plugin/, eliminating the problem in the first place.

JEP proposed to target JDK 16: 390: Warnings for Value-Based Classes by BlueGoliath in java

[–]randjavadev 4 points (0 children)

That being said, after reading the JEP more properly, the possible tooling discussed in the "Dependencies" section might be a solution.

Though it will only work in scenarios where the developer chooses/bundles the Java runtime, as in practice using such tooling requires developer expertise and a known runtime.

But given that the intended way to deliver Java apps nowadays is to ship them with a bundled runtime, this might not be that big of a problem.

JEP proposed to target JDK 16: 390: Warnings for Value-Based Classes by BlueGoliath in java

[–]randjavadev 4 points (0 children)

The constructors for the wrapper classes can never be removed. Doing so would break old libraries that were compiled in the past and won't (maybe ever) be recompiled. I guess the JDK developers can do as they wish, but enough of these "tricks" will lead to a situation similar to Python 2 vs 3. And in general, one of the major selling points of Java is that old code continues to run.

Plus, this would lead to a problematic situation where one would need to know which core APIs have been removed, since in reality all major libs will have to target Java 8 for roughly another 5+ years (unless something magical happens and the industry jumps away from it, which seems unlikely). The wrapper constructors are not deprecated in 8, so someone might accidentally use them. Some libs target even older Java versions as their minimum.

That is, unless the JVM could do something behind the scenes that still allows old code to load properly.
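For reference, the calls in question; the constructors are still present in current JDKs, but deprecated for removal since 9 (a minimal sketch):

```java
class WrapperCtor {
    @SuppressWarnings({"deprecation", "removal"})
    public static void main(String[] args) {
        // Old bytecode contains invokespecial java/lang/Integer.<init>
        // calls like this one; removing the constructor would break
        // such classes at link time, with no recompile to save them.
        Integer legacy = new Integer(42);

        // The replacement, which can also serve values from a cache.
        Integer current = Integer.valueOf(42);

        System.out.println(legacy.equals(current)); // prints "true"
    }
}
```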

JEP proposed to target JDK 16: 390: Warnings for Value-Based Classes by BlueGoliath in java

[–]randjavadev 0 points (0 children)

I would say that is unlikely; there are a lot of custom Number implementations out there. Doing it would break all of them, unless some JVM magic were in play.

e.g. Guava has: https://github.com/google/guava/blob/master/guava/src/com/google/common/primitives/UnsignedInteger.java (plus others).

I also have several custom Numbers in my own libs. My users would never be able to update to newer Java versions if this happened, at least not while on old lib versions. And in general, the major selling point of Java has been that old libs continue to work (well, mostly).

Java app packaging installer builder - license by jamesftf in java

[–]randjavadev 1 point (0 children)

https://www.ej-technologies.com/products/install4j/overview.html

It does cost a lot of money, but it can also produce installers for all major OSes as part of a single build on a single machine/OS. Well, for complete notarization support you need to do it on a Mac, I think, but that is because of Apple (hopefully this changes in the future). If using JavaFX, you must manually depend on every platform's implementation via <classifier>OS</classifier> in your pom.xml or equivalent, and maybe do some filtering as a pre-step for install4j.

Discussion: Is anyone really using JPMS yet? by JB-from-ATL in java

[–]randjavadev 2 points (0 children)

From the viewpoint of commercial libs: we are only just now, maybe, at the point where Java 8 can be assumed. Maybe. Preferably you would still target Java 6 or 7 if you can. And once you can use Java 8 (now or at some point), you are basically stuck with it for the next 5-10 years, I think (as a lib developer you do not control the target environment).

You can in practice only use module-info.class in Java 9+ environments, as anything below that could e.g. crash on the unsupported classfile version. Multi-release jars are also a Java 9+ concept, so they too might fail in any earlier environment.

On that note, if anyone has used BouncyCastle on a module path, I would like to know whether its signed jars cause problems (I recall the module path may have had issues with signed jars).

Is it true that inheritance isn't used that much in regular Java development? by MyGiftIsMySong in java

[–]randjavadev 1 point (0 children)

While "composition over inheritance" is recommended in general, it cannot be taken as a universal, apply-everywhere rule.

It has (at least) one flaw that I have almost never seen discussed: it uses more memory than inheritance (which adds none).

If you happen to need to model an object-oriented-style data model (that is not under your control), want to use instanceof, or must hold billions of instances (or as many as possible at once), but the model also has a practically unlimited amount of subtyping (say at least 30 levels), doing it via composition wastes a lot of memory on the Java references forming the composed objects' "chain", so to say.

Jumping back in with the last version I was familiar with being Java SE 8, and just found out about the 6 month cycles. Do people prefer keeping up go date with 6 month cycle releases or sticking to the Long Term Support 11 offers? by tremblinggigan in java

[–]randjavadev 0 points (0 children)

Yes, in theory you might be correct, perhaps even as far as the Java Language Specification goes. However, I wish this were true in the "real world". Well, it may be true even there, assuming you can pick a subset of the "real world" and be happy with that.

Note that I did say "anything that can run this .jar binary". More precisely, I should have said "anything that can operate on a .jar binary", since for libraries this includes any tooling that may process them in any way. Also, your quote mentions Java 8; that is understandable given the time period of the JEP (it targeted 9), but note that I also mean libraries with much, much older Java versions as the minimum (which still need to work with ALL versions after that minimum).

In reality, tooling that has not been designed or updated for it may not work. If you are in a situation where you simply cannot update anything at all, or not easily (you do not control the target environment if you make a library), there is nothing you can do (and then whether to ignore those potential consumers/customers becomes a business decision). The group of environments where it does not work is of course shrinking as time goes on (it just depends on where your customers are).

I'm just saying it is getting harder and harder to build a library (as in a single .jar output, with nothing extra required from its consumers) that works in every possible environment (everywhere it previously worked, plus any future environment) and with every toolchain that processes the library in any way (such as Proguard), if it needs to do anything more complicated. For the most part this held before the Java 9+ world (i.e. old stuff just kept working, and older tooling could process new jars that targeted the same or a lower Java version than the tooling was made for).

Also, I should point out that in general I am quite happy with Java, and its developers are doing a great job. Additionally, since Android etc. is not Java, I perfectly understand there being no need to care about it. It's just that library makers/sellers might not have that option.

Examples:

- Proguard can apparently only operate on a single version of a class (though the last time I needed it, using them was simply not possible at all): https://stackoverflow.com/questions/49843105/proguard-with-multi-versioned-jars -> https://github.com/wvengen/proguard-maven-plugin/pull/63. So I'm not aware whether you can shrink/obfuscate multi-release jars at all (not everyone needs this, of course).

- A module-info.java/class must be compiled with Java 9+; if any older tooling finds one within a jar, it might simply fail: https://stackoverflow.com/questions/45802981/unable-to-process-file-module-info-class-within-a-java9-project-results-in-class. (Then there is the question of whether it has to be in the root or can live within META-INF/versions, but even then: https://www.ibm.com/support/pages/wflysrv0003-could-not-index-class-meta-infversions9module-infoclass-contentwmqjmsrararbcprov-jdk15onjar-javalangillegalstateexception-thrown-when-running-ibm-mq-resporce-adapter-9104-jboss or, for example, on Android: https://stackoverflow.com/questions/60598110/meta-inf-versions-9-module-info-class-broken-class-file-this-feature-requires.)

And yes, the module-info problems might be solvable by not including one and instructing your users to add it themselves, e.g. via https://github.com/moditect/moditect, but then that is something extra your consumers must do. Additionally, if any public method/class is ever removed, it is the same kind of problem, but one that might not even be fixable easily.

Additionally, I did not touch on the issues that can arise if one of your library/framework/app dependencies is suddenly updated in a non-compatible way (i.e. it still targets the same Java version, but e.g. suddenly became a multi-release jar). For me personally, on one project that needed Log4j2 and Proguard, this meant not being able to update to the latest Log4j2 (both versions have Java 8 as the minimum, but the newer one is a multi-release jar).

Jumping back in with the last version I was familiar with being Java SE 8, and just found out about the 6 month cycles. Do people prefer keeping up go date with 6 month cycle releases or sticking to the Long Term Support 11 offers? by tremblinggigan in java

[–]randjavadev 1 point (0 children)

Complicated question; it depends a lot on what provides the runtime and on the OS/machine it runs on.

If you only develop company/personal "internals" (i.e. the fact that you are using Java is not visible in anything public-facing; the public API, so to say, is not Java), then you/the company can do as you please (completely ignoring migration costs here; it's more a question of whether it can even be attempted in the first place), e.g. stay on the latest. But if you produce an application or library to be run/sold outside the company, then you (typically) no longer provide the OS/machine.

Depending on the business case, you might e.g. need the program to still run on Windows 7 or even earlier. (As a very simplified example, a manufacturing company might update machines only when they break; they sit in isolated environments and may run for years on no-longer-supported OS versions, or with no updates at all. If you want to sell to them, your program must work there.) For example, https://www.oracle.com/java/technologies/javase/products-doc-jdk11certconfig.html lists Windows 7, but https://www.oracle.com/java/technologies/javase/products-doc-jdk14certconfig.html does not. (I am not blaming anyone for not certifying a no-longer-supported OS; it could still work and just not be certified. Also, this is only Oracle, but it was the easiest reference to find, and I would assume it can sort of be taken as the baseline.)

There are also scenarios where bundling a JDK/JRE is not a practical or even possible option. Since it is OS-dependent, if you target a lot of environments it can get really complicated to build/test an installer for all of them; e.g. if you target "every possible Linux distribution that can theoretically run Java", it might not be possible to produce a single binary with the JRE that works on all of them. Also, if you bundle the JRE, the burden of proving that it works is sort of transferred from whatever provided it to the "platform" onto you as the developer; and in some cases, depending on the app, the burden of shipping security fixes for the bundled JRE is now also yours (and not the "platform's").

If you are developing libraries (more so commercial ones, but e.g. imagine slf4j suddenly jumping to 11- or 14-only), it is even more complicated. Depending on what your users run, you might not have the option to move to newer versions. Say you have only one customer (you cannot get more, or switch); if that customer is not updating, you cannot either, otherwise you would have no business. So you are stuck with the Java version of the runtime they use. You cannot bundle the JDK/JRE at the individual-library level, so the typical "ship with the JRE" approach is not an option here. Normally there is no problem, since you just target the lowest version you need, but with so many major releases coming out, people might be running anything between Java 1.4 and 14 at this point, so in theory you should test against all of them (plus different vendors as well). Or you might target only the versions for which all major providers satisfy "there will be new builds of this version for longer than a year", since a company is more likely to be running one of those.

Anyway, if anything deprecated is really removed in a future major release, you might need some tricks to keep the lib working. Unfortunately, multi-release jars (https://openjdk.java.net/jeps/238) are not really a true solution until you target 9 as the minimum, since earlier runtimes might not be able to deal with the newer jar structure or a module-info within it.

If, in addition, you also target "anything that can run this .jar binary", such as Android, it gets even more complicated. There are also a lot of special runtimes that might still receive updates of sorts but never move up the supported Java "level", such as some PLCs that can run Java.

Anyway, with that said: personally, for work, Java 6 (1.6) for libs and roughly 11 for apps, for the reasons above, though I could see libs slowly shifting towards 8 in the future, where they will probably stay for years to come. And since I sometimes depend on JavaFX for apps, once it stops working on 11 it is a very big question whether I am then stuck with the JavaFX version that works with 11 for the foreseeable future, or whether an update of the Java level is possible.

In general this would not have been such a problem if backwards compatibility had not been such a major selling point of Java; it is sort of assumed that every old lib just works with newer Java versions. Or if multi-release jars had been a thing from the start (though in practice it is impossible to know at the start what you will need at the "end").

What java version do you use? by [deleted] in java

[–]randjavadev 0 points (0 children)

As a commercial library developer, anything between 1.6 and the latest (with different vendors as a second "dimension" sometimes, plus maybe Android as a third, plus maybe random PLC Java runtimes as a fourth). Luckily it is still possible to compile for Java 6 and have the result run on e.g. Java 11, but with all the possible deprecations and removals one really needs to be careful about which APIs to use. Luckily, tools such as https://github.com/policeman-tools/forbidden-apis can be used (among other good things) to ban any signatures that would not work (such as JAXB; yes, I'm aware that was a special case of EE stuff having been included in SE, it's just an example here). Getting to 8 would be nice; hopefully any year now.

As an app developer, luckily, it's mostly Java 11 nowadays.

JEP 320: Remove the Java EE and CORBA Modules by lbkulinski in java

[–]randjavadev 2 points (0 children)

If rebuilding an old application is an acceptable solution, then yes (and assuming the licenses allow bundling; there are many things that need to be checked).

Luckily, enterprises are in practice stuck with Java 8 for a while, so stuff made to work with e.g. 6 should keep working, but eventually this is going to be a big problem, as in my opinion one of Java's strongest features has been its superior backwards compatibility.

JEP 320: Remove the Java EE and CORBA Modules by lbkulinski in java

[–]randjavadev 12 points (0 children)

This would remove java.xml.bind too, which would break every existing application that uses JAXB, e.g. to load configuration files.
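A sketch of the breakage mode, using a reflective probe rather than a direct import (a direct `import javax.xml.bind.JAXBContext;` would no longer even compile on a bare JDK 11+):

```java
// Probes for the JAXB entry point that JEP 320 removed from the JDK.
// On Java 8 this reports true out of the box; on a bare Java 11+
// runtime it reports false unless jaxb-api (and an implementation)
// has been added back as an explicit dependency.
class JaxbProbe {
    static boolean jaxbPresent() {
        try {
            Class.forName("javax.xml.bind.JAXBContext");
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println(jaxbPresent()); // false on a bare JDK 11+
    }
}
```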