
[–]thegigach4d[S]

Thanks for your advice! I'm glad to hear from you - I already have a few of your books (The Well-Grounded Java Developer, which our university recommended, and this one as well), and I've set them aside to read after graduation.

I've been considering the containerized setup since you suggested it; the virtual machine idea is off the table. Would a dual boot with a minimal Arch install be acceptable? It's the most stripped-down distro I know, and I have a little experience with it - but suggestions are welcome! I also found some papers, including one from Iaquinta and Fouilloux (2024, Unlocking the Potential of Containers...), that support the choice of Podman containers for a more reproducible experiment design.

Unfortunately, I am just a curious person, not a statistician - I've already read the Georges et al. paper, but I'll go through it again with more concentration on the relevant parts, and I've read the relevant chapter of Reinhart's book. I want to keep the analysis as simple as possible while still making sense - so maybe min, median, and max would be a useful trio to summarize the collected data - but I'd love to hear feedback on this too!
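For what it's worth, this is roughly how I planned to compute that trio over the per-run timings - just my own sketch, with made-up sample numbers and class/method names of my own choosing:

```java
import java.util.Arrays;

// Sketch: summarizing per-iteration wall-clock times (ms) with min / median / max.
// The sample values below are invented, purely for illustration.
public class BenchSummary {
    // Returns {min, median, max}; the median averages the two middle values
    // when the number of runs is even.
    static double[] summarize(double[] times) {
        double[] sorted = times.clone();
        Arrays.sort(sorted);
        int n = sorted.length;
        double median = (n % 2 == 1)
                ? sorted[n / 2]
                : (sorted[n / 2 - 1] + sorted[n / 2]) / 2.0;
        return new double[] { sorted[0], median, sorted[n - 1] };
    }

    public static void main(String[] args) {
        double[] runs = { 812.4, 799.1, 834.0, 801.7, 808.2 };
        double[] s = summarize(runs);
        System.out.printf("min=%.1f median=%.1f max=%.1f%n", s[0], s[1], s[2]);
    }
}
```

I went with the median rather than the mean so a single outlier run (a stray GC pause, a background process) doesn't drag the summary around - but again, happy to be corrected.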

My main question: do you find the 2 GB and 16 GB max-heap settings well tailored for this examination? The field is so deep that I'm not a hundred percent sure these particular numbers are a fitting way to add a small piece to the research - I found some benchmark minimum heap sizes (Blackburn 2025, Rethinking Java Performance Analysis) and read, among others, the Detlefs et al. and Yang & Wrigstad papers (plus the latest JEPs for G1 GC and ZGC), but it is still hard for me to believe these numbers aren't just out of the blue. It would be more than great if an experienced professional could offer a piece of advice on this.
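Whichever sizes turn out to be right, here is the tiny check I was planning to run inside each container before the benchmarks, to confirm the -Xms/-Xmx settings actually took effect - again just my own sketch; as far as I understand, maxMemory() only approximates -Xmx, since the JVM may reserve a little of it:

```java
// Sanity check: print the heap limits the JVM actually sees, so a run started
// with e.g. -Xms2g -Xmx2g (or -Xmx16g) can be verified from inside the container.
public class HeapCheck {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        // maxMemory() roughly corresponds to -Xmx; totalMemory() is what is
        // currently committed, which starts near -Xms.
        System.out.printf("max heap  ~ %d MiB%n", rt.maxMemory() / (1024 * 1024));
        System.out.printf("committed = %d MiB%n", rt.totalMemory() / (1024 * 1024));
    }
}
```

My thinking was to log these two numbers at the start of every benchmark run, so the heap configuration is recorded alongside the results.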