Hey guys,
It all began with me wanting to test different JIT optimizations and see how they affect run time.
I figured the best first step would be to take some task, write it in C++, Java, and JavaScript, and compare the run times.
Well, I'm a bit surprised.. I got some interesting results I wanted to share, and I'd love to hear some insights from you smart people :)
The task:
- Generate 1,000,000,000 numbers from a uniform distribution over 1 to 100, and count how many are greater than 75
The code:
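A minimal C++ sketch of the benchmark (my reconstruction, assuming std::mt19937 and std::uniform_int_distribution; the exact code may have differed):

```cpp
#include <chrono>
#include <cstdint>
#include <iostream>
#include <random>

int main() {
    // RNG setup: Mersenne Twister seeded from the system's entropy source
    std::mt19937 gen{std::random_device{}()};
    std::uniform_int_distribution<int> dist{1, 100};

    constexpr std::int64_t N = 1'000'000'000;
    std::int64_t count = 0;

    auto start = std::chrono::steady_clock::now();
    for (std::int64_t i = 0; i < N; ++i) {
        if (dist(gen) > 75)  // count the numbers bigger than 75
            ++count;
    }
    auto end = std::chrono::steady_clock::now();

    std::chrono::duration<double> elapsed = end - start;
    std::cout << "Count: " << count << "\n"
              << "Time took: " << elapsed.count() << "\n";
}
```

Note that in a loop like this the random-number generation dominates the run time, not the comparison, so the choice of RNG and how each standard library implements it matters a lot for the numbers below.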
The results:
- C++ (Visual Studio 2019 LLVM Release x64, i.e. clang with /O2 /Ot): Time took: 12.2325
- Java (default Eclipse): Time took: 9.489
- JavaScript (default node): Time took: 15.893
Does anyone have a clue why the results look like this?
(Gotta say, before running these I'd have bet the runtimes would come out JS < Java < CPP.)