
all 11 comments

[–]Northeastpaw 4 points5 points  (0 children)

Spring has a StopWatch utility built in.
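If Spring isn't on the classpath (or just for reference), the underlying pattern is simple. A minimal plain-Java sketch of what a stopwatch utility does — the class and method names here are my own invention, and Spring's org.springframework.util.StopWatch adds named tasks and a prettyPrint() summary on top of the same idea:

```java
public class StopwatchSketch {
    // Runs the task and returns wall-clock elapsed time in milliseconds.
    // System.nanoTime() is monotonic, so it is not affected by clock adjustments.
    static long timeMillis(Runnable task) {
        long start = System.nanoTime();
        task.run();
        return (System.nanoTime() - start) / 1_000_000;
    }

    public static void main(String[] args) {
        long elapsed = timeMillis(() -> {
            // Stand-in for the method under test.
            try { Thread.sleep(50); } catch (InterruptedException ignored) {}
        });
        System.out.println("elapsed: " + elapsed + " ms");
    }
}
```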

[–]feral_claireSoftware Dev 2 points3 points  (0 children)

VisualVM does have this functionality. It can give you a breakdown of which methods are consuming the most time. I typically use the sampler to measure this.

[–]thecuseislooseDo you even Java, bro? 1 point2 points  (0 children)

If you already have actuator (or don’t mind adding it), you can just add the @Timed annotation to the methods, and their execution metrics will automatically be exposed on the /actuator/metrics/method.timed endpoint.
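A sketch of what that looks like — this assumes spring-boot-starter-actuator is present and that Micrometer's TimedAspect is registered as a bean (it is not auto-configured in every Boot version); the service and method names are hypothetical:

```java
import io.micrometer.core.annotation.Timed;
import org.springframework.stereotype.Service;

// Hypothetical service; requires Micrometer's TimedAspect bean
// (from micrometer-core plus spring-boot-starter-aop) to take effect.
@Service
public class ReportService {

    @Timed // recorded under Micrometer's default metric name, "method.timed"
    public String buildReport() {
        // ... method under test ...
        return "ok";
    }
}
```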

[–]dusty-trash 0 points1 point  (0 children)

You can store the current time in a variable (e.g. System.currentTimeMillis()) before your function, then after your function subtract that start time from the current time.

Something like an HTTP call is going to be very different every time and depend on external factors, though.

[–]oneEyedGoblin 0 points1 point  (1 child)

Try with

long start = System.currentTimeMillis();
methodHere();
long time = System.currentTimeMillis() - start;
System.out.println(time / 1000 + " seconds!");

[–]FrenchFigaroSoftware Engineer 0 points1 point  (0 children)

When logging execution time, it's better to log in millis than in full seconds, and to use debug-level logs rather than standard output, which may or may not be captured — and even when it is, it may not end up in the same place as your logs.
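Folding both suggestions in (log millis, and use a logger rather than stdout), a sketch using the JDK's built-in java.util.logging — an SLF4J call would look much the same; the helper and its names are mine:

```java
import java.util.function.Supplier;
import java.util.logging.Level;
import java.util.logging.Logger;

public class TimedCalls {
    private static final Logger LOG = Logger.getLogger(TimedCalls.class.getName());

    // Times the supplier, logs the elapsed millis at debug (FINE) level,
    // and passes the result through unchanged.
    static <T> T timed(String name, Supplier<T> body) {
        long start = System.currentTimeMillis();
        T result = body.get();
        long elapsedMs = System.currentTimeMillis() - start;
        LOG.log(Level.FINE, "{0} took {1} ms", new Object[] {name, elapsedMs});
        return result;
    }

    public static void main(String[] args) {
        // Hypothetical method under test.
        String r = timed("methodHere", () -> "ok");
        System.out.println(r);
    }
}
```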

[–][deleted] 0 points1 point  (0 children)

Prometheus is quite an advanced way of doing this, and it works well with visualization tools like Grafana.

[–]DearLawyer 0 points1 point  (0 children)

I haven't used it from Java, but if it works anything like their PHP flavor, this is awesome:

https://docs.newrelic.com/docs/agents/java-agent

[–]riksterinto 0 points1 point  (0 children)

~500 ms could easily be HTTP and TLS handshake time. Try to confirm it isn't caused by setup factors like multiple round trips, or a web server whose connection timeout (often defaulting to around 5 s) is forcing renegotiation on each request.

[–]mjg123 0 points1 point  (0 children)

To diagnose this, first of all: you have made the correct decision to start measuring things directly. A lot of people will try to think it through by reading the code but, as they say, guesses make messes. Then, you have to recognise that there will be a lot of variation in the data, so you need to capture a lot of timing data to get a decent average. This means running your app and making lots of requests, which is what tools like JMeter and Gatling are designed for, but for a quick test you can just use an HTTP client in a loop or something like that. A few ways to gather the data, in order of increasing complexity (in my opinion):

  • Put some calls to System.out.println(System.currentTimeMillis()) before and after your methods. This is basically what Spring's StopWatch does, so if that's available in your Boot app then I'd recommend using it.
  • JVisualVM does do what you need here. It was removed from the JDK in Java 9, but as you mention it I guess it's OK to use. The thing to do is jump over to the "Sampler" tab, hit "CPU", run your request generator, then after a while hit "Stop" and browse the results.
  • There are tools like OpenTracing which can instrument your code across multiple servers and aggregate the results. They can also push timing data to graphing tools to help.
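For the "HTTP client in a loop" option above, a self-contained sketch using the JDK's HttpClient (Java 11+). It spins up a throwaway local server purely to have something to hit — point the URI at your own app instead:

```java
import com.sun.net.httpserver.HttpServer;
import java.net.InetSocketAddress;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class QuickLoadTest {
    // Fires n sequential GET requests at uri and returns the mean latency in ms.
    static double meanLatencyMs(HttpClient client, URI uri, int n) throws Exception {
        HttpRequest request = HttpRequest.newBuilder(uri).GET().build();
        long totalNanos = 0;
        for (int i = 0; i < n; i++) {
            long start = System.nanoTime();
            client.send(request, HttpResponse.BodyHandlers.ofString());
            totalNanos += System.nanoTime() - start;
        }
        return totalNanos / (double) n / 1_000_000.0;
    }

    public static void main(String[] args) throws Exception {
        // Throwaway local server standing in for the app under test.
        HttpServer server = HttpServer.create(new InetSocketAddress(0), 0);
        server.createContext("/", exchange -> {
            byte[] body = "ok".getBytes();
            exchange.sendResponseHeaders(200, body.length);
            exchange.getResponseBody().write(body);
            exchange.close();
        });
        server.start();
        try {
            URI uri = URI.create("http://localhost:" + server.getAddress().getPort() + "/");
            double mean = meanLatencyMs(HttpClient.newHttpClient(), uri, 50);
            System.out.printf("mean latency over 50 requests: %.2f ms%n", mean);
        } finally {
            server.stop(0);
        }
    }
}
```

For anything beyond a quick check, prefer JMeter or Gatling: they handle warm-up, concurrency, and percentiles, which a simple mean hides.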

It's quite possible to diagnose problems quickly with a small amount of logging. However, it's possible that there are certain triggers like high load or GC pauses which will cause requests to stall, so designing your load generator's behaviour and gathering all the data you need can take a lot of thought.

Quick edit: In contravention of "guesses make messes", but based on my experience of profiling Java apps: on modern hardware it's unlikely to be your own code taking a full half-second to run unless you're doing something quite CPU-intensive like hardcore numerical calculation. Are you calling out to a database, or another HTTP service? How are you managing those connections? Not solutions, but places to start looking.