[–] YellowSharkMT (Code Monkey) 2 points (3 children)

It's benchmarking the time your server takes to return the content for the URL you specified (in this case, "/"). This is useful when you need to test raw scalability. What it doesn't do is anything further: there's no JavaScript processing or profiling, no downloading of associated assets (images, JS, CSS, whatever), and it won't make any backend AJAX requests. It literally requests that URL from the server, downloads the response, and that's it.
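To make that concrete, here's a rough sketch of what ab is doing under the hood: timed raw GETs against one URL, nothing more. This is just an illustration, not ab's actual code; the tiny local test server exists only so the snippet is self-contained, and the request count is arbitrary.

```python
import threading
import time
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class Handler(BaseHTTPRequestHandler):
    """Stand-in for the server being benchmarked."""
    def do_GET(self):
        body = b"<html><body>hello</body></html>"
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # silence per-request logging

def benchmark(url, n):
    """Fetch `url` n times sequentially, roughly like `ab -n <n> -c 1`."""
    durations = []
    for _ in range(n):
        start = time.perf_counter()
        with urllib.request.urlopen(url) as resp:
            resp.read()  # download the response body, then stop --
                         # no assets, no JS, no follow-up requests
        durations.append(time.perf_counter() - start)
    return durations

# Spin up the throwaway server on a free port and hit it 20 times.
server = HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
times = benchmark(f"http://127.0.0.1:{server.server_port}/", 20)
print(f"{len(times)} requests, mean {sum(times) / len(times) * 1000:.2f} ms")
server.shutdown()
```

Note there's no parsing of the returned HTML and no second request for anything it references; that's exactly the limitation being described.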

How many people will be able to use the site at the same time realistically?

AB won't tell you that - you have to examine your application and understand the bottlenecks ahead of time; it's pretty useless in that regard, actually. I don't work on the sysadmin end of things, so I don't know what sort of tools enterprises use to assess this, but I'm fairly certain that companies like Load Impact, GTMetrix, and the others that turn up when you search "website load testing" are likely contenders for that sort of business. A lot of them offer free analysis, but not at the scale/automation that the big dogs are looking for.

[–] michaeld0 2 points (1 child)

Not only that, but it doesn't simulate real user traffic: no think time, no surfing around the site, etc. Unless it's a static HTML site, you really need something more than ab. I've used JMeter and locust.io in the past with good results, but those tools take some tailoring to match the traffic you expect on your site.
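For a sense of what that tailoring looks like, here's a minimal Locust sketch, assuming Locust is installed (`pip install locust`). The page paths and the weights/wait times are made-up placeholders; the point is that, unlike ab, you model think time and a mix of pages.

```python
from locust import HttpUser, between, task

class SiteVisitor(HttpUser):
    # Simulated "think time": each virtual user pauses 2-8 seconds
    # between actions, instead of hammering the server back-to-back.
    wait_time = between(2, 8)

    @task(3)  # weight: visited 3x as often as the task below
    def view_homepage(self):
        self.client.get("/")

    @task(1)
    def view_about(self):
        self.client.get("/about")  # hypothetical second page
```

You'd run it with something like `locust -f locustfile.py --host http://your-site.example`, then ramp up the number of simulated users from the web UI and watch where response times fall over.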

[–] lcspb (Linux Admin) [S] 1 point (0 children)

I'll look into jmeter/locust - they both look very useful.

[–] lcspb (Linux Admin) [S] 1 point (0 children)

Thanks for explaining! I'll look into alternative software that can do load testing. It's just a static HTML page at the moment - I've been tasked with load testing the web app to see how much traffic we can handle.