Stress Testing Apache Using ab

Sunday, November 30th, 2008 at 1:49 pm

If you’ve ever written a web-app you’ve probably wondered how well it will hold up once the world discovers your awesome service. Will it work if you get dugg? What happens if 200 people all try to access your site at once? This is where benchmarking can provide some useful numbers to give you an idea as to how your server will hold up.

ApacheBench (ab) can be a helpful tool to determine response times under various traffic patterns. It lets you determine how many requests per second your server should be able to handle, and how long each visitor will have to wait to receive a response.

The syntax is pretty straightforward:

ab -c 10 -n 1000 http://www.example.com/

(the URL above is a placeholder for your own server; note the trailing / if you want to test your main document). The -c flag tells ab to make 10 concurrent requests at a time, and -n tells it to make 1000 requests total. This will create output similar to:

This is ApacheBench, Version 2.3 <$Revision: 655654 $>
Copyright 1996 Adam Twiss, Zeus Technology Ltd,
Licensed to The Apache Software Foundation,
Benchmarking (be patient)
Completed 100 requests
Completed 200 requests
Completed 300 requests
Completed 400 requests
Completed 500 requests
Completed 600 requests
Completed 700 requests
Completed 800 requests
Completed 900 requests
Completed 1000 requests
Finished 1000 requests
Server Software:        Apache/2.2.9
Server Hostname:
Server Port:            80
Document Path:          /
Document Length:        240 bytes
Concurrency Level:      10
Time taken for tests:   20.824 seconds
Complete requests:      1000
Failed requests:        0
Write errors:           0
Non-2xx responses:      1000
Total transferred:      565000 bytes
HTML transferred:       240000 bytes
Requests per second:    48.02 [#/sec] (mean)
Time per request:       208.238 [ms] (mean)
Time per request:       20.824 [ms] (mean, across all concurrent requests)
Transfer rate:          26.50 [Kbytes/sec] received
Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:       59  120 376.0     68    3093
Processing:    61   80  44.0     70     466
Waiting:       61   80  44.0     70     466
Total:        121  200 377.8    144    3165
Percentage of the requests served within a certain time (ms)
50%    144
66%    151
75%    156
80%    162
90%    176
95%    203
98%    520
99%   3132
100%   3165 (longest request)

From this we can see that the server tested was able to handle approximately 48 requests per second and processed all 1000 requests in just under 21 seconds. Whether or not these are acceptable values depends entirely on the requirements of your project and application.
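Those summary numbers are just simple ratios, so you can sanity-check them yourself. A quick sketch using the figures from the run above (awk is only doing the arithmetic here):

```shell
requests=1000     # "Complete requests"
seconds=20.824    # "Time taken for tests"
concurrency=10    # "Concurrency Level"

# Requests per second = completed requests / total time
awk -v n="$requests" -v t="$seconds" \
    'BEGIN { printf "%.2f req/sec\n", n / t }'            # prints 48.02 req/sec

# Mean time per request (as a single visitor sees it) =
#   concurrency * total time / requests, converted to ms
awk -v c="$concurrency" -v n="$requests" -v t="$seconds" \
    'BEGIN { printf "%.2f ms per request\n", c * t * 1000 / n }'
```

The second figure comes out to roughly 208 ms, matching the "Time per request (mean)" line in the report.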

You are also able to send POST and cookie data (check the man page for instructions on how to do that) if you would like to benchmark an authentication process or some form processing.
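For example, POST data goes in a file passed with -p (along with -T to set the Content-Type), and -C attaches a cookie to every request. The URL, form fields, and cookie value below are all made-up placeholders:

```shell
# Write the urlencoded POST body to a file (field names are hypothetical):
echo 'username=test&password=secret' > postdata.txt

# Then point ab at your own server's form handler, e.g.:
#   ab -c 5 -n 500 -p postdata.txt -T 'application/x-www-form-urlencoded' \
#      -C 'session=abc123' http://www.example.com/login
```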

I’ve found ab to be a useful tool for determining which pages on a site are taking too long to serve or are bogging down the server. It’s certainly not the end-all / be-all, but it can get you pointed in the right direction. After collecting some baseline data through ab, you may get some ideas as to what code to optimize or where to implement caching on your server.
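If you do want to compare several pages this way, it helps to script ab and pull out just the number you care about. A minimal sketch that extracts the mean time per request from ab's report (shown here against a saved copy of the output above; in practice you would pipe ab itself into awk):

```shell
# Grab the per-request mean from the first "Time per request" line.
# The second line ends differently, so the (mean)$ anchor skips it.
mean_ms=$(awk '/^Time per request:.*\(mean\)$/ { print $4 }' <<'EOF'
Time per request:       208.238 [ms] (mean)
Time per request:       20.824 [ms] (mean, across all concurrent requests)
EOF
)
echo "Mean time per request: $mean_ms ms"   # prints 208.238
```

Run in a loop over a list of URLs, something like this makes it easy to spot the slow pages at a glance.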

Remember, be kind and only stress test your own servers. Other people may not take kindly to you hammering their server just to see how they hold up.


Just a quick note about dynamic URLs: you can benchmark these too if you put single quotes around the entire URL. For example (again with a placeholder URL):

ab -c 5 -n 2000 'http://www.example.com/page.php?id=1'

