How to Perform Web Server Performance Benchmarking

Performance benchmarking is a crucial practice for any web application: it tells you how well your stack handles expected traffic levels. By load testing your web servers, you can plan capacity effectively and tune configurations for the best possible performance.

In this comprehensive guide, we will cover the most popular open-source tools to benchmark the performance of web servers like Apache and Nginx.

Why Web Server Benchmarking Matters

Here are some key reasons why you should benchmark your web servers:

Identify Performance Bottlenecks: Load tests help uncover bottlenecks – whether the limit is CPU, memory, network I/O, or the application code itself.

Compare Configuration Options: You can benchmark different web server software, hardware specs, operating systems, application code versions etc.

Determine Scalability: Test how your web servers handle an increasing number of concurrent users and requests.

Capacity Planning: Use benchmark results to right-size your infrastructure when planning for growth.

Optimizing Performance: Tweak software and hardware configurations to handle higher loads after analyzing reports.

Evaluating Upgrades: Quantify the benefits of hardware upgrades, software updates, new platforms etc.

SLA Monitoring: Check if performance is meeting contractual expectations.

Simply put, you cannot improve what you don't measure. Benchmarking generates the data needed to boost web application performance and scale efficiently.

Using ApacheBench for Basic Benchmarking

ApacheBench (ab) is one of the most popular open-source benchmarking utilities available. It can generate significant load against any web server and produce detailed performance reports.

Let's set up ApacheBench to compare how Apache and Nginx handle loads for a sample web application.

Installing ApacheBench

The ab tool comes pre-installed on most Linux distributions. If not, use your system's package manager to install it quickly:

# Debian/Ubuntu 
$ sudo apt install apache2-utils

# RHEL/CentOS
$ sudo yum install httpd-tools

Check that ab is now available and note the version:

$ ab -V
This is ApacheBench, Version 2.3 <$Revision: 1843412 $>
...

Now we're ready to start benchmarking!

Running ApacheBench Against Apache

For our first test, we will benchmark the Apache 2.4 web server running on CentOS 7 with its default configuration.

The test parameters are:

  • 5000 total requests
  • 500 concurrent users
  • Target URL on localhost

Here is the ab command and output:

$ ab -n 5000 -c 500 http://localhost/
This is ApacheBench, Version 2.3 <$Revision: 1843412 $>
...

Concurrency Level:      500
Time taken for tests:   13.450 seconds
Complete requests:      5000
Requests per second:    372.04 [#/sec] (mean)
Time per request:       1349.975 [ms] (mean) 
Transfer rate:          1833.15 [Kbytes/sec] received

...

The main results show that Apache handled 372 requests per second with 500 concurrent users. The average response time per request was ~1350 ms.
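
If you save each ab report to a file, a small helper can pull out the headline numbers for side-by-side comparison. The summarize function and file name below are illustrative helpers, not part of ab itself:

```shell
# Hypothetical helper: extract headline metrics from a saved ab report.
# Example use: ab -n 5000 -c 500 http://localhost/ > ab-apache.txt
summarize() {
  awk -F': *' '
    /^Requests per second/           { print "RPS:     " $2 }
    /^Time per request:.*\(mean\)$/  { print "Latency: " $2 }
  ' "$1"
}
```

Running `summarize ab-apache.txt` against the Apache report above would print the requests-per-second and mean-latency figures on two labeled lines.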

Let's compare this with Nginx next.

Benchmarking Nginx Web Server

We then replaced Apache with Nginx 1.14 on the same CentOS 7 server, again with the default configuration.

Running the same benchmark – 5000 requests at a concurrency of 500 – gives us:

$ ab -n 5000 -c 500 http://localhost/
This is ApacheBench, Version 2.3 <$Revision: 1843412 $>
...

Concurrency Level:      500
Time taken for tests:   0.953 seconds
Complete requests:      5000
Requests per second:    5244.59 [#/sec] (mean)
Time per request:       95.316 [ms] (mean)
Transfer rate:          21791.78 [Kbytes/sec] received

...

We can instantly see that Nginx delivered 5245 requests per second – over 14x more than Apache! The average response time is just 95 ms compared to ~1350 ms for Apache.
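
As a quick sanity check, the speedup ratio can be recomputed from the two "Requests per second" figures above:

```shell
# Ratio of the Nginx and Apache throughput figures reported above
awk 'BEGIN { printf "%.1fx\n", 5244.59 / 372.04 }'
# → 14.1x
```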

This shows that for a default setup serving static files, Nginx easily outperforms Apache. We could further tune each web server configuration to handle more load but our test demonstrates the benchmarking process quite well.

For other scenarios, such as proxying applications or serving dynamic content, the results may be much closer between Nginx and Apache depending on factors like caching, the number of workers, and keepalive connections. This is why you need to benchmark your specific use case.
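
One easy dimension to probe is HTTP keep-alive, which ab enables with the -k flag; persistent connections often shift the numbers noticeably. A sketch (the commands are echoed for illustration – drop the echo to actually run them):

```shell
# Same benchmark with and without HTTP keep-alive (-k).
BASE="ab -n 5000 -c 500"
URL="http://localhost/"
echo "$BASE $URL"     # one TCP connection per request
echo "$BASE -k $URL"  # reuse connections via keep-alive
```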

Hopefully you now see how easy it is to get started with ApacheBench for basic load generation and performance analysis! Next, let's look at more advanced benchmarks.

Using JMeter for Advanced Benchmarking

Apache JMeter is a powerful open-source testing tool that goes way beyond basic load testing. Let's explore some key features:

Key Capabilities

  • Protocol Support: HTTP, HTTPS, SOAP, JDBC, LDAP, JMS, FTP and more
  • Load test any server type – web, database, cache, etc.
  • Create sophisticated test plans with scripting
  • Generate extensive metrics and customized reports
  • Highly scalable to thousands of concurrent users
  • Distributed testing using multiple JMeter instances
  • Command-line mode for CI/CD integration
  • Extensive plugin ecosystem available

As you can see, JMeter can handle incredibly complex test plans beyond basic HTTP loading.

Installing JMeter

Since JMeter is written in Java, you need Java 8 or later installed. Then download the latest JMeter tarball and extract it. The directory structure will look like:

apache-jmeter-5.5
├── backups
├── bin
├── docs
├── extras
├── lib
├── licenses
└── printable_docs
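
The download-and-unpack step itself can be scripted. The version number below is illustrative – check the JMeter download page for the current release:

```shell
# Fetch and extract a JMeter binary release (version is an assumption;
# the command is echoed for illustration – drop the echo to download).
VERSION=5.5
TARBALL="apache-jmeter-$VERSION.tgz"
URL="https://dlcdn.apache.org/jmeter/binaries/$TARBALL"
echo "wget $URL && tar -xzf $TARBALL"
```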

Navigate to the bin folder where the main JMeter executable resides:

$ cd apache-jmeter-5.5/bin
$ ls
analyzer.cmd      jmeter-server.cmd  jmeter.sh       mirror-server.sh  shutdown.cmd
jmeter            jmeter-server      log_servers.sh  shutdown.sh
jmeter.bat        jmeter-server.sh   mirror-server

Now launch the JMeter GUI mode using:

$ ./jmeter

The main JMeter desktop app should open up and you're ready to start creating test plans.

[Image: JMeter application desktop]

Creating a JMeter Test Plan

The key elements of any test plan are:

  • Thread Group: Acts as a container for samplers and config elements; defines the number of users to simulate.
  • Samplers: Requests like HTTP, JDBC, FTP etc. that are sent to servers under test.
  • Listeners: Output result views like Tables, Graphs, Trees etc.
  • Timers: Control pacing between samplers. E.g. constant timer, random timer.
  • Assertions: Validate response elements like codes, content, headers etc.
  • Pre-Processors/Post-Processors: Add setup and teardown steps to samplers.

Let's create a simple plan:

  • Thread Group with 100 users
  • HTTP Request samplers for home page
  • Aggregate Graph listener

The finished test plan would resemble:

[Image: JMeter test plan example]

Note that you can have multiple thread groups to handle different workflows and add many other controllers like If Controller, Loop Controller, While Controller etc. to create advanced flows.

Running a JMeter Test

Once you have finalized the test plan, save it as a .jmx file.

Then run the test plan from the command line in non-GUI mode, which is recommended for actual load tests:

$ ./jmeter -n -t testplan.jmx -l testresults.jtl

This will trigger the test plan and save the results to the specified .jtl file.

You can later load the JTL file into Listeners in GUI mode to display graphs and reports from the test run, or use plugins like JMeterPluginsCMD to generate HTML reports directly.
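
Recent JMeter versions (3.0 and later) can also generate the HTML dashboard directly at the end of a non-GUI run via the -e and -o flags. The file names below are illustrative, and the output directory must not already exist:

```shell
# Non-GUI run that also writes the built-in HTML dashboard report.
# Assumes jmeter is on PATH; echoed for illustration – drop the echo to run.
CMD="jmeter -n -t testplan.jmx -l testresults.jtl -e -o report/"
echo "$CMD"
```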

[Image: JMeter HTML reporting dashboard]

Apache JMeter has many more features for advanced benchmarking that are beyond the scope of this guide, but hopefully this gives you a great start using it!

Additional Benchmarking Tools

Here are some other popular benchmarking tools to consider:

wrk – Simple HTTP benchmarking using a minimalist C program. Great for testing raw throughput of web servers.

k6 – Load testing tool written in Go, scripted in JavaScript, with support for advanced scenarios like TLS, HAR-based tests, streaming, and more.

Siege – Long-running benchmarking utility to stress web servers with concurrent load across multiple URLs.

Bees with Machine Guns – Utility for load testing websites by provisioning a swarm of EC2 instances ("bees") to target your sites.

Vegeta – HTTP load testing tool written in Go supporting timeouts, keep-alives, and more.

Gatling – Advanced open-source load testing framework based on Scala, Akka actors, and Netty.

The tools listed above should cover most benchmarking requirements, but there are many other options out there, both open source and commercial.

Benchmark Analysis & Metrics

Now that you understand how to set up benchmarks using ApacheBench, JMeter, and similar tools – how do you analyze the test results?

Key metrics and factors to evaluate:

  • Requests per Second: Primary indicator of web server performance. How many concurrent requests can be handled sustainably?

  • Response Times (Latency): Critical for user experience. Track averages, percentiles and outliers.

  • Error Percentages: 5xx/4xx status codes indicate problems during load. Track error budgets.

  • Throughput: Volume of content delivered; helps right-size network and application capacity.

  • Concurrency: Determine max parallel connections web server can handle.

  • Saturation Point: At what load levels does throughput plateau or errors spike?

  • Infrastructure Stats: Correlate web server KPIs with system resource utilization.

Based on these metrics, you can derive the maximum throughput your systems can handle and scale up accordingly. They will also help you narrow down any component bottlenecks.

Use load tests to establish internal SLAs and set performance budgets for your web application infrastructure. Trend benchmark metrics over time after application and infrastructure changes to quantify improvements.
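
Much of this analysis can start with simple shell tooling against the raw results. For example, here is a sketch of computing the error percentage from a CSV-format JMeter JTL file – the sample rows are fabricated for illustration, and the column position of the success flag follows the header shown:

```shell
# Fabricated sample of a CSV-format JTL file (real runs have more columns).
cat > /tmp/sample.jtl <<'EOF'
timeStamp,elapsed,label,responseCode,responseMessage,threadName,dataType,success
1700000000000,120,Home,200,OK,tg1-1,text,true
1700000000100,250,Home,200,OK,tg1-2,text,true
1700000000200,900,Home,503,Service Unavailable,tg1-3,text,false
1700000000300,130,Home,200,OK,tg1-4,text,true
EOF

# Error rate = failed samples / total samples (field 8 is the success flag)
awk -F, 'NR > 1 { total++; if ($8 == "false") errors++ }
         END { printf "error rate: %.1f%%\n", 100 * errors / total }' /tmp/sample.jtl
# → error rate: 25.0%
```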

Best Practices for Web Server Benchmarking

Here are some key best practices to follow when benchmarking:

  • Use Realistic Workloads – Mimic production traffic patterns and payloads as much as possible during testing.

  • Test Often – Make benchmarking a regular practice instead of occasional. Parameterize tests for easy repetition.

  • Monitor Resources – Correlate web server metrics with system resource utilization for deeper insights.

  • Validate Config Changes – Every software upgrade, VM resize, cloud migration etc. should be benchmarked before rollout.

  • Automate Reports – Set up dashboards to monitor key benchmark metrics over builds and releases.

  • Size for Reality – Leave healthy headroom when provisioning infrastructure based on peak loads during testing.

Remember that the goal is not achieving vanity benchmark numbers but right-sizing production systems to sustain realistic traffic. Technical architectures and system configurations need to strike the right balance between cost, performance and scalability objectives.

Conclusion

We have covered a lot of ground when it comes to benchmarking web server performance using open-source tools like ApacheBench and JMeter. Follow the guide to:

  • Setup ApacheBench easily on Linux systems
  • Run basic benchmarking tests with concurrency
  • Compare Apache vs Nginx or different configurations
  • Analyze metrics like requests per second, response times etc.
  • Install Apache JMeter for advanced load testing
  • Create sophisticated test plans beyond basic HTTP
  • Integrate JMeter with CI/CD workflows
  • Monitor critical KPIs during testing to detect bottlenecks
  • Right size infrastructure and optimize web server performance based on benchmarking data

Hopefully this gives you a methodology and toolbox to tackle your web application benchmarking needs!
