1. Introduction
Choosing the right tool for the job can be daunting. In this tutorial, we’ll simplify this by comparing three web application load testing tools against a simple REST API: Apache JMeter, Gatling, and The Grinder.
2. Load Testing Tools
First, let’s quickly review some background on each.
2.1. Gatling
Gatling is a load testing tool whose test scripts are written in Scala. A key feature is Gatling’s recorder, which generates those Scala test scripts for us. Check out our Intro to Gatling tutorial for more information.
2.2. JMeter
JMeter is a load testing tool by Apache. It provides a nice GUI that we can use for configuration. A unique feature called logic controllers gives us great flexibility for setting up tests in the GUI.
Visit our Intro to JMeter tutorial for screenshots and more explanation.
2.3. The Grinder
Our final tool, The Grinder, offers a more programming-oriented scripting engine than the other two and uses Jython. That said, The Grinder 3 does provide functionality for recording scripts.
The Grinder also differs from the other two tools in splitting work between console and agent processes; the agent processes run the actual load, so tests can scale out across multiple servers. It’s specifically advertised as a load testing tool built for developers to find deadlocks and slowdowns.
3. Test Case Setup
Next, for our test, we need an API. Our API functionality includes:
- add/update a rewards record
- view one/all rewards record
- link a transaction to a customer rewards record
- view transactions for a customer rewards record
Our Scenario:
A store is having a nationwide sale, and new and returning customers need customer rewards accounts to get the savings. The rewards API checks for a customer rewards account by customer id. If no rewards account exists, we add one, then link it to the transaction.
After this, we query the transactions.
3.1. Our REST API
Let’s get a quick highlight of the API by viewing some of the method stubs:
@PostMapping(path="/rewards/add")
public @ResponseBody RewardsAccount addRewardsAcount(@RequestBody RewardsAccount body)

@GetMapping(path="/rewards/find/{customerId}")
public @ResponseBody Optional<RewardsAccount> findCustomer(@PathVariable Integer customerId)

@PostMapping(path="/transactions/add")
public @ResponseBody Transaction addTransaction(@RequestBody Transaction transaction)

@GetMapping(path="/transactions/findAll/{rewardId}")
public @ResponseBody Iterable<Transaction> findTransactions(@PathVariable Integer rewardId)
Note some of the relationships such as querying for transactions by the reward id and getting the rewards account by customer id. These relationships force some logic and some response parsing for our test scenario creation.
Luckily, our tools all handle it fairly well, some better than others.
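For reference, a freshly added transaction comes back from the API as JSON along these lines (field names are inferred from the stubs and scripts in this article; the values are purely illustrative):

{
    "id": 13,
    "customerRewardsId": null,
    "customerId": 2084698201,
    "transactionDate": null
}

It's bodies like this one that our scripts will have to parse.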
3.2. Our Testing Plan
Next, we need test scripts.
To get a fair comparison, we’ll perform the same automation steps for each tool:
1. Generate random customer account ids
2. Post a transaction
3. Parse the response for the random customer id and transaction id
4. Query for a customer rewards account id with the customer id
5. Parse the response for the rewards account id
6. If no rewards account id exists, add one with a post
7. Post the same initial transaction with the updated rewards id, using the transaction id
8. Query for all transactions by rewards account id
Let’s take a closer look at step 4 for each tool, and make sure to check out the sample code for all three completed scripts.
3.3. Gatling
For Gatling, familiarity with Scala is a boon for developers, since the Gatling API is robust and contains a lot of features.
Gatling’s API takes a builder DSL approach, as we can see in its version of step 4:
.exec(http("get_reward")
  .get("/rewards/find/${custId}")
  .check(jsonPath("$.id").saveAs("rwdId")))
.pause(1)
Of particular note is Gatling’s support for JSON Path when we need to read and verify an HTTP response. Here, we pick up the reward id and save it to Gatling’s internal session state. Also notice the one-second pause: the check and saveAs calls don’t block the subsequent exec requests, so the pause keeps the next dependent request from firing before the reward id is available and failing.
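Step 6 of our plan, adding a rewards account only when the lookup finds none, can key off that same saved value. As a rough sketch (the completed script in the repo may structure this differently), Gatling’s conditional DSL lets us branch on the session:

.doIf(session => !session.contains("rwdId")) {
  // No rewards account was found for this customer, so create one
  // (the request body fields here are illustrative)
  exec(http("add_reward")
    .post("/rewards/add")
    .body(StringBody("""{ "customerId":"${custId}" }""")).asJson
    .check(jsonPath("$.id").saveAs("rwdId")))
}

The failed lookup check is also what produces the “soft” failures we’ll see in the results section.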
Also, Gatling’s expression language makes for easier dynamic request body Strings:
.body(StringBody(
  """{
    "customerRewardsId":"${rwdId}",
    "customerId":"${custId}",
    "transactionDate":"${txtDate}"
  }""")).asJson
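The ${custId} and ${txtDate} placeholders resolve against Gatling’s session. One way to seed them, covering step 1’s random customer account ids, is a feeder; here’s a minimal sketch (the sample script may generate these values differently):

// Supplies a random customer id and a timestamp for each virtual user iteration
val customerFeeder = Iterator.continually(Map(
  "custId"  -> scala.util.Random.nextInt(1000000).toString,
  "txtDate" -> java.time.LocalDateTime.now().toString
))

// then, at the start of the scenario chain:
// .feed(customerFeeder)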
Lastly, here’s our configuration for this comparison. The 10 runs are set as a repeat of the entire scenario, and the atOnceUsers method sets the threads/users:
val scn = scenario("RewardsScenario")
  .repeat(10) {
    ...
  }

setUp(
  scn.inject(atOnceUsers(100))
).protocols(httpProtocol)
The entire Scala script is viewable in our GitHub repo.
3.4. JMeter
JMeter generates an XML file after the GUI configuration. The file contains JMeter-specific objects with their configured properties and values, for example:
<HTTPSamplerProxy guiclass="HttpTestSampleGui" testclass="HTTPSamplerProxy" testname="Add Transaction" enabled="true">
<JSONPostProcessor guiclass="JSONPostProcessorGui" testclass="JSONPostProcessor" testname="Transaction Id Extractor" enabled="true">
Check out the testname attributes; we’ve labeled them to match the logical steps above. The ability to add children, variables, and dependent steps gives JMeter the kind of flexibility that scripting provides. Furthermore, we can even set the scope of our variables!
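For example, the Transaction Id Extractor above is the JMeter counterpart of our response parsing step. Filled in, its properties would look roughly like this (the variable name and default value are illustrative; the property names are JMeter’s own):

<JSONPostProcessor guiclass="JSONPostProcessorGui" testclass="JSONPostProcessor" testname="Transaction Id Extractor" enabled="true">
  <stringProp name="JSONPostProcessor.referenceNames">txnId</stringProp>
  <stringProp name="JSONPostProcessor.jsonPathExprs">$.id</stringProp>
  <stringProp name="JSONPostProcessor.match_numbers">1</stringProp>
  <stringProp name="JSONPostProcessor.defaultValues">NOT_FOUND</stringProp>
</JSONPostProcessor>

Later samplers can then reference the extracted value as ${txnId}.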
Our configuration for runs and users in JMeter uses ThreadGroups and a LoopController:
<stringProp name="LoopController.loops">10</stringProp>
<stringProp name="ThreadGroup.num_threads">100</stringProp>
View the entire jmx file for reference. While it’s possible, writing tests directly in XML as .jmx files doesn’t make much sense when there’s a full-featured GUI.
3.5. The Grinder
Without Scala’s functional style or a GUI, our Jython script for The Grinder looks pretty basic. Add a few system Java classes, and we have a lot fewer lines of code.
customerId = str(random.nextInt())
result = request1.POST("http://localhost:8080/transactions/add",
    "{\"customerRewardsId\":null,\"customerId\":" + customerId + ",\"transactionDate\":null}")
txnId = parseJsonString(result.getText(), "id")
However, fewer lines of test setup code are balanced by the need for more string maintenance code such as parsing JSON strings. Also, the HTTPRequest API is slim on functionality.
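The parseJsonString call above is a good example: it isn’t part of The Grinder’s API, it’s a hand-rolled helper. A minimal version might look like this (the completed script may implement it differently):

import re

# Naive extraction of a single field's value from a JSON response body.
# Good enough for our flat payloads; a real script might use a JSON library instead.
def parseJsonString(body, field):
    match = re.search(r'"%s"\s*:\s*"?([^",}]+)"?' % field, body)
    if match is None:
        return None
    return match.group(1)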
With The Grinder, we define threads, processes and runs values in an external properties file:
grinder.threads = 100
grinder.processes = 1
grinder.runs = 10
The full Jython script for The Grinder is also viewable in our GitHub repo.
4. Test Runs
4.1. Test Execution
All three tools recommend using the command line for large load tests.
Gatling requires only that we have JAVA_HOME and GATLING_HOME set. To execute Gatling we use:
GATLING_HOME\bin\gatling.bat
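The launcher then prompts us to pick a simulation interactively. If we’d rather skip the prompt, the bundled launcher also accepts the simulation class on the command line (the class name below is just a placeholder for whatever our simulation is called):

GATLING_HOME\bin\gatling.bat -s RewardsScenario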
JMeter needs a parameter to disable the GUI for the test run, something the GUI itself reminds us of when we start it for configuration:
jmeter -n -t TestPlan.jmx -l [results file] -e -o [path to output folder]
Like Gatling, The Grinder requires that we set JAVA_HOME and GRINDERPATH. However, it needs a couple more properties, too:
set GRINDERPROPERTIES="%GRINDERPATH%\grinder.properties"
set CLASSPATH="%GRINDERPATH%\lib\grinder.jar";%CLASSPATH%
As mentioned above, we provide a grinder.properties file for additional configuration such as threads, runs, processes, and console hosts.
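Put together, a minimal grinder.properties for this setup might look like the following (the key names are The Grinder’s documented properties; the script name and console values are just assumptions for a local run):

# Which Jython script the agent should run (name is illustrative)
grinder.script = grinder_rewards.py

# Load shape: threads x runs per agent process
grinder.threads = 100
grinder.processes = 1
grinder.runs = 10

# Where the agent should look for the console
grinder.consoleHost = localhost
grinder.consolePort = 6372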
Finally, we bootstrap the console and agents with:
java -classpath %CLASSPATH% net.grinder.Console
java -classpath %CLASSPATH% net.grinder.Grinder %GRINDERPROPERTIES%
4.2. Test Results
Each of the tests ran ten runs with 100 users/threads. Let’s unpack some of the highlights:
| | Successful Requests | Errors | Total Test Time (s) | Average Response Time (ms) | Peak Throughput |
| --- | --- | --- | --- | --- | --- |
| Gatling | 4100 Requests | 100 (soft)* | 31 | 64.5 | 132 req/s |
| JMeter | 4135 Requests | 0 | 35 | 81 | 1080 req/s |
| The Grinder | 5000 Requests | 0 | 5.57 | 65.72 | 1284 req/s |
At a glance, The Grinder is roughly 6x faster than the other two tools in total test time; the other two clocked in similarly at ~30 seconds. The same step in The Grinder and Gatling, creating 100 threads, took 1544 ms and 16 ms respectively.
And while The Grinder is high-speed, it comes at the cost of additional development time and less diversity of output data.
*An additional note: Gatling reports 100 “soft” failures because of the logic check for a reward id that doesn’t exist yet; the other tools don’t count that conditional check as an error. This behavior is baked into the Gatling API.
5. Summary
Now it’s time to take an overall look at each of the load testing tools.
| | Gatling | JMeter | The Grinder |
| --- | --- | --- | --- |
| Project and Community | 9 | 9 | 6 |
| Performance | 7 | 7 | 10 |
| Scriptability/API | 7 | 9 | 8 |
| UI | 8 | 8 | 5 |
| Reports | 9 | 7 | 6 |
| Integration | 7 | 9 | 7 |
| Summary | 7.8 | 8.2 | 7 |
Gatling:
- Solid, polished load testing tool that outputs beautiful reports with Scala scripting
- Open Source and Enterprise support levels for the product
JMeter:
- Robust API (through GUI) for test script development with no coding required
- Apache Foundation Support and great integration with Maven
The Grinder:
- Fast performance load testing tool for developers using Jython
- Cross-server scalability provides even more potential for large tests
Simply put, if speed and scalability are the need, then use The Grinder.
If great-looking interactive graphs would help demonstrate a performance gain and argue for a change, then use Gatling.
JMeter is the tool for complicated business logic or an integration layer with many message types. As part of the Apache Software Foundation, JMeter provides a mature product and a large community.
6. Conclusion
In conclusion, we see that the tools have comparable functionality in some areas while shining in others. The right tool for the job is colloquial wisdom that holds true in software development.
Finally, the API and scripts can be found on GitHub.