
Basic Web Performance Testing With JMeter and Gatling

Introduction

In this post I'll show a quick way to get some basic web performance metrics using both JMeter and Gatling.

JMeter is a well-known, open source, Java-based tool for performance testing. It has a lot of features and can be a little confusing at first. Scripts (aka Test Plans) are XML documents, edited using the JMeter GUI. It has lots of options, supports a wide variety of protocols, and produces some decent-looking graphs and reports.

Gatling is a lesser-known tool, but I really like it. It's a Scala-based tool, with scripts written in a nice DSL. While the scripts require some basic Scala, they are fairly easy to understand and modify. The output is a nice-looking, interactive HTML report.

Metrics 

Below are the basic metrics gathered by both JMeter and Gatling. If you are new to performance testing, these are a good place to start.

Response Time – The difference between the time the request was sent and the time the response was fully received.

Latency – JMeter and Gatling define latency as the difference between the time the request was sent and the time the response started to arrive, which includes the time the server spends processing the request.

Throughput – JMeter and Gatling define throughput as the number of requests per second. It is a measure of the load placed on the server during the test.
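
To make these definitions concrete, here's a minimal sketch in Scala (not code from either tool, just the arithmetic implied by the definitions above) showing how the three numbers fall out of per-request timestamps:

object BasicMetrics {

  // Timestamps for a single request, in milliseconds.
  case class RequestTiming(sendStart: Long, firstByteReceived: Long, lastByteReceived: Long) {
    def responseTimeMs: Long = lastByteReceived - sendStart  // full round trip
    def latencyMs: Long = firstByteReceived - sendStart      // time to first byte
  }

  // Throughput over a whole run: completed requests divided by elapsed time in seconds.
  def throughputPerSec(completedRequests: Int, testDurationMs: Long): Double =
    completedRequests / (testDurationMs / 1000.0)

  def main(args: Array[String]): Unit = {
    val t = RequestTiming(sendStart = 0, firstByteReceived = 120, lastByteReceived = 480)
    println(s"Response time: ${t.responseTimeMs} ms, latency: ${t.latencyMs} ms")
    println(s"Throughput: ${throughputPerSec(100, 20000)} req/s") // 100 requests over 20 s = 5 req/s
  }
}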

Installation

Setting up both JMeter and Gatling is simply a matter of installing Java and downloading the JMeter and Gatling binaries to your machine.

JMeter 2.11 can be downloaded from the Apache JMeter website. Gatling 1.5.5 binaries can be downloaded from the Gatling project site.

I've used Oracle JDK 7 to run both JMeter and Gatling. The JDK can be downloaded from Oracle's website.

JMeter


Setting up a simple test plan in JMeter is pretty straightforward.

  1. Add a Thread Group. Right click 'Test Plan', then click Add->'Threads (Users)'->'Thread Group'.
  2. Configure the number of threads (users) on the Thread Group (e.g. set to 10 to simulate 10 users).
  3. Configure the ramp up period of the users on the Thread Group (e.g. set to 5 to ramp up the given number of users over a 5 second period).
  4. Configure the loop count on the Thread Group (e.g. set to 10 for each user to repeat the action 10 times). Check 'Forever' if you want each user to repeat the action indefinitely (simulating a user constantly clicking the refresh button).
  5. Add an HTTP Request sampler. Right click the new Thread Group, then click Add->Sampler->HTTP Request.
  6. Configure the HTTP Request Server Name or IP. Leave out the protocol and point at the root (e.g. blog.jerometerry.com).
  7. Configure the HTTP Request Path. Set it to '/' if the Server Name or IP is the full URL you want to hit. Otherwise, append the path of the URL you want to hit, relative to the server.
  8. Add listeners to the Thread Group. For example, you might want tabular results, graph results, a response time graph, or a summary report. I'd recommend at least 'View Results in Table', along with 'Graph Results'. All listeners are added the same way: right click the Thread Group, click Add->Listener, then pick the listener you want to add.
  9. Save the test plan (from the File menu). A trimmed sketch of the saved XML is shown below.
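
For reference, the saved test plan is just an XML (.jmx) file. Below is a heavily abridged, hand-trimmed sketch (a real saved plan contains many more elements and attributes, so this fragment won't load as-is); it just shows roughly where the settings from the steps above end up:

<jmeterTestPlan version="1.2">
  <hashTree>
    <TestPlan testname="Test Plan"/>
    <hashTree>
      <ThreadGroup testname="Thread Group">
        <stringProp name="ThreadGroup.num_threads">10</stringProp> <!-- step 2: 10 users -->
        <stringProp name="ThreadGroup.ramp_time">5</stringProp> <!-- step 3: 5 second ramp up -->
        <elementProp name="ThreadGroup.main_controller" elementType="LoopController">
          <stringProp name="LoopController.loops">10</stringProp> <!-- step 4: loop count -->
        </elementProp>
      </ThreadGroup>
      <hashTree>
        <HTTPSamplerProxy testname="HTTP Request">
          <stringProp name="HTTPSampler.domain">blog.jerometerry.com</stringProp> <!-- step 6 -->
          <stringProp name="HTTPSampler.path">/</stringProp> <!-- step 7 -->
        </HTTPSamplerProxy>
      </hashTree>
    </hashTree>
  </hashTree>
</jmeterTestPlan>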



When the test plan is complete, run it by pressing the green run button in the toolbar.
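
As an aside, a saved plan can also be run without the GUI from the command line, along the lines of the following (the file names here are just placeholders; -n means non-GUI mode, -t points at the test plan, and -l is where the results are written):

jmeter -n -t BlogTestPlan.jmx -l results.jtl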

View Results in Table

Graph Results


Gatling


There is no GUI for editing Gatling scripts; you edit them with a plain old text editor or your IDE of choice.

The first time you use Gatling, you might want to start with the Gatling Recorder.

Gatling Recorder


  1. Fire up the recorder from the Gatling\bin folder
  2. Configure the proxy server ports (defaults are HTTP 8000, HTTPS 8001)
  3. Configure the package that the generated simulation class will be placed in (or leave it blank for no package)
  4. Configure the class name to give the Gatling simulation a descriptive name, or leave it as the default
  5. Click Start
  6. Configure your browser to use the Gatling proxy server
  7. Load up the page you want to record in your browser
  8. Once you're done performing the actions you want the script to take, click the 'Stop and Save' button on the recorder.
  9. A new Gatling script will be created in the Gatling\user-files\simulations\ folder.

Sample Script

Here's a simple example of a Gatling script. It ramps up 10 users over 5 seconds, where each user loads http://blog.jerometerry.com 10 times, for a total of 100 page requests.

package jterry

import com.excilys.ebi.gatling.core.Predef._
import com.excilys.ebi.gatling.http.Predef._
import com.excilys.ebi.gatling.jdbc.Predef._
import com.excilys.ebi.gatling.http.Headers.Names._
import akka.util.duration._
import bootstrap._
import assertions._

class Blog extends Simulation {

  val httpConf = httpConfig
    .baseURL("http://blog.jerometerry.com")
    .userAgentHeader("Mozilla/5.0 (Windows NT 6.1; WOW64; rv:29.0) Gecko/20100101 Firefox/29.0")

  val scn = scenario("Jerome's Blog")
    .repeat(10) {
      exec(http("blog").get("/"))
    }

  setUp(scn.users(10).ramp(5 seconds).protocolConfig(httpConf))
}


Running Gatling Scripts


To run a Gatling script, use the Gatling command in the Gatling\bin folder. If you run Gatling without any options, it will prompt you to select the simulation you want to run. However, if you know the package and class name, you can use the command "Gatling -s [package].[Class]".
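
For the sample script above (package jterry, class Blog), that works out to something like this, assuming the Windows launcher in Gatling\bin (the equivalent on Linux or macOS is gatling.sh):

gatling.bat -s jterry.Blog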



After running a simulation, the HTML results are saved under the Gatling\results folder.



Summary


This was a very basic guide to setting up and running JMeter and Gatling for the first time, assuming little or no previous experience with performance testing or these tools.

With a little bit of effort, you can get some basic performance metrics for a single web page, including response time, latency, and throughput.


