Database Performance Testing with Apache JMeter

Overview

Database performance testing identifies performance issues before a database application is deployed to end users. Database load testing checks an application's performance, reliability, and scalability under varying user load: it simulates real-life user traffic against the target database application to determine how the application behaves when multiple users hit it simultaneously.

Pre-Requisites

  • Apache JMeter installed on the test machine.
  • The MySQL JDBC driver (Connector/J) jar placed in JMeter's lib directory.
  • A running MySQL database reachable from the JMeter machine.

Use case

Let us perform database load testing to measure the performance of a MySQL database using Apache JMeter, with the MySQL JDBC driver configured.

Building Database Test Plan

A test plan describes a series of steps to be executed by JMeter on running a database test. To construct a test plan, the following elements are needed:

  • Thread Group
  • JDBC Request
  • Summary Report

Adding Users

The first step involved in creating a JMeter Test Plan is to add a Thread Group element. The element provides details about the number of users to be simulated, frequency of requests to be sent by the users, and the number of requests to be sent by the users.

To add a Thread Group element, perform the following:

  • In the left pane, right click on Test Plan.
  • Select Add → Threads (Users) → Thread Group as shown in the below diagram:

[Diagram: adding a Thread Group]

  • Provide the Thread Group name as “JDBC Users”.
  • Modify the default properties as follows:
    • No. of Threads (users): 10
    • Ramp-Up Period (in seconds): 100
    • Loop Count: 10 as shown in the below diagram:

[Diagram: Thread Group properties]

Note: The ramp-up period states the time taken to “ramp-up” to the full number of threads chosen.

As our use case runs 10 threads with a ramp-up period of 100 seconds, JMeter takes 100 seconds to get all 10 threads up and running. Each thread starts 10 (100/10) seconds after the previous one. The query is therefore executed 10 (threads) × 10 (loops) = 100 times, so the total number of samples is 100.
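The ramp-up arithmetic above can be sketched in a few lines (a standalone illustration of the calculation, not JMeter code):

```python
threads = 10       # No. of Threads (users)
ramp_up = 100      # Ramp-Up Period in seconds
loops = 10         # Loop Count

# Each thread starts ramp_up / threads seconds after the previous one.
start_interval = ramp_up / threads
start_times = [i * start_interval for i in range(threads)]

# Every thread runs the query once per loop.
total_samples = threads * loops

print(start_interval)   # seconds between thread starts
print(total_samples)    # samples in total
```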

Adding JDBC Requests

To add a JDBC request, perform the following:

  • In the left pane, right click on Thread Group.
  • Select Add → Config Element → JDBC Connection Configuration.
  • Configure the following details:
    • Variable Name: myDatabase
      Note: This name must be unique, as it is used by the JDBC Sampler to identify the configuration to be used.
    • Database URL: jdbc:mysql://ipOfTheServer:3306/cloud
    • JDBC Driver class: com.mysql.jdbc.Driver (for MySQL Connector/J 8.x, use com.mysql.cj.jdbc.Driver)
    • Username: username of the database
    • Password: password for the database username, as shown in the below diagram:

[Diagram: JDBC Connection Configuration]

Adding Sampler

To add a sampler, perform the following:

  • In the left pane, right click on Thread Group.
  • Select Add → Sampler → JDBC Request.
  • Provide the following details:
    • Variable Name: myDatabase (same as in the configuration element)
    • Enter the SQL query in the Query field, as shown in the below diagram:

[Diagram: JDBC Request with SQL query]

Adding Listener to View/Store Test results

A Listener is used to store test results of all JDBC requests in a file and to present the results.

To view the test results, perform the following:

  • In the left pane, right click on Thread Group.
  • Select Add → Listener → View Results Tree / Summary Report / Graph Results.
  • Save the test plan and click Run (Start or Ctrl + R) to run the test.

All the test results will be stored in the Listener.

Viewing Test results

Tree View

The results can be viewed in tree format as shown in the below diagram:

[Diagram: View Results Tree]

Table View

The results can be viewed in table format as shown in the below diagram:

[Diagram: results in table view]

Graph View

The results can be viewed in graph format as shown in the below diagram:

[Diagram: Graph Results]

Response Time Graph View

The results can be viewed in graph format as shown in the below diagram:

[Diagram: Response Time Graph]

Performance Metrics

Different performance metrics viewable in JMeter are as follows:

Throughput

Hits per second, i.e., the total number of requests per unit of time (seconds, minutes, hours) sent to the server during the test:

  endTime    = lastSampleStartTime + lastSampleLoadTime
  startTime  = firstSampleStartTime
  conversion = unit time conversion value
  Throughput = Numrequests / ((endTime – startTime) * conversion)

In our use case, the throughput is 61.566/min. A higher throughput indicates better performance.
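The formula above can be expressed as a small function (a sketch; the times are assumed to be in milliseconds, and `conversion` is the factor from milliseconds to the target unit, e.g. 1/60000 for requests per minute):

```python
def throughput(num_requests, first_sample_start_ms, last_sample_start_ms,
               last_sample_load_ms, conversion):
    """JMeter-style throughput: requests divided by elapsed test time.

    conversion scales milliseconds into the desired unit,
    e.g. 1 / 60000 yields requests per minute.
    """
    end_time = last_sample_start_ms + last_sample_load_ms
    start_time = first_sample_start_ms
    return num_requests / ((end_time - start_time) * conversion)

# 100 requests spread over exactly one minute -> 100 requests per minute
print(throughput(100, 0, 59_000, 1_000, 1 / 60_000))
```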

Latency

The delay incurred in communicating a message. A lower latency indicates a faster server response. In our use case, the latency for the first thread is 24446 ms.

Min/Max Load Time/Response Time/Sample Time

The difference between the time a request is sent and the time its response is received. Response time is always greater than or equal to latency.

In our use case, for all the samples, Response Time >= Latency

90% Line (90th Percentile)

The threshold value below which 90% of the samples fall. To calculate the 90th percentile, sort the transaction instances by their value and remove the top 10%; the highest remaining value is the 90th percentile.

The same calculation applies for the 95% and 99% lines. In our use case, the values are 667, 666, and 664 ms, respectively.
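The sort-and-trim procedure described above can be sketched as follows; note this mirrors the description in the text, not necessarily JMeter's exact internal percentile algorithm:

```python
def percentile_line(response_times_ms, pct):
    """Value below which pct% of the samples fall, computed by sorting,
    dropping the top (100 - pct)% of instances, and taking the highest
    remaining value."""
    ordered = sorted(response_times_ms)
    keep = int(len(ordered) * pct / 100)  # number of samples to retain
    return ordered[keep - 1]

samples = list(range(1, 101))            # 100 samples: 1 ms .. 100 ms
print(percentile_line(samples, 90))      # top 10% removed, highest left
```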

Error

The percentage of requests in the sample that failed. A value of 0.00% indicates that all requests completed successfully and query performance is good.

Standard Deviation/Deviation

A lower standard deviation indicates more consistent response times. As a rule of thumb, the standard deviation should be at most half of the average time for a label; a larger value suggests inconsistent or outlying results. In our use case, it is 2881 ms.
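The rule of thumb above can be checked with the standard library; whether JMeter reports the population or sample standard deviation is an assumption here (population is used below):

```python
import statistics

def is_consistent(response_times_ms):
    """True when the standard deviation is at most half the average,
    per the rule of thumb above."""
    avg = statistics.mean(response_times_ms)
    sd = statistics.pstdev(response_times_ms)  # population standard deviation
    return sd <= avg / 2

print(is_consistent([100, 110, 90, 105, 95]))   # tight spread
print(is_consistent([10, 500, 20, 900, 15]))    # wide spread
```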

Minimum Time

The minimum response time among the sample requests for this label. The total equals the minimum across all samples. In our use case, it is 0 ms.

Maximum Time

The maximum response time among the sample requests for this label. The total equals the maximum across all samples. In our use case, it is 23114 ms.

Average Time

The average response time for a request. The total is the average over all samples.

KB/sec

The throughput rate measured in kilobytes per second. In our use case, it is 0.

Samples

The total number of samples sent to the server. In our use case, it is 100.

Conclusion

In this blog, we discussed measuring the performance of a database using JMeter by adding a real-life user load, connection details, and a sample MySQL query.

Different Listeners such as Graph Results, View Results Tree, Summary Report, and more can be added to view different metrics. These metrics are shared with a performance tuning engineer to identify performance bottlenecks. Similarly, more samplers can be added to test different databases at the same time.
