Email : info@clicqa.com
+44 208 090 2404

Load Testing with 20,000 Concurrent Users

About Client:

An Australia-based direct seller of health, beauty, and homecare products, with millions of customers across the globe.

Client Requirement : The client’s e-commerce application must serve thousands of users every day; therefore, the client wanted to run higher-volume load tests.

Scalability : The application must handle a load of 20,000 concurrent users.

Efficiency : The client expects a highly efficient application with predefined response-time and transactions-per-second targets.

Fail Over : The application should behave as expected under the specified load of 20,000 concurrent users and should have failover control.

Proposed Solution

We at ClicQA understood the client’s requirement and proposed four rounds of performance test cycles, each with a minimum of four tests. The objective was a 20,000-user load test; the load would be scaled up to 20,000 users based on the findings of the earlier tests.

Our Approach

We performed load testing in three stages: Test Preparation, Test Execution, and Reporting and Tuning.

Test Preparation

In the first stage, Test Preparation, we gathered test scenarios based on the client’s requirements and created test scripts from those scenarios.

First, we ran dry tests and collected load generator statistics from the results. After reviewing the statistics, our Performance Test Engineering team identified two options for load generation:

  • Using multiple load generators and a single controller
  • Using one controller which also hosts load generator
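In JMeter terms, the two options correspond to a distributed run against remote load generators versus a standalone non-GUI run. The following is a minimal sketch; the test plan name and generator host names are hypothetical, and option 1 assumes `jmeter-server` is already running on each remote generator.

```shell
# Option 1: multiple load generators driven by a single controller.
# loadgen1/2/3.internal are placeholder host names for this example.
jmeter -n -t ecommerce_load.jmx \
  -R loadgen1.internal,loadgen2.internal,loadgen3.internal \
  -l results/run1.jtl -e -o results/run1-report

# Option 2: a single machine acting as both controller and load generator.
jmeter -n -t ecommerce_load.jmx -l results/run1.jtl -e -o results/run1-report
```

Here `-n` runs JMeter in non-GUI mode, `-t` names the test plan, `-R` lists the remote generators, `-l` writes the results log, and `-e -o` generate the HTML report at the end of the run.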

Technical Challenge in Setting up the Test Environment

The JVM requirements of the JMeter load generators posed a technical challenge: our team had to either use multiple virtual machines, each running its own JVM, or use a single virtual machine hosting one JVM with a very large heap. Analyzing the dry tests, our team observed that using remote machines as load generators caused failures in gathering load test reports in a few scenarios. Therefore, our team decided to use a 32 GB JVM heap to support the JMeter load test for 20,000 concurrent users.
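As a sketch of the single-large-JVM setup: recent JMeter startup scripts honor a `HEAP` environment variable, so the heap can be raised without editing the script itself. The test plan name below is a placeholder.

```shell
# Raise the JMeter JVM heap to 32 GB before a high-volume run.
# JMeter's bin/jmeter startup script reads the HEAP variable.
export HEAP="-Xms32g -Xmx32g"
jmeter -n -t ecommerce_load.jmx -l results/run.jtl
```

Setting `-Xms` equal to `-Xmx` avoids heap resizing pauses during the test, at the cost of committing the full 32 GB up front.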

Test Execution

Our team started the first test on the given server and test environment setup supporting 20,000 users. The test duration was fixed at 60 minutes, with a 30-minute ramp-up and a 30-minute ramp-down, and all 20,000 users held stable for 30 minutes.
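The load profile above can be sketched as simple arithmetic, assuming linear 30-minute ramp-up and ramp-down phases around a 30-minute hold at 20,000 users (illustrative only; JMeter's actual thread scheduling is configured in the test plan):

```shell
# Active virtual users at minute m of the profile:
# linear ramp-up (0-30), hold (30-60), linear ramp-down (60-90).
active_users() {
  local m=$1 total=20000 ramp=30 hold=30
  if   [ "$m" -le "$ramp" ]; then
    echo $(( total * m / ramp ))
  elif [ "$m" -le $(( ramp + hold )) ]; then
    echo "$total"
  else
    echo $(( total * (ramp + hold + ramp - m) / ramp ))
  fi
}

active_users 15   # halfway through ramp-up -> 10000
active_users 45   # steady state -> 20000
active_users 75   # halfway through ramp-down -> 10000
```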

About 20 minutes into the test, our team started seeing anomalies: response times were higher than expected. Our team handed a report of the issue to the client’s development team, who managed the application performance monitoring suite New Relic. While escalating the issue to the development team, our team also tried to identify any errors from the test side, but all our parameters matched and no issues were found.

The client’s development team observed that, of the four JVMs on the web servers, only one was receiving load. Since a load balancer was distributing the load, our team decided to continue the test, although we did not yet know whether the physical load balancer or the software load balancer was causing the issue. The client’s development and DevOps teams worked on the issue and identified a few misconfigured settings on the load balancer. After the client’s team changed the settings, our team decided to abandon the first test. There were no issues while executing the second test: all the load testing requirements were met, the SLAs were satisfied, and the tests passed. The third and fourth tests were conducted with more aggressive ramp-up/ramp-down times, and the results were studied.
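A quick way to spot this kind of imbalance from the test side is to sample responses and count which backend served each one. This is a hypothetical sketch: it assumes the application (or balancer) adds a backend-identifying response header, here called `X-Served-By`, and the URL is a placeholder.

```shell
# Sample 100 requests and tally which backend answered each one.
# X-Served-By is an assumed header name; adjust to whatever the
# application or load balancer actually emits.
for i in $(seq 1 100); do
  curl -s -o /dev/null -D - https://app.example.com/ \
    | grep -i '^x-served-by'
done | sort | uniq -c
```

With a correctly configured balancer the counts should be roughly even across the four backends; a single dominant entry reproduces the symptom observed here.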

Reporting and Tuning

Our team produced live reporting, a post-execution report, and a detailed analysis report. The live reporting was shared with the different teams at the client site along with a summary of the tests. Our team suggested several tuning measures and made sure that all performance issues were fixed at the code level before moving to higher-volume load tests.
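For the post-execution report, JMeter can regenerate its HTML dashboard from a saved results log after the run, which is convenient when a report needs to be produced or rebuilt separately from the test itself. File names below are placeholders.

```shell
# Generate the HTML dashboard report from an existing results log (.jtl).
# -g reads the results file, -o names an empty output directory.
jmeter -g results/run2.jtl -o results/run2-report
```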

Outcome

  • We helped the client mitigate the challenges and improve the application’s performance.
  • The client observed a 127% improvement in application performance compared to the previous results.
