How to Conduct Load Testing for Serverless Applications

Load testing is the process of simulating many concurrent users accessing a software application in order to mimic its expected real-world usage.

Performed prior to deployment, load testing aims to uncover performance bottlenecks and help ensure the stability and efficient operation of software applications.

It is a form of non-functional testing: it doesn’t assert the functionality of the code per se, but instead evaluates how the application performs as a whole under heavy load.
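The core idea can be sketched in a few lines. The snippet below is a minimal, stdlib-only illustration, not a real load test: a stand-in handler plays the role of the backend, and a thread pool simulates the concurrent users.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def handle_request(user_id):
    """Stand-in for a real backend call (assumed ~10 ms of work)."""
    start = time.perf_counter()
    time.sleep(0.01)  # simulated processing time
    return time.perf_counter() - start

def run_load(concurrent_users):
    """Fire `concurrent_users` simulated requests at once and collect latencies."""
    with ThreadPoolExecutor(max_workers=concurrent_users) as pool:
        return list(pool.map(handle_request, range(concurrent_users)))

latencies = run_load(50)
print(f"{len(latencies)} requests, max latency {max(latencies):.3f}s")
```

A real tool does the same thing at a much larger scale, against live endpoints, while recording far richer metrics.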

But First, Why Load Testing?

If the code can be optimized to improve performance, why do load testing at all?

Although standard practices can be adopted to write the best possible code, the application is more than developer-written code: it also includes components such as the server, the database, and the caching layer.

Load testing emphasizes testing the application as a whole under heavy traffic or stress. It is ironic that companies often invest more capital in the marketing that brings in traffic than in the testing that reveals how effectively the application can handle that traffic.

There are several excellent load-testing frameworks out there, of which Apache JMeter is the best known for its standard capabilities and range of testing techniques. Locust and Artillery are two more worth naming.

Sneak Peek into Serverless Architectures

For those who are new to the serverless model, it basically allows you to host the backend of your application without setting up or configuring a server. Roughly speaking, the backend is composed of a bunch of functions that correspond to different endpoints; instead of configuring a server, we abstract away such tuning and opt for a serverless architecture that removes the overhead of server configuration and makes backend deployment easier. AWS Lambda is a popular serverless offering.

It also provides out-of-the-box features like scaling, code signing, and support for container images.

Note that serverless does not mean that there are no servers involved; it just means the servers are provisioned for us without our having to configure anything manually.

Another important point worth highlighting: when we load test a serverless backend, we are not testing whether the service provider can scale; rather, we are checking whether the application we’ve designed can scale.

A Scenario that Suits Load Testing

Identifying the main business flows is necessary for testing how well your application will operate at scale. Chances are that not all users of a system are performing the same action.

It’s possible that one group of users is carrying out business process A while another group is carrying out business process B. There may be a third, considerably smaller group of users performing a process different from the other two; let’s call it C.

When these business operations are carried out together, the impact on your system can be very different from what it would be if they were carried out separately. For example, Business Process B may cause more resource contention while Business Process A is active than when each process runs alone.

Therefore, adding load to several business processes at once will provide us with a more accurate picture of how the system responds to stress.
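To sketch what such a mixed workload looks like, the snippet below assigns each simulated user one of the business processes in proportion to hypothetical weights (the 60/30/10 split is an assumption for illustration, not from any real system):

```python
import random
from collections import Counter

# Hypothetical traffic mix: most users run A, fewer run B, a small group runs C.
PROCESS_WEIGHTS = {"A": 60, "B": 30, "C": 10}

def simulate_user_mix(total_users, seed=42):
    """Assign each simulated user a business process, weighted by PROCESS_WEIGHTS."""
    rng = random.Random(seed)
    names = list(PROCESS_WEIGHTS)
    weights = [PROCESS_WEIGHTS[n] for n in names]
    picks = rng.choices(names, weights=weights, k=total_users)
    return Counter(picks)

mix = simulate_user_mix(1000)
print(mix)  # roughly 600 A, 300 B, 100 C
```

Driving all three processes concurrently, in realistic proportions, is what exposes the contention effects described above.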

The Process

Before load testing a serverless application, you obviously need a serverless backend up and running. The section below walks through the steps to follow; it is not an implementation guide.

The steps are as follows:

Assuming that your serverless backend is live, you can run the performance test directly. One approach is stress testing, which puts your system through intense loads to evaluate its performance at maximum capacity. In addition to load-testing KPIs like response time and error rate, stress testing checks for memory leaks, slowness, security concerns, and data corruption.
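As a rough sketch of the stress-testing idea, the snippet below steps the simulated load up until the observed error rate crosses a threshold. Here `run_step` is a stand-in for one real load-test round (it would normally drive actual traffic and report the measured error rate); it is not part of any tool's API.

```python
def stress_ramp(run_step, start=10, step=10, max_error_rate=0.05, limit=200):
    """Increase load stepwise until the error rate crosses the threshold.

    run_step(users) runs one load-test round at the given concurrency and
    returns the observed error rate (0.0-1.0). Returns the first concurrency
    level at which errors spike, or `limit` if none is found.
    """
    users = start
    while users <= limit:
        if run_step(users) > max_error_rate:
            return users  # approximate capacity: errors spiked here
        users += step
    return limit

# Toy model: pretend the backend starts failing beyond 120 concurrent users.
capacity = stress_ramp(lambda u: 0.0 if u <= 120 else 0.5)
print(capacity)  # -> 130, the first tested level above the toy capacity
```

The number this finds is the interesting output of a stress test: the point where the system stops degrading gracefully.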

JMeter can be used for this purpose: it is an open-source program made to evaluate functionality under load and gauge performance. Since it lets us load test each individual node in our system separately, or all of the major flows at once, it is particularly useful for serverless applications.

To perform load testing, a simple JMeter script can be created. Once the script has finished executing, you can move on to analyzing the results.

The results provide important log information about the execution and the responses at different loads. This helps the developer triage critical issues that should be handled immediately. They also give insight into other warnings and errors that likely caused the application to break at some point. Aggregating and analyzing these results is a great way to make the application scalable and able to handle large loads.
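The aggregation itself can be as simple as computing an error rate and latency percentiles over the recorded samples. The sketch below assumes each sample is an `(elapsed_ms, success)` pair, roughly the information one row of a JMeter results file carries per request:

```python
import statistics

def aggregate(results):
    """Summarize load-test samples given as (elapsed_ms, success) pairs."""
    elapsed = sorted(ms for ms, _ in results)
    errors = sum(1 for _, ok in results if not ok)
    n = len(results)
    return {
        "requests": n,
        "error_rate": errors / n,
        "avg_ms": statistics.mean(elapsed),
        "p95_ms": elapsed[int(0.95 * (n - 1))],  # nearest-rank 95th percentile
    }

# 95 fast successful requests plus 5 slow failures.
samples = [(100, True)] * 95 + [(900, False)] * 5
print(aggregate(samples))
```

Watching how these aggregates shift as the load grows is what turns raw logs into a scalability picture.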

Conclusion

It’s highly instructive to test our serverless applications under heavy demand. We may run into scaling problems during these tests and fix them before they affect real-world operations. To test applications effectively, there are different approaches and tools available, which should be selected based on your requirements.