Imagine you are working on a big monolithic application that has hundreds of features and no test cases written for it, and you change something in one module. Will you be confident that the change won't impact the running application? You will surely get some goosebumps.
To avoid such situations we write test cases and integrate them into the CI/CD pipeline, so that for every change the pipeline runs all the test cases and checks that the change doesn't break anything. And this is where Pytest, or any other testing toolkit, comes to the rescue.
But does it really solve all the issues?
Your test cases don't actually replicate the environment your production system runs in. Think of it this way: your test cases check whether a feature works or not. They don't check the feature under heavy usage.
Generally, test cases exercise only one process. But in production, hundreds or even thousands of concurrent users are using the same feature.
So the question is: how do you replicate the production environment and test your features under it?
Here is where load testing tools come into the picture. There are many tools in the market for this, but I am going to cover Locust, because it is written in Python and lets you write your test scripts in Python. And, most importantly, it is open source, and you can find the full documentation here. Enough talk, let's dive into the topic.
We need an application for this demo, so let's create a "super heavy" application like the one below.
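The original snippet isn't reproduced here, so below is a minimal sketch of such an app in Flask, assuming the three endpoints we test later (/hello, /world and /square); the /square/&lt;n&gt; parameter shape is my assumption:

```python
# app.py - a deliberately tiny Flask app standing in for a "super heavy" one
from flask import Flask, jsonify

app = Flask(__name__)

@app.route("/hello")
def hello():
    return jsonify(message="hello")

@app.route("/world")
def world():
    return jsonify(message="world")

@app.route("/square/<int:n>")
def square(n):
    # pretend this is the heavy computation
    return jsonify(result=n * n)
```

There is no `app.run()` here on purpose: we will serve it through Gunicorn in the next step.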
Run your application with Gunicorn with at least 4 workers to do the load testing.
To do that install gunicorn first
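Something like:

```shell
pip install gunicorn
```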
Run it via the below command
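Assuming the app object lives in `app.py` as `app` and we serve on port 5000 (the port used later in this post):

```shell
# 4 worker processes serving the Flask app object in app.py
gunicorn --workers 4 --bind 127.0.0.1:5000 app:app
```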
Generally, test cases are written inside the application repository itself. But since load testing is a kind of functional testing, you can put it in a totally separate repository.
Install the library using our favourite package manager
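For pip that is simply:

```shell
pip install locust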
The entry point of the Locust load testing toolkit is the locustfile. So let's create a simple test for our application in a locustfile.
Here we wrote three tests in one class for our super heavy and complicated application, because our application only has the /hello, /world and /square APIs in it. Once you are done with that, just fire the locust command to run it.
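Assuming the file is named `locustfile.py` in the current directory:

```shell
locust -f locustfile.py
```

By default, Locust then serves its web UI at http://localhost:8089.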
Just open the URL shown in the terminal. There you can point Locust to the application URL and set the number of concurrent users you want to simulate and their ramp-up period.
Let's say my application usually has 1000 concurrent users in production, so I want to load test it with 1000 concurrent users. Then click Start swarming.
Then you can watch Locust swarm your application.
You can then check your application's throughput, along with the spawn rate, requests per second and many other metrics. You can also see which APIs fail under heavy load and what exceptions they raise.
But generally we don't want these fancy UIs when we integrate Locust into CI/CD. For that, Locust also has command-line flags to specify the host URL and other options.
You can fire a command like this:
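A sketch matching the run described below (3 minutes, 100 users, host http://localhost:5000); the spawn rate of 10 users per second is my own choice here:

```shell
locust -f locustfile.py --headless \
       --users 100 --spawn-rate 10 --run-time 3m \
       --host http://localhost:5000
```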
This will run Locust for 3 minutes, spawning 100 users against the application host http://localhost:5000.
Locust will print a live stats table in the terminal, showing request counts, failures and response times per endpoint.
More Complicated Example
Honestly, the application we created is nowhere near a production application. Most production applications have some kind of user management, where some APIs sit behind an authentication mechanism. Let's integrate JWT into our application. As this blog is not about JWT, I will just use a basic library for it, and, as a matter of fact, I love this library.
Now add a /login endpoint to our application and put the /square endpoint behind JWT.
Just in case, below are the curl commands for the login and square APIs, because now we first have to log in, obtain an access token, and then pass that access token to the square API.
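Assuming the demo credentials and endpoint shapes sketched above:

```shell
# log in and get an access token back in the JSON response
curl -X POST http://localhost:5000/login \
     -H "Content-Type: application/json" \
     -d '{"username": "admin", "password": "secret"}'

# call the protected square API, passing the token from the response
curl http://localhost:5000/square/4 \
     -H "Authorization: Bearer <access_token>"
```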
Let's come back to the locustfile.
Modify your locustfile so that it first logs in to the application, acquires the access token and uses that token in some of the APIs.
You must have noticed that I have added a class variable, wait_time = between(1, 5). That comes in very handy when you want to reproduce a real-life scenario: you don't want to bombard your application with requests one after another. Real users take some time between requests, so this setting makes each simulated user wait between 1 and 5 seconds between consecutive tasks.
You may also have noticed the @task(3) decorator. Tasks are the micro-threads (test cases) that the Locust program runs. Passing 3 to a task assigns it a weight of 3, making Locust three times more likely to pick that task than a task with the default weight of 1. In a real-life scenario, some APIs, such as a dashboard, home or profile API, are generally called more often; you can replicate that using this configuration.
We have also added an on_start method that fetches the token from the login API and assigns it to an instance variable, so that the token can be used by the other test cases.
Use of FastHttpUser
Instead of the default HTTP user class, HttpUser, you can use FastHttpUser. As the name suggests, it is faster when you are bombarding your application with a large number of users.
FastHttpUser uses a different HTTP client (geventhttpclient) compared to HttpUser (python-requests).
It’s significantly faster, but not as capable.
Categorize test cases into classes
Categorize your tests into different classes in separate modules, and assign specific weights to whole classes to replicate a real-world traffic mix. That way you can also run only a specific class of test cases.
As you might already know, Locust runs on Flask 2 and uses gevent to create concurrent requests, which makes debugging quite difficult. To solve this, they provide run_single_user, a function that does not spawn any greenlets; to use it, you simply run the locustfile with the plain python command.
This was the gist of Locust load testing. If you liked it, do give it a try, and also read the full documentation here.
If you liked this story, you can buy me a coffee using this link. That will motivate me to write more blogs like this. Peace out.