Everything changed when the pandemic attacked
These days it’s impossible to go five minutes without hearing or reading about the coronavirus. I know, I know, it’s exhausting, and I hate to add to the ever-growing pile of articles that at least mention this topic, but I wanted to address a particular concern of our client’s, one brought on by the pandemic.
Due to the current circumstances, even those rare parts of our lives that had managed to stay offline recently had to move online. It’s no surprise, then, that average daily Internet usage has gone up significantly compared to, let’s say, three months ago. Everybody and their mother is on the lookout for the latest information about the virus. Those stuck at home who aren’t working have the perfect opportunity to binge their favorite TV shows all day, while those lucky enough to be able to work from home are forced to replace their face-to-face meetings with online ones. Perhaps the biggest shock came to those still in school, whose academic life (including lectures and exams) migrated entirely to the world wide web overnight.
This is where our client comes into play. As the owner of a popular online course platform, they were aware that their platform would be exposed to increased traffic during the exam period. Since it was imperative that students could access their video lessons during an exam, the client wanted some reassurance that their app (which is hosted on Heroku) would be able to handle a potential 3,000+ students accessing a particular course at the same time.
Defining the testing requirements and goals
Let me point out that they didn’t need an enterprise-level load test, nor an in-depth analysis of the app’s performance and possible bottlenecks. Therefore, what we looked for in a testing framework was the ability to set it up and run it as simply as possible. Besides that, we wanted our load test to hit the live app over the network, not a local instance or a phantom/headless browser setup (which many load testing tools use). Ultimately, the client’s concern wasn’t just the app itself but, perhaps more importantly, its Heroku setup. Is the upper limit that we have set for dyno autoscaling high enough? Are we in danger of running out of memory? What about running out of available database connections?
Now, there are many load testing tools available. Just a quick Google search will give you a plethora of results: LoadView, LoadNinja, SmartMeter.io, WebLOAD, NeoLoad... the list goes on and on. Many of these tools, however, require either a complicated setup or a fee, both of which we wanted to avoid. Some, such as Gatling, even require you to write your load tests as code. It goes without saying that this would take a lot of our time and was therefore out of scope for the task at hand.
Keep it simple, stupid
Probably every developer is familiar with this saying, as it is our guiding philosophy. Driven by this, we went for a wonderfully simple yet powerful solution - Apache Bench. Let me tell you what’s so great about this particular tool. First of all, as a macOS user, I already had Apache installed (including the Bench tool, `ab`). You’ll agree with me that setup can’t get any faster than that. Secondly, it’s an amazingly versatile tool due to its many configuration options - you can use it in combination with Basic Auth, you can set up custom headers, cookies, etc. What makes it especially reliable is the fact that it allows you to test your app with concurrent requests. Not only does this make for a more realistic simulation than sending requests one by one, but it also helps you identify potential race conditions. All you have to do is run a simple command in the terminal, and depending on the size and complexity of your test, you can get your results in mere minutes.
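To give you a feel for it, here’s the kind of invocation I mean. The URL, cookie value, and numbers below are placeholders, not our client’s actual setup - tune them to your own scenario:

```shell
# Fire 3000 requests total, keeping 100 in flight at any moment.
# -C attaches a cookie (e.g. a logged-in session), -H adds a custom header;
# for Basic Auth you would add: -A username:password
ab -n 3000 -c 100 \
   -C "session_id=PLACEHOLDER" \
   -H "Accept-Encoding: gzip" \
   https://your-app.herokuapp.com/courses/42
```

With `-c 100`, ab keeps 100 requests running concurrently until all 3,000 complete, which is exactly the kind of pressure that surfaces autoscaling limits and exhausted connection pools.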
Even at its least verbose level, you get stats on things like the number of completed and failed requests, the mean time per request, the mean time across all concurrent requests, the transfer rate, the total bytes transferred, min/mean/median/max times spent connecting, processing, and waiting, as well as a percentile breakdown of response times for the slowest 50% of requests.
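And because the summary is plain text, it’s trivial to pull out the headline figures and track them across runs. Here’s a sketch using a trimmed, made-up summary (the numbers are illustrative, not real measurements):

```shell
# A trimmed, illustrative ab summary (invented numbers, for parsing only):
cat > /tmp/ab_summary.txt <<'EOF'
Concurrency Level:      100
Complete requests:      3000
Failed requests:        0
Requests per second:    243.01 [#/sec] (mean)
Time per request:       411.50 [ms] (mean)
Transfer rate:          512.33 [Kbytes/sec] received
EOF

# Grab the numbers you care about with awk:
completed=$(awk '/Complete requests/ {print $3}' /tmp/ab_summary.txt)
failed=$(awk '/Failed requests/ {print $3}' /tmp/ab_summary.txt)
rps=$(awk '/Requests per second/ {print $4}' /tmp/ab_summary.txt)
echo "completed=$completed failed=$failed rps=$rps"
```

A couple of lines like this at the end of a test script are enough to compare runs while you tweak concurrency or dyno settings.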
It’s a remarkably simple tool that gets you results close to real-world performance, which lets it hold its own against far more sophisticated tools. Of course, it offers nothing beyond basic benchmarking functionality, but if you’re testing your app for the first time, Apache Bench is a great place to start and collect some preliminary metrics. In our case, it only took a couple of attempts to tweak the parameters to our liking, after which we were able to reliably test our client’s app.
Be mindful of your goals
As I mentioned at the beginning, Apache Bench won’t give you the most thorough analysis, but in this case, it was all we needed. There are plenty of other reasons you might feel the need for a load test. For example, you’ve noticed that your app is slow during certain time windows or while performing certain operations. Perhaps you’re adding new features and want to make sure the app’s performance isn’t negatively affected. You could be looking to identify bottlenecks and improve the overall speed of your app. If your app is also hosted on Heroku, perhaps you’re searching for the right dyno configuration, or you’re trying to avoid Heroku overload scenarios. Whatever your reasoning, keep it in mind while searching for the right tool, because it will greatly impact your decision.