
Case: Figure 1

Backend software performance and load testing for Figure 1
The output was exactly what we were looking for - it's a pleasure to work with a team that knows exactly how to approach a problem and can solve it with minimal guidance.
- Brandon Lyn, Engineering Director, Figure 1

Our client’s brief was to subject their cloud backend software to performance and load testing, with special attention on the REST API interface. Figure 1 also felt it was essential to find out whether scaling instances and the Kubernetes cluster would be a viable way to serve larger customer volumes, and what modifications implementing automatic scaling would require in their system.

Together we decided to bypass the public interfaces, as increasing the cost of the client’s cloud infrastructure was not in their best interest - instead, a conscious decision was made to carry out the testing inside the existing environment.

How was this accomplished?

We set up a distributed JMeter cluster, with a JMeter instance on each node provisioned for load testing within the client’s Kubernetes cluster. With this distributed setup we could generate load from multiple sources and mimic scenarios in which users joined in from around the world.
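
The exact manifests are client-specific, but the idea can be sketched with the Kubernetes Python client: pin one JMeter worker pod to each node reserved for load generation, so the controller can fan traffic out from several distinct sources. The node label, namespace, and container image below are illustrative assumptions rather than the actual configuration.

    from kubernetes import client, config

    config.load_kube_config()   # or load_incluster_config() when run inside the cluster
    v1 = client.CoreV1Api()

    # Assumption: the nodes reserved for load generation carry a dedicated label.
    nodes = v1.list_node(label_selector="workload=loadtest").items

    for i, node in enumerate(nodes):
        pod = {
            "apiVersion": "v1",
            "kind": "Pod",
            "metadata": {"name": f"jmeter-server-{i}", "labels": {"app": "jmeter-server"}},
            "spec": {
                "nodeName": node.metadata.name,     # pin one JMeter worker per load node
                "containers": [{
                    "name": "jmeter",
                    "image": "justb4/jmeter:5.5",   # illustrative community image
                    "command": ["jmeter-server"],   # run JMeter in distributed server mode
                    "ports": [{"containerPort": 1099}],
                }],
            },
        }
        v1.create_namespaced_pod(namespace="loadtest", body=pod)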

The client outlined how their front end generated load on the backend REST API during end-to-end-style use. Based on the data gathered, we created a series of test cases which we used throughout the load testing. The test series was then parameterized so it could easily be driven under a wide range of load scenarios, as sketched below.
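
In practice this meant one shared JMeter test plan whose thread counts, ramp-up, and duration are injected at run time. A minimal sketch of such a wrapper, using JMeter's standard -J user properties; the property names, plan file, and worker hostnames are assumptions for illustration.

    import subprocess

    def run_scenario(plan="figure1_api.jmx", users=100, ramp_up=60, duration=900, out="results.csv"):
        # Drive the same JMeter test plan under a given load scenario.
        # The plan reads these values with ${__P(users,...)} and friends.
        subprocess.run(
            [
                "jmeter", "-n", "-t", plan,                 # non-GUI run of the shared plan
                "-R", "jmeter-server-0,jmeter-server-1",    # remote JMeter workers (hostnames assumed)
                f"-Jusers={users}",
                f"-Jramp_up={ramp_up}",
                f"-Jduration={duration}",
                "-l", out,                                  # sample log used for reporting later
            ],
            check=True,
        )

    # For example, a short ramp test followed by a longer soak run
    run_scenario(users=200, ramp_up=120, duration=600, out="ramp.csv")
    run_scenario(users=500, ramp_up=300, duration=3600, out="soak.csv")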

To avoid the public interfaces, the load itself was driven through Kubernetes Services inside the cluster.
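
In concrete terms, the load generators targeted the backend by its cluster-internal Service address rather than a public endpoint. A tiny sketch, assuming a Service called backend-api in a prod namespace:

    import requests

    # Assumption: the backend is exposed inside the cluster as Service "backend-api"
    # in namespace "prod"; using the cluster-internal DNS name keeps all test traffic
    # off the public load balancer and ingress.
    BASE_URL = "http://backend-api.prod.svc.cluster.local:8080"

    resp = requests.get(f"{BASE_URL}/health", timeout=5)   # hypothetical health endpoint
    resp.raise_for_status()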

Analysis and reporting using the Grafana dashboard

An integral part of the process was to feed data from JMeter into the InfluxDB time-series database. On top of this we built a Grafana dashboard that combined the JMeter data with the client’s Kubernetes, cloud, database, and system-wide monitoring data.
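
JMeter can stream live results to InfluxDB through its backend listener, and the same database is then easy to inspect outside Grafana as well. The sketch below, assuming a loadtest database and JMeter's default jmeter measurement, shows the kind of query the dashboards were built on; host names and field details are illustrative.

    from influxdb import InfluxDBClient

    # Assumption: JMeter's InfluxDB backend listener writes into a "loadtest"
    # database using the default "jmeter" measurement; host and port are illustrative.
    db = InfluxDBClient(host="influxdb.monitoring", port=8086, database="loadtest")

    # Mean response time per transaction over the last 15 minutes - the same
    # series the Grafana dashboard plots next to the cluster and system metrics.
    result = db.query(
        'SELECT MEAN("avg") FROM "jmeter" WHERE time > now() - 15m GROUP BY "transaction"'
    )
    for (measurement, tags), points in result.items():
        print(tags.get("transaction"), list(points))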

JMeter also produced a CSV report containing the exact load data as well as extensive system response data. From this CSV, JMeter generated an HTML dashboard report in which more detailed graphs and figures were presented in a readable format.
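
JMeter's own report generator turns such a sample log into the HTML dashboard; a minimal sketch, with illustrative file and directory names:

    import subprocess

    # Generate JMeter's HTML dashboard report from an existing sample log.
    # The output directory must be empty or not yet exist.
    subprocess.run(["jmeter", "-g", "results.csv", "-o", "reports/html"], check=True)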

Once the Grafana dashboards had matured and were fed ample data from different parts of the system, supplemented with the information obtained from JMeter, they made it straightforward to perform an extensive analysis of the client’s backend system under load and across varying load scenarios.

Scripting: Easy for the client to continue testing

The load infrastructure setup, initialization, and teardown processes were scripted to be as parametric and effortless to use as possible. This ensured that the client could run the same load scenarios against a newer software version and determine whether each change made the software more efficient and stable under load. It also allowed different test scenarios to run in a clean environment and made test case development noticeably smoother.
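
A minimal sketch of what such scripting can look like, assuming kubectl access and a fresh namespace per run (the paths and naming are illustrative, not the client's actual scripts):

    import subprocess

    def kubectl(*args):
        subprocess.run(["kubectl", *args], check=True)

    def setup(run_id, manifests="loadtest/manifests"):
        # Bring up a clean, per-run load environment.
        ns = f"loadtest-{run_id}"
        kubectl("create", "namespace", ns)
        kubectl("apply", "-n", ns, "-f", manifests)   # JMeter workers, config, test data, etc.
        return ns

    def teardown(ns):
        # Remove everything created for the run so the next scenario starts clean.
        kubectl("delete", "namespace", ns, "--wait")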

Effortless cooperation

Because of the time-zone difference, an important goal for our load testing team was to work as independently as possible.

Being granted extensive access rights to Figure 1’s internal environment enabled us to keep the work smooth and consistent throughout the project.

We also created our own authentication service for Kubernetes, which allowed us to retrieve the token required to access the REST API at any stage. Even though building authentication software is time consuming, it proved extremely valuable throughout the development of our test cases - turning a complex process into a simple and fluent one.
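
The sketch below shows the general shape of such a helper: fetch a bearer token from the in-cluster authentication service and cache it until it expires. The endpoint, payload fields, and response format are hypothetical.

    import time
    import requests

    # Hypothetical in-cluster endpoint of the authentication helper service.
    AUTH_URL = "http://auth-helper.loadtest.svc.cluster.local/token"
    _cache = {"token": None, "expires_at": 0.0}

    def get_token(client_id, client_secret):
        # Reuse a cached token until shortly before it expires.
        if _cache["token"] and time.time() < _cache["expires_at"] - 30:
            return _cache["token"]
        resp = requests.post(
            AUTH_URL,
            json={"client_id": client_id, "client_secret": client_secret},
            timeout=10,
        )
        resp.raise_for_status()
        data = resp.json()
        _cache["token"] = data["access_token"]
        _cache["expires_at"] = time.time() + data["expires_in"]
        return _cache["token"]

    # Test cases then call the REST API with:
    #   headers={"Authorization": f"Bearer {get_token(...)}"}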

As we had to cope with the time difference between the client and our team, meetings were mainly held in the evenings and steered by well-prepared agendas. We felt it was important to demonstrate concrete evidence of progress to the client in each session. Slack was used for communication.

Added value was attained

Figure 1 was very pleased with the outcome of the project, as well as with the collaboration and professionalism of the NorthCode team.

NorthCode’s performance and load testing delivered precisely the added value Figure 1 set out to attain. Our testing experts were praised in particular for their independent approach, which freed up valuable time and energy for the client to use elsewhere.