System Performance Testing

If you are looking to purchase source code, note that performance testing is one of the stages of a software technical audit: it evaluates how a system performs under various workload conditions.

Below, we describe the methodology and stages of the system performance testing process we use.


The key performance indicators for FinTech software

Response time

The time it takes for the system to respond to a payment request, including all processing steps such as authentication, authorization, and settlement.


Throughput

The number of payment transactions that the system can process per unit of time, such as per second or per minute.


Availability

The percentage of time the system is available to users and customers. This is critical for financial services, where any downtime can result in significant financial losses for both the FinTech company and its customers.


Reliability

The ability of the system to operate without failures or errors over an extended period of time. This is critical for ensuring that payments are processed correctly and on time, and that sensitive financial data is not lost or compromised.


Scalability

The ability of the system to handle a growing number of payment requests without compromising performance or reliability. This is critical for FinTech companies that expect to grow and process more payments over time.


Security

The level of protection the payment processing system provides for sensitive financial data, and its ability to prevent fraudulent activities such as payment fraud or identity theft.
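As an illustration, the first three of these KPIs can be derived directly from a transaction log. The data shapes, field names, and numbers below are invented for the example, not taken from a real system:

```javascript
// Sketch: deriving core KPIs (response time percentile, throughput,
// error rate) from a list of completed transactions.

function percentile(sortedMs, p) {
  // Nearest-rank percentile on a pre-sorted array of response times (ms).
  const idx = Math.min(sortedMs.length - 1, Math.ceil((p / 100) * sortedMs.length) - 1);
  return sortedMs[Math.max(0, idx)];
}

function computeKpis(transactions, windowSeconds) {
  const times = transactions.map(t => t.responseMs).sort((a, b) => a - b);
  const errors = transactions.filter(t => !t.ok).length;
  return {
    p95ResponseMs: percentile(times, 95),
    throughputTps: transactions.length / windowSeconds,
    errorRatePct: (100 * errors) / transactions.length,
  };
}

// Example: four transactions observed over a 2-second window.
const sample = [
  { responseMs: 120, ok: true },
  { responseMs: 180, ok: true },
  { responseMs: 250, ok: false },
  { responseMs: 90,  ok: true },
];
const kpis = computeKpis(sample, 2);
```

In practice a load testing tool aggregates these figures automatically; the sketch only shows what each KPI means in terms of the raw data.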

A step-by-step process for conducting system performance testing for FinTech software

Defining performance goals and acceptance criteria

Setting clear performance objectives is essential for validating whether the system meets the requirements specified by stakeholders.

Important metrics that should be determined before performance testing include:

  • Response time under a given workload (TPS level)
  • Number of concurrent user sessions
  • Acceptable error count/rate

In addition, a service level agreement (SLA) is required for certain metrics (e.g., response time must not exceed 500 ms).
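Acceptance criteria of this kind are easiest to enforce when encoded as machine-checkable thresholds. The sketch below uses illustrative figures (500 ms, 1% errors, 200 concurrent sessions); real SLA values come from the stakeholders:

```javascript
// Sketch: acceptance criteria expressed as threshold checks.
// All threshold values here are examples, not recommendations.

const sla = {
  maxP95ResponseMs: 500,
  maxErrorRatePct: 1.0,
  minConcurrentSessions: 200,
};

function evaluateSla(measured, sla) {
  const failures = [];
  if (measured.p95ResponseMs > sla.maxP95ResponseMs) failures.push("response time");
  if (measured.errorRatePct > sla.maxErrorRatePct) failures.push("error rate");
  if (measured.concurrentSessions < sla.minConcurrentSessions) failures.push("concurrency");
  return { passed: failures.length === 0, failures };
}

// Example: a measurement that satisfies every criterion.
const verdict = evaluateSla(
  { p95ResponseMs: 450, errorRatePct: 0.4, concurrentSessions: 250 },
  sla
);
```

Load testing tools such as k6 support the same idea natively through threshold definitions, so a failed SLA check can fail the test run itself.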

Workload modeling

In this step, expected user behavior and system usage patterns are identified and modeled to create a test plan.

Along with the test plan, the preconditions are defined: the test data set and the system conditions required to execute the plan.
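One common way to express a workload model is as a load profile: a sequence of stages, each with a duration and a target number of virtual users, interpolated linearly (this is the shape used by ramping executors in tools such as k6). The stage durations and user counts below are illustrative:

```javascript
// Sketch: a ramping workload profile and a helper that answers
// "how many virtual users should be active at time t?"

const stages = [
  { durationS: 60,  targetVus: 100 }, // ramp up over the first minute
  { durationS: 120, targetVus: 100 }, // hold steady state
  { durationS: 30,  targetVus: 0 },   // ramp down
];

function vusAt(tS, stages, startVus = 0) {
  let from = startVus;
  let start = 0;
  for (const s of stages) {
    const end = start + s.durationS;
    if (tS <= end) {
      // Linear interpolation within the current stage.
      const frac = (tS - start) / s.durationS;
      return Math.round(from + frac * (s.targetVus - from));
    }
    from = s.targetVus;
    start = end;
  }
  return stages[stages.length - 1].targetVus;
}
```

Halfway through the ramp-up (t = 30 s) the model calls for 50 users; during the steady state it stays at 100; halfway through the ramp-down it is back to 50.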

Test environment setup

A test environment is set up to simulate production environments and test system performance in a controlled manner. This may include setting up virtual machines, configuring network settings, and installing the necessary software and hardware. 

At this stage, the scalability of the system is also tested. The workload for each configuration is defined, and the required number of environments is created to test different loads (by changing the amount of resources allocated to the application or the number of applications).

A number of toolsets can be used for this purpose. We have chosen one of the possible variants and list its software requirements below.

The software that needs to be configured

  • A Prometheus application as a monitoring system.

Its task is to collect metrics from the system under test and store them for further analysis.

  • The Grafana k6 load testing tool.

It executes test scenarios and emulates the workload.

  • An InfluxDB database instance as the data storage.

It stores the metrics produced by the k6 load testing tool.

  • A Grafana application as an analytics and interactive visualization tool, with Prometheus and InfluxDB as its data sources.
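As a hedged illustration only, a stack like this is often wired together with Docker Compose. The image tags, ports, and database name below are assumptions to be adjusted for your environment:

```yaml
# Illustrative docker-compose sketch for the monitoring stack described above.
services:
  prometheus:
    image: prom/prometheus
    ports: ["9090:9090"]
    volumes:
      - ./prometheus.yml:/etc/prometheus/prometheus.yml  # scrape config
  influxdb:
    image: influxdb:1.8
    ports: ["8086:8086"]
    environment:
      - INFLUXDB_DB=k6          # database that k6 will write into
  grafana:
    image: grafana/grafana
    ports: ["3000:3000"]
    depends_on: [prometheus, influxdb]
```

With the stack running, Prometheus and InfluxDB are added to Grafana as data sources, and k6 streams its results into the InfluxDB database.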


Test scenarios creation

After analyzing the system requirements and the test plan, it is necessary to define the test scenarios that will be used to emulate the system workload. The scenarios are executed with the k6 load testing tool, which can be configured for different patterns, e.g., high user traffic, peak usage, or steady-state usage.
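For illustration, a k6 script defining two of the scenario shapes mentioned above might look like the sketch below; the endpoint URL, user counts, and durations are placeholders:

```javascript
// Illustrative k6 script: a steady-state scenario followed by a traffic spike.
import http from 'k6/http';
import { check, sleep } from 'k6';

export const options = {
  scenarios: {
    steady_state: {
      executor: 'constant-vus',
      vus: 50,
      duration: '5m',
    },
    peak_usage: {
      executor: 'ramping-vus',
      startVUs: 0,
      stages: [
        { duration: '1m', target: 500 }, // spike up
        { duration: '2m', target: 500 }, // hold the peak
        { duration: '1m', target: 0 },   // ramp down
      ],
      startTime: '5m', // begins after the steady-state scenario
    },
  },
  thresholds: {
    // Fail the run if the 500 ms response-time SLA is violated.
    http_req_duration: ['p(95)<500'],
  },
};

export default function () {
  const res = http.get('https://example.com/api/payments'); // placeholder URL
  check(res, { 'status is 200': (r) => r.status === 200 });
  sleep(1);
}
```

The thresholds block ties the scenario directly to the acceptance criteria defined earlier, so an SLA breach fails the test run automatically.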

Test execution

The script with the pre-configured scenarios runs in the k6 tool to simulate the system workload. k6 stores the test results in the database for further analysis with the Grafana application.
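Assuming the scenarios live in a file named scenarios.js and k6 writes its results to a local InfluxDB v1 database named k6 (both names are illustrative), a typical invocation looks like:

```shell
# Run the scenarios and stream results to InfluxDB for Grafana to read.
k6 run --out influxdb=http://localhost:8086/k6 scenarios.js
```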

Test results analysis

The Grafana application is used to visualize the test results (metrics). Its dashboards and charts offer analytical insight into system performance and help identify bottlenecks and their causes.

Test reporting

A report is generated summarizing the performance test results, including any issues identified along with the root causes of those issues.

Get in touch to learn more about the platform technology

