AWS for SAP

SAP Load Testing: a serverless approach with AWS

Introduction

Conducting adequate load testing of SAP systems is a major factor in ensuring they can meet the performance and reliability expectations of the business under peak usage. Typical scenarios that require load testing are new company/country rollouts, software release upgrades from ECC to S/4HANA, application patching (e.g., support packages), S/4HANA transformation projects, or migration to SAP RISE. To ensure stable operations after such large-scale changes, it is recommended to perform load tests prior to any production cutover in order to avoid potential performance-related issues. With this blog, you will learn how to implement and use a load test platform on AWS to inject different types of load into an SAP ERP system deployed on premises or on RISE.

What is SAP Load Testing?

Load testing in SAP consists of systematically injecting various types of load into a system to measure its behavior under heavy load conditions. The process simulates multiple concurrent users accessing the system, executing various SAP transactions, processing large volumes of data, and testing business-critical processes, while measuring response times and resource utilization for proper performance evaluation.

Critical Importance of Load Testing

The importance of SAP load testing cannot be overstated for several key reasons. From a business continuity perspective, it prevents system crashes during peak business hours, ensures critical business processes such as month-end closing run smoothly, and maintains user productivity and satisfaction.

In terms of risk mitigation, load testing identifies performance bottlenecks before they impact operations, helps prevent costly system downtime, and reduces the risk of data processing errors. For resource optimization, it determines optimal hardware requirements, helps in capacity planning, and identifies areas for performance tuning. Regarding cost savings, proper load testing prevents over-provisioning of resources, reduces unexpected maintenance costs, and minimizes business disruptions. Without proper load testing, organizations risk system failures during critical business periods, loss of revenue due to system downtime, decreased user productivity, damaged business reputation, and increased maintenance costs.

Traditional Load Testing Challenges

The landscape of load testing has undergone a significant transformation in recent years. While established commercial tools (e.g., the Tricentis test automation suite or LoadRunner) have long been the go-to solutions for SAP ecosystems, their traditional approach comes with considerable costs that many organizations find increasingly difficult to justify: such tools are often expensive to procure and operate in terms of licenses, infrastructure and expert skills.

Serverless Load Testing with AWS

Modern load testing approaches have evolved to embrace serverless architectures, particularly using AWS native services. By leveraging services like AWS Lambda, Amazon EventBridge, AWS Batch, AWS Fargate, AWS Step Functions, AWS Systems Manager and Amazon CloudWatch, organizations can create scalable, cost-effective load testing solutions for SAP systems. This serverless approach eliminates the need for maintaining dedicated testing infrastructure and allows for on-demand test execution. The combination of AWS Step Functions to orchestrate the test scenarios and Amazon S3 or Amazon DynamoDB to store test results creates a robust, automated testing framework that can simulate thousands of concurrent users while providing detailed performance metrics and insights.

The benefits of serverless load testing are substantial: a pay-per-use pricing model (which, as detailed in the Costs section below, falls under the AWS Free Tier for many services), zero infrastructure maintenance, automatic scaling capabilities, and significantly reduced complexity in test setup and execution.

When to Perform Load Tests

SAP Load testing is a key part of the overall non-functional testing that SAP customers must consider. Load testing should be performed before going live with a new SAP implementation, after major system upgrades or patching, when adding new business processes, before peak business periods such as year-end closing, and when planning for business growth.

Planned business events that typically generate exceptional system loads, such as month-end or year-end financial closing, require proactive performance assessment to ensure smooth operations. Additionally, when organizations plan to integrate new business entities or implement new complex processes, load testing becomes essential to validate the system’s capacity to handle the increased complexity and volume.

To maintain consistent system performance, organizations should embed load testing within their Change Management strategy. This systematic approach helps identify potential performance impacts from application modifications before they affect production environments. By integrating automated testing capabilities, organizations can more efficiently validate system performance while establishing a foundation for maintaining quality through future upgrades and integrations. This proactive stance on performance testing ultimately helps ensure system reliability and optimal user experience across the SAP landscape.

Solution Architecture

SAP on AWS Native

Figure 1: SAP on AWS Native architecture

RISE with SAP

Figure 2: RISE architecture

Note: some code adjustments might be required for extracting OS metrics from SAP systems running within RISE.

Testing Scenarios

RISE Environment

Application Testing

Figure 3: RISE load test types

HANA Database Load Testing

This open-source solution generates realistic database workloads for SAP HANA systems, injecting large volumes of database INSERT/UPDATE/DELETE SQL operations using the xk6-sql extension and the k6 scripting capabilities. Here's an example:

import sql from "k6/x/sql";
import driver from "k6/x/sql/driver/hdb";
import secrets from "k6/secrets";

// retrieve credentials from the configured k6 secret source
const username = await secrets.get("username");
const password = await secrets.get("password");
// HANA host and port are passed in as environment variables
const hanaHost = __ENV.HANA_HOST;
const hanaPort = __ENV.HANA_PORT;

const db = sql.open(driver, `hdb://${username}:${password}@${hanaHost}:${hanaPort}`);

export function setup() {
  // create a throwaway column table for the test run
  db.exec(`
    CREATE COLUMN TABLE test_table (
      A INT GENERATED BY DEFAULT AS IDENTITY,
      B TEXT,
      C TEXT,
      D TEXT
    );
  `);
}

export default function () {
  // first insert
  db.exec(`
    INSERT INTO test_table (B, C, D)
    VALUES ('test', 'test', 'test');
  `);
  console.log("Row inserted");

  // then select
  const selectResult = db.query(`
    SELECT * FROM test_table
    LIMIT 10;
  `);
  console.log(`Read ${selectResult.length} rows`);
}

export function teardown() {
  // clean up the test table and close the connection
  db.exec(`DROP TABLE test_table;`);
  db.close();
}

The monitoring components, based on HANA native tools and Amazon CloudWatch, focus on transaction throughput and response times, capturing detailed resource utilization patterns across the system. They analyze error rates and performance bottlenecks while providing insights into memory and CPU impact.

SAP Fiori and IDoc Load Testing

A k6-based framework designed for testing high-volume IDoc processing ensures robust business document exchange capabilities. The system supports dynamic IDoc payload generation for various document types, for example sales orders (refer to the code repository for IDoc examples), with configurable load patterns including ramp-up, sustained, and peak testing phases. Deep integration with Amazon CloudWatch enables metrics collection and threshold-based performance validation.

Similarly, multiple parallel end-user interactions via the SAP Fiori frontend can be simulated using the same approach to test the end-user experience when the system enters a heavy-load state.

Performance monitoring focuses on two key areas. First, Fiori access or IDoc processing performance tracks throughput rates, processing times, queue behavior under load, and error patterns while validating end-to-end processing. Second, system impact analysis examines SAP response times, database performance, network utilization, and overall resource consumption patterns. Similarly to the k6 database scenario, a JavaScript script can be used for injecting sales order IDocs into an SAP system or simulating SAP Fiori user interactions:

import http from 'k6/http';
import { check, sleep } from 'k6';
import { Rate } from 'k6/metrics';
import encoding from 'k6/encoding';
import secrets from 'k6/secrets';

// get credentials and SAP client from the configured secret source
const username = await secrets.get('username');
const password = await secrets.get('password');
const sapClient = await secrets.get('sapClient');
// get the base URL from an environment variable
const sapBaseUrl = __ENV.SAP_BASE_URL;
// set the sap-client URL parameter if a valid 3-digit client was provided
let sapClientStringParameter = '';
if (sapClient.match(/^[0-9]{3}$/)) {
  sapClientStringParameter = `?sap-client=${sapClient}`;
}

// define the URL path of the plain XML IDoc inbound service
const urlPath = '/sap/bc/idoc_xml';
// build the final URL
const url = `${sapBaseUrl}${urlPath}${sapClientStringParameter}`;

// load the IDoc template and derive today's date in YYYYMMDD format
const xmlfile = open('./sample_idoc_ID1.xml');
const todayDate = new Date().toISOString().slice(0, 10);
const newDate = todayDate.replace(/-/g, '');

export const successRate = new Rate('success');

export const options = {
  vus: 5,
  duration: '60s',
  insecureSkipTLSVerify: true,
};

export default function () {
  const data = getAndConvertIdocXml();
  const credentials = `${username}:${password}`;
  const encodedCredentials = encoding.b64encode(credentials);

  const httpOptions = {
    headers: {
      Authorization: `Basic ${encodedCredentials}`,
      'Content-Type': 'text/xml',
    },
  };

  // post the IDoc and record whether the call succeeded
  const ok = check(http.post(url, data, httpOptions), {
    'status is 200': (r) => r.status === 200,
  });
  successRate.add(ok);

  sleep(5);
}

// fill the template placeholders with random IDoc and message numbers
function getAndConvertIdocXml() {
  let result = xmlfile.replace('{{GENERATED_IDOC_NUMBER}}', Math.floor(Math.random() * 100000000000000));
  result = result.replace('{{GENERATED_MESSAGE_ID}}', Math.floor(Math.random() * 100000000000000));
  return result;
}

A sample XML IDoc structure can be generated via transaction WE19 (IDoc test tool). Once the IDoc has been generated, the XML needs to be converted into a template with the string placeholders GENERATED_IDOC_NUMBER and GENERATED_MESSAGE_ID, which are filled at runtime when the load test script is executed.

Note: please ensure the proper ALE configuration (e.g., sender/receiver partner number and port) is set with the right values for your SAP system, so that inbound IDocs are processed successfully with status 53 (Application document posted).

Both testing approaches deliver significant advantages through realistic business process simulation and scalable test scenarios. The comprehensive monitoring framework, coupled with cost-effective open-source tools, provides deep integration with AWS services for enhanced visibility and control.

SAP on AWS Native Environment

In addition to the RISE environment options, the following tests can be performed when running SAP natively on AWS.

Infrastructure Testing

Figure 4: infrastructure load test types

Figure 4 shows three main testing scenarios for SAP systems on AWS, focusing on different infrastructure aspects:

Compute Testing Scenario:

This scenario evaluates the computational aspects of SAP systems through systematic stress testing using operating system tools. For CPU load simulation, similarly to AWS Fault Injection Simulator (FIS), tools like ‘stress-ng‘ are employed to generate precise CPU utilization patterns: for example, using stress-ng to maintain 75% CPU load across all cores, or memory stress testing to consume 80% of available RAM. These tools, combined with SAP’s native performance metrics (e.g., gathered by saposcol / the Workload Monitor), provide comprehensive insights into instance type comparisons between Intel and AMD processors, helping determine optimal compute configurations for specific SAP workloads. The testing methodology includes gradual load increments, sustained high-utilization periods, and monitoring of system behavior under various computational stress conditions.
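Assuming stress-ng is installed on the instance, the CPU and memory patterns described above can be reproduced with commands along these lines (durations are illustrative):

```shell
# hold ~75% CPU load on all cores for 10 minutes (--cpu 0 = one worker per core)
stress-ng --cpu 0 --cpu-load 75 --timeout 600s --metrics-brief

# consume ~80% of available RAM with memory workers for 10 minutes
stress-ng --vm 1 --vm-bytes 80% --timeout 600s --metrics-brief
```

Running these alongside saposcol/CloudWatch collection lets you correlate injected OS load directly with SAP response time metrics.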

Storage Testing Scenario:

This scenario focuses on storage performance optimization through comprehensive testing with the Flexible I/O tester (FIO). FIO enables precise measurement of storage performance characteristics through various test patterns such as random read/write testing, sequential read performance and IOPS testing for EBS volumes. For further details on FIO benchmarking on AWS please see this link. FIO tests are executed across different EBS volume types (e.g., gp3 vs io2) to analyze:

  • Maximum throughput capabilities
  • IOPS performance under various workloads
  • Latency patterns at different queue depths
  • Storage system behavior under sustained load
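As an illustration, a random write pattern of the kind listed above could be sketched as follows; the target path and sizes are assumptions and should point at a dedicated test volume, never at production data:

```shell
# random 4 KiB writes at queue depth 32 against a 4 GiB test file
# on a mounted EBS test volume (direct I/O bypasses the page cache)
sudo fio --name=ebs-randwrite --filename=/ebs-test/fio.dat --size=4G \
    --rw=randwrite --bs=4k --iodepth=32 --numjobs=4 --direct=1 \
    --runtime=120 --time_based --group_reporting
```

Swapping --rw=randwrite for read or randread covers the sequential read and random read patterns mentioned above.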

Networking Testing Scenario:

This scenario examines network performance and connectivity by measuring critical network parameters that affect SAP system performance. A key feature is the network delay simulation using operating system commands like ‘tc‘ (traffic control) in Linux. This allows precise control over network conditions by introducing artificial latency, packet loss, and bandwidth limitations. For instance, using commands such as ‘tc qdisc’ enables the simulation of real-world network conditions, including:

  • Introducing specific latency values (e.g., 300ms delay)
  • Simulating packet loss scenarios (e.g., 5% packet loss)
  • Creating bandwidth throttling conditions
  • Implementing jitter in network communications
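A minimal sketch of such impairments with tc/netem, assuming eth0 is the relevant interface:

```shell
# add 300 ms delay with 20 ms jitter and 5% packet loss on eth0
sudo tc qdisc add dev eth0 root netem delay 300ms 20ms loss 5%

# verify the active qdisc configuration
tc qdisc show dev eth0

# remove the impairment once the test is finished
sudo tc qdisc del dev eth0 root
```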

These network impairment simulations provide valuable insights into how SAP systems behave under various network conditions. This approach helps organizations understand their SAP system’s resilience to network issues and optimize their network configurations accordingly. This scenario can be used to introduce additional latency across interconnected SAP systems (e.g., ERP ↔ BW) and measure data extraction processes, simulating a migration project where ERP runs on premises and BW has been migrated to AWS.

Load Test Monitoring

Real-time Monitoring Options

CloudWatch Dashboards

Custom Amazon CloudWatch dashboards present critical performance metrics in a unified view, combining infrastructure metrics, application performance data, and custom test metrics into comprehensive visualizations. Key performance indicators are organized into logical groups: SAP technical metrics such as Dialog and Database Response Time, number of system dumps, active users for SAP operations teams, and OS metrics such as CPU, memory usage, storage IOPS and throughput for Infrastructure operations teams. The dashboards feature automated alerting based on predefined thresholds, enabling proactive response to performance issues during test execution. Historical data retention allows trend analysis and performance comparison across different test runs, providing valuable insights for capacity planning and system optimization.
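Custom test metrics such as those above can be published from the test harness with a single CloudWatch API call; the namespace, metric name and dimensions below are illustrative examples, not part of the solution's fixed schema:

```shell
# publish a custom SAP dialog response time data point (values are examples)
aws cloudwatch put-metric-data \
    --namespace "SAP/LoadTest" \
    --metric-name DialogResponseTime \
    --unit Milliseconds \
    --value 850 \
    --dimensions SystemId=PRD,InstanceNumber=00
```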

Figure 5: Cloudwatch dashboard metrics during load tests

Amazon Managed Grafana (optional)

As an alternative or complement to CloudWatch dashboards, Amazon Managed Grafana offers enhanced visualization capabilities and deeper analytical features. The service enhances the monitoring experience through advanced data correlation and custom metrics, providing rich visualization options with both pre-built and custom panels. It supports cross-account and cross-region metric aggregation, enabling comprehensive monitoring of complex SAP landscapes. Team-based access control and dashboard sharing facilitate collaboration across different stakeholder groups, while native integration with AWS services (in this scenario Amazon CloudWatch) and external data sources expands monitoring capabilities. Real-time metric exploration and ad-hoc analysis capabilities support immediate investigation of performance issues, complemented by automated dashboard provisioning through infrastructure as code. The built-in alerting and notification channels ensure timely response to performance anomalies. The combination of CloudWatch metrics and Grafana visualizations creates a powerful monitoring ecosystem that supports both real-time operational monitoring and long-term performance analysis, enabling data-driven decisions for SAP system optimization.

Figure 6: Grafana dashboard metrics during load tests

In both cases, SAP NetWeaver-specific metrics can be gathered in CloudWatch by using this open-source solution.

Load Testing Workflow

Orchestration and Execution Flow

The workflow follows a structured sequence of events across AWS services to execute and monitor SAP system load tests:

  1. Test Initiation: an AWS Step Functions state machine initiates the testing workflow based on end-user input in a dedicated web application UI, orchestrating the entire process and controlling test execution and monitoring. The input payload determines which scenario is executed (SAP, HANA or Infrastructure).
  2. Test Execution: dedicated Lambda functions, authenticating via AWS Secrets Manager, execute the load testing scenarios. The functions contain the logic and parameters for each specific testing scenario.
  3. System Access: AWS Systems Manager and Fargate provide secure access to the SAP landscape within the VPC, enabling test execution across:
    • SAP application servers for application-level testing
    • SAP HANA database for database-level testing
    • Operating system level for CPU, memory, storage and network testing
  4. Monitoring and Data Collection:
    • Amazon CloudWatch collects and processes performance metrics from the test execution, gathering data from all SAP components and infrastructure elements.
  5. Real-time Visualization Options:
    • Direct CloudWatch dashboards for real-time monitoring
    • Amazon Managed Grafana (optional) for advanced visualization and analysis, with optional integration with AWS IAM Identity Center (successor to AWS Single Sign-On) for authentication
  6. Long-term Data Processing & Analysis:
    • Performance metrics are stored in Amazon S3
    • AWS Glue processes and transforms the data
    • Amazon Athena enables SQL-based analysis of the results
    • Amazon QuickSight (optional) generates performance insights and visualizations
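As an example of the System Access step, an OS-level test command can be dispatched to the SAP instances via Systems Manager Run Command. The document name is the standard AWS-RunShellScript, while the tag key/value and the command itself are assumptions for illustration:

```shell
# run a short CPU stress burst on all instances tagged as SAP app servers
aws ssm send-command \
    --document-name "AWS-RunShellScript" \
    --targets "Key=tag:Role,Values=sap-app-server" \
    --comment "load test: CPU stress burst" \
    --parameters 'commands=["stress-ng --cpu 0 --cpu-load 75 --timeout 300s"]'
```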

To further improve and streamline the end-user experience, a React application has been developed where users, securely authenticated via Amazon Cognito, can start load tests directly and monitor their status. The application is hosted on Amazon CloudFront and interacts with the Step Functions workflows through Amazon API Gateway.
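The same Step Functions workflow that the UI triggers can also be started from the command line; the state machine ARN and the input payload schema below are hypothetical placeholders:

```shell
# start a load test execution directly against the state machine
aws stepfunctions start-execution \
    --state-machine-arn "arn:aws:states:eu-west-1:123456789012:stateMachine:sap-load-test" \
    --input '{"scenario": "HANA", "vus": 50, "duration": "60s"}'
```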

This workflow ensures comprehensive test execution, monitoring, and analysis while maintaining security through VPC isolation and proper access controls.

Implementation

Get started today:

  • Clone the GitHub repository
  • Follow our step-by-step deployment guide
  • Run your first load test in your SAP system – from setup to results – in under 60 minutes!

Costs

Service | Cost | Description
--- | --- | ---
Step Functions | Free Tier | Includes 4,000 free state transitions per month
Lambda | Free Tier | One million free requests and 400,000 GB-seconds of compute time per month
Secrets Manager | $0.40 per secret per month | 1 secret required for storing user credentials and other parameters
Systems Manager | Free | No additional charges for Run Command
CloudWatch | $3.00 per month | Basic monitoring metrics + 10 custom metrics
CloudFront | $0.12 per month |
Cognito | $6.22 per month | 5 users with 100 token requests per month and 1 app client
ECR | $0.07 per month | 2 images, 750 MB total per month
Fargate | $6.32 per month | 16 vCPUs, 16 GB memory, 10 tasks or pods per month, 60 minutes duration
API Gateway | $0.35 per month | 10,000 monthly REST API requests
S3 | Free Tier | 5 GB storage in S3 Standard, 20,000 GET requests, 2,000 PUT/COPY/POST/LIST requests, 100 GB data transfer out monthly
Glue | Free Tier | Free for the first million objects stored
Athena | $0.29 per month | 20 queries per day, 100 MB scanned data per query
Managed Grafana | $9.00 per month (optional) | 1 editor, 5 users
QuickSight | $33.00 per month (optional) | 5 users
Total | $16.77 per month | Without Grafana / QuickSight
Total | $25.77 per month | With Grafana
Total | $58.77 per month | With Grafana and QuickSight

If only reporting is required, QuickSight can be used on its own. As an alternative to Grafana, CloudWatch dashboards can also be used (with some limitations).

Please find more details on costs at this link

Conclusions

In this blog post, we’ve explored how to leverage AWS serverless services to perform SAP load and performance testing. By adopting cloud-native services, we’ve demonstrated an efficient and cost-effective approach to conducting extensive load tests for SAP environments while gathering deep, actionable metrics. Our approach showcases how modern cloud capabilities can transform traditional performance testing into a more scalable, cost-effective, and data-driven process for SAP deployments.

Read more on AWS for SAP blogs to get inspiration on how you can get more out of your SAP investment. Get started with SAP on AWS today and start your load testing strategy in your SAP landscape!

Join the SAP on AWS Discussion

In addition to your customer account team and AWS Support channels, AWS provides public question and answer forums on our re:Post site. Our AWS for SAP Solution Architecture team regularly monitors the AWS for SAP topic for discussions and questions that could be answered to assist our customers and partners. If your question is not support-related, consider joining the discussion over at re:Post and adding to the community knowledge base.