

AWS Transform FAQs

General


AWS Transform is an agentic AI service designed to support large-scale modernization of full-stack Windows workloads (including .NET and SQL Server), transformation of mainframe applications to modern languages and architectures, migration of VMware workloads to Amazon EC2, and custom transformations for code, APIs, frameworks, and more.

You can access AWS Transform’s unified web experience tailored for large-scale modernization and team collaboration at https://console.aws.amazon.com/transform/home. For custom transformations for code, APIs, frameworks, and more, the service operates through both CLI and web interfaces. For select .NET applications requiring developer attention, developers can also use AWS Transform in Visual Studio IDE.

AWS Transform’s approach to migration and modernization differs from traditional tools on three fundamentals. First, AWS Transform offers specialized task agents for various tasks – ranging from network generation to extracting business rules from COBOL to porting .NET code. These agents combine specialized knowledge built on years of experience with enterprise-specific context. Second, the service uses agentic AI to orchestrate execution of these expert task agents, unique to each workload. Depending on the task, orchestration ranges from deterministic execution to goal-driven, dynamic plans. The product focuses on getting the job done, integrating humans in the loop or invoking coding agents. Third, learning capability is built in at each level. The agents continually self-debug, improve outcomes, and provide recommendations for next steps.

To get started, sign in to the AWS Transform web experience with your current enterprise credentials. If you are a new customer, you can use single sign-on (SSO) with AWS IAM Identity Center integration and connect it to an AWS account to get started. Alternatively, you can set up direct federation with Okta or Microsoft Entra. To learn more, see AWS Transform User Guide.

Assessment


AWS Transform assessments analyze your IT environment to simplify and optimize your cloud journey with intelligent, data-driven insights and actionable recommendations. Discover cost and performance optimization opportunities while getting detailed financial modeling to help you confidently plan your migration and maximize potential savings.

The workflow begins with uploading your existing server inventory to the AWS Transform platform. Once your data is in place, you can specify your target AWS Region and then instruct AWS Transform to generate your business case. AWS Transform analyzes your server inventory and identifies the most suitable and cost-effective Amazon EC2 instance for each server. The resulting business case provides a clear, data-driven projection of how your current on-premises environment could map to AWS services, offering valuable insights for your migration planning and decision-making.

AWS Transform supports a variety of data collection methods for x86 servers, whether virtual or physical, from on-premises environments. The service accepts server inventory data from several widely used assessment tools. These include exports from RVTools, data collected through the AWS Transform discovery tool or the AWS Migration Evaluator agentless collector, and AWS Migration Portfolio Assessment (MPA) exports generated by tools like modelizeIT and Cloudamize.
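As a rough sketch of the kind of pre-upload sanity check you might run yourself, the snippet below validates an RVTools-style inventory export. The column names ("VM", "CPUs", "Memory") are illustrative assumptions, not the tool's exact schema.

```python
import csv
import io

# Columns assumed for illustration; a real RVTools export has many more.
REQUIRED_COLUMNS = {"VM", "CPUs", "Memory"}

def validate_inventory(csv_text):
    """Return rows that pass basic checks and a list of problems found."""
    reader = csv.DictReader(io.StringIO(csv_text))
    missing = REQUIRED_COLUMNS - set(reader.fieldnames or [])
    if missing:
        raise ValueError(f"missing columns: {sorted(missing)}")
    rows, problems = [], []
    for i, row in enumerate(reader, start=2):  # line 1 is the header
        if not row["VM"].strip():
            problems.append(f"line {i}: empty VM name")
            continue
        rows.append(row)
    return rows, problems

sample = "VM,CPUs,Memory\nweb-01,4,16384\n,2,8192\ndb-01,8,65536\n"
rows, problems = validate_inventory(sample)
```

Catching empty or malformed rows before upload avoids an iteration with the assessment job over obviously bad data.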

After the assessment job completes, AWS Transform provides a summary of the assessment, an opportunity to ask questions about the costs and recommendations, and the option to download a PDF version of the business case for offline review and sharing.

The business case includes key highlights from the server inventory, a summary of current infrastructure, and multiple total cost of ownership (TCO) scenarios with varying purchase commitments (on-demand and reserved instances), operating system licensing options (bring your own licenses and license-included) and tenancy options (dedicated and shared). The business case also includes actionable next step recommendations.
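As a toy illustration of how the purchase-commitment scenarios in a TCO comparison differ, the rates below are made-up placeholders, not AWS pricing:

```python
# Hypothetical hourly rates for the same instance type under two
# commitment models; real figures come from the generated business case.
HOURS_PER_MONTH = 730

def monthly_cost(hourly_rate, instances):
    return hourly_rate * HOURS_PER_MONTH * instances

on_demand = monthly_cost(0.10, instances=20)  # placeholder on-demand rate
reserved = monthly_cost(0.06, instances=20)   # placeholder 1-yr reserved rate
savings_pct = round(100 * (on_demand - reserved) / on_demand)
```

The business case runs this kind of comparison across licensing and tenancy options as well, which is why several TCO scenarios appear side by side.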

AWS Transform assessments provide directional estimates that approximate the cost of AWS services based on your current server configurations and assumed usage patterns. While these estimates are helpful for initial planning purposes, they should be viewed as guidance rather than exact figures. Actual AWS costs may vary depending on your specific implementation, resource optimization choices, and real-world usage patterns. It's important to note that these estimates are not quotes and should not be interpreted as guarantees of your final AWS service costs. For more precise cost planning, we recommend working with your AWS account team or an AWS Partner who can help analyze your specific requirements and usage patterns in detail.

AWS Transform Assessment and AWS Migration Evaluator are both valuable tools for planning cloud migrations. Assessment is a fast, self-service capability of AWS Transform, designed specifically for organizations looking to migrate x86 servers from on-premises environments to AWS. It uses existing server inventory data to provide targeted Amazon EC2 instance recommendations and generate quick TCO estimates. This streamlined approach is ideal for companies seeking a rapid, focused assessment of their migration options. AWS Migration Evaluator offers a more comprehensive, expert-led assessment service. Guided by AWS Solutions Architects, this in-depth evaluation encompasses a broader range of analyses, including detailed data collection, storage assessment, sustainability evaluation, and Microsoft SQL Server analysis. Migration Evaluator is best suited for organizations that require thorough migration planning and want expert guidance throughout the process.

AWS Transform has a built-in AI chat capability so you can ask for more details or clarification about instance mapping, licensing and tenancy suggestions, and next step recommendations. For further support or additional analysis for other workload types, engage with your account team or partner or contact us.

Windows


AWS Transform enables you to accelerate transformation time by 5x compared to manual porting and reduce operating costs by as much as 70%. The service achieves this through simultaneous transformation of hundreds of applications and Microsoft SQL Server databases to Amazon Aurora PostgreSQL, with human-in-the-loop (HITL) supervision. Transformed applications can be deployed as containers on Amazon EC2 or Amazon ECS, and databases to Amazon Aurora PostgreSQL clusters.

AWS Transform for Windows includes two main components: transforming .NET Framework applications to cross-platform .NET and migrating Microsoft SQL Server to Aurora PostgreSQL databases along with the dependent .NET application. 

AWS Transform for .NET accelerates modernization of Windows-based .NET Framework applications to cross-platform .NET for Linux environments. It connects to your source code repositories in GitHub, GitLab, Azure Repos, or Bitbucket and performs a comprehensive analysis focused on three key areas: repository dependencies, required private packages and third-party libraries, and supported project types. Based on this analysis, it generates a transformation plan for these repositories and highlights any missing dependencies that you can resolve by uploading packages yourself. During the transformation process, AWS Transform for .NET converts application code, builds the output, runs unit tests, and commits results to a new branch in your repository. You can then deploy the transformed application as a container on Amazon EC2 or Amazon ECS.

AWS Transform for SQL Server modernization accelerates migration of your Microsoft SQL Server databases and applications to Aurora PostgreSQL. It connects to your SQL Server databases running on Amazon EC2 or Amazon RDS to discover the schemas and stored procedures in your databases. It then performs a detailed analysis of databases and applications to create waves of applications and databases that can be transformed together based on dependency relationships. It then transforms SQL Server schemas to Aurora PostgreSQL and migrates databases to new or existing Aurora PostgreSQL target clusters. For .NET application transformation, the service updates database connections in the source code and modifies ORM code in Entity Framework and ADO.NET to be compatible with Aurora PostgreSQL — all in a unified workflow with human supervision.

In both workflows, AWS Transform provides a comprehensive transformation summary, including modified files, test outcomes, and suggested fixes for any remaining work. Your teams can track transformation status through interactive chat or worklogs, and they receive email notifications with links to the transformed .NET code in your repositories. For workloads that need further processing, your developers can continue using AWS Transform in the Visual Studio IDE.

AWS Transform for Windows discovers the repositories in your account and identifies supported project types in each repo. It supports porting console applications, class libraries, Web APIs, WCF Services, Model View Controller (MVC), Single Page Application (SPA), and unit test projects (xUnit, NUnit, and MSTest frameworks) to cross-platform .NET (full list available here). In addition, AWS Transform for Windows ports MVC Razor views to ASP.NET Core Razor views, ports ASP.NET Web Forms UIs to Blazor on ASP.NET Core, ports Entity Framework and ADO.NET ORM code for Aurora PostgreSQL compatibility, ports WinForms, WPF, and Xamarin projects to cross-platform .NET, and supports VB.NET language projects.

After identifying project types, it analyzes these projects for dependencies on other projects, private packages, and third-party libraries. Based on the dependency analysis, AWS Transform for Windows recommends a transformation plan that orders repositories according to their last modification dates, dependency relationships, and private package requirements.
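The dependency-relationship ordering described above can be sketched with a topological sort: repositories that others depend on come first. The repo names below are hypothetical, and Python's standard `graphlib` stands in for the service's internal planner.

```python
from graphlib import TopologicalSorter

# Hypothetical repo dependency map: each key lists the repos it depends on.
deps = {
    "web-frontend": {"shared-lib"},
    "billing-api": {"shared-lib"},
    "shared-lib": set(),
}

# static_order() yields dependencies before their dependents, so the shared
# library is transformed first -- the same ordering principle a
# dependency-aware transformation plan applies.
order = list(TopologicalSorter(deps).static_order())
```

Transforming `shared-lib` first means both downstream repos build against an already-ported dependency rather than a missing one.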

You can download the analysis report to evaluate the recommended plan and review it with your team. You also have the option to customize the recommended plan by editing the selection in the console or by uploading a modified file with your preferred selection. Administrators and approvers can review and approve the plan before proceeding with the transformation process.

During transformation, the selected source code repositories from your approved plan are securely fetched into a network-isolated execution environment for transformation to cross-platform .NET. AWS Transform for .NET supports transforming applications written using .NET Framework 3.5+, .NET Core 3.1, .NET 5, .NET 6, and .NET 7 to cross-platform .NET 8 (LTS) and .NET 10, and supports the database access frameworks Entity Framework and ADO.NET.

After porting, AWS Transform runs a full .NET build to identify any build errors and runs an AI-led evaluation loop to auto-remediate issues. This process is repeated across all supported projects within the repositories. After the transformation job is completed, the transformed code is committed back to your source code repository in your chosen target branch for review.

For .NET source-code repositories that have successfully completed transformations with zero build errors, AWS Transform executes unit test projects, if present, and provides those execution results for your review. For repositories that have partially transformed projects, unit test projects are ported but are not run. You can resolve remaining issues yourself before running the unit tests.

AWS Transform also supports deploying the transformed applications to a target environment for customers to validate the transformed applications.

AWS Transform for Windows first discovers the databases that are running in your AWS account. It then identifies the servers the databases run on, along with the schemas and stored procedures associated with each database. It also analyzes the source code repositories to identify database dependencies, embedded SQL queries, and database access code written in Entity Framework and ADO.NET. Based on this analysis, it creates customizable wave plans so that databases and applications can be transformed together.

You can download the analysis report to review the modernization recommendations, the complexity of the transformations, and the databases and source code repositories that invoke them.

AWS Transform for Windows transforms SQL Server in 3 steps: 1) Schema conversion, 2) Data migration, and 3) Code transformation.

During database schema conversion, the schemas from the selected databases are converted from Microsoft SQL Server schemas to Aurora PostgreSQL-compatible schemas. If there are issues in the schema conversion, AWS Transform for Windows will automatically run an AI-led evaluation loop to auto-remediate the issues. The process is repeated across all the schemas in the databases in the wave. Similarly, if there are stored procedures in the SQL Server databases, they are ported to be compatible with Aurora PostgreSQL databases as well. Once the schemas are successfully converted, they will be applied to the target Aurora PostgreSQL databases.
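To make the schema conversion step concrete, here is a toy sketch of the kind of type rewrites involved. The mapping is illustrative and far from complete; the actual conversion also handles stored procedures, constraints, identity columns, and much more.

```python
import re

# A tiny, illustrative subset of T-SQL to PostgreSQL type mappings.
TYPE_MAP = {
    "NVARCHAR(MAX)": "TEXT",
    "DATETIME2": "TIMESTAMP",
    "BIT": "BOOLEAN",
    "UNIQUEIDENTIFIER": "UUID",
}

def convert_column_types(ddl):
    """Rewrite SQL Server column types into PostgreSQL equivalents."""
    for tsql, pg in TYPE_MAP.items():
        ddl = re.sub(re.escape(tsql), pg, ddl, flags=re.IGNORECASE)
    return ddl

ddl = "CREATE TABLE orders (id UNIQUEIDENTIFIER, note NVARCHAR(MAX), placed DATETIME2);"
converted = convert_column_types(ddl)
```

Naive text substitution like this is exactly where conversions go wrong at scale, which is why the service pairs conversion with an AI-led remediation loop rather than a fixed mapping table.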

After the schemas are fully transformed for your target PostgreSQL database, you have the option to migrate your data from your SQL Server databases to Aurora PostgreSQL databases. During this stage, AWS Transform for Windows migrates your data to the transformed PostgreSQL databases. If there are any issues during the migration process, you will be informed of them and given a migration report to troubleshoot the failures.

Finally, the source code repositories are updated to match the PostgreSQL target database created. The connection strings are updated to match the PostgreSQL database, embedded SQL code is ported to be compatible with PostgreSQL, and Entity Framework and ADO.NET are updated to match the new database. After the transformation is completed, the updates are committed to a new source code repository branch that you had provided. You can review a detailed transformation summary of the updates that AWS Transform performed during this step.
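A minimal sketch of what a connection-string update can look like is shown below. The key names and the Npgsql-style output format are assumptions for illustration, not the service's exact output.

```python
# Rewrite a SQL Server connection string into a PostgreSQL-style one.
# Key names ("User Id") and output shape are illustrative assumptions.
def to_postgres_conn(sqlserver_conn, host, database):
    parts = dict(
        kv.split("=", 1) for kv in sqlserver_conn.split(";") if kv.strip()
    )
    return (
        f"Host={host};Database={database};"
        f"Username={parts.get('User Id', 'app')};Port=5432"
    )

old = "Server=sql01;Database=Orders;User Id=app_user;Password=placeholder;"
new = to_postgres_conn(
    old,
    host="orders.cluster-abc.us-east-1.rds.amazonaws.com",  # hypothetical endpoint
    database="orders",
)
```

In practice credentials would come from a secrets store rather than being carried over literally, and the embedded SQL and ORM code are updated alongside the connection string.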

For .NET code transformations, you can track all modification actions through detailed transformation reports provided for each repository in natural language. These reports outline the files, APIs, and private NuGet packages that were modified, moved, or updated during the process. When repositories are partially transformed, the summary report includes specific details about build errors and schema transformation failures, along with recommendations for resolving these issues. All transformed source code is committed to a new target branch that you specify during the job, allowing you to check out the branch and review the code changes performed by AWS Transform.

For SQL Server modernization, you can monitor schema conversion and data migration actions through reports available after the transformation steps are completed. These reports are accessible both immediately after transformation and through the Migration Project page in the AWS Data Migration Service (AWS DMS) Console. Similar to .NET transformations, you can track source code changes from the feature branch. Additionally, you can validate the transformation results by examining the deployed database schemas and stored procedures in your target PostgreSQL database.

In the web experience, you can monitor transformation progress in real time through two main methods. The interactive chat provides dynamic updates and responses based on the current job plan and context of your questions, accessing a comprehensive knowledge base about ongoing jobs and actions. The worklogs offer detailed documentation of all actions performed by AWS Transform for Windows on your source code and databases, including user approvals and audit trails.

When transforming .NET applications in the Visual Studio IDE, progress monitoring is available through the AWS Transform Hub. This interface displays the estimated remaining time, detailed transformation steps, and an activity worklog.

Additionally, you'll receive comprehensive transformation summary reports for each repository, detailing modified files, API changes, and updates to private NuGet packages.

Upon job completion, you'll receive an email notification containing deep links to review the transformed repositories. 

For .NET code transformations: AWS Transform provides a detailed transformation summary report including a next steps markdown file that outlines remaining tasks, such as Linux readiness issues and database access code updates. You can either use this information to initiate another transformation with AWS Transform or use it as guidance for an AI code companion.

For SQL schema conversion and data migration: The schema conversion report shows the percentage of successfully transformed schema and provides guidance for completing unfinished work. You can address remaining schema conversions using either the AWS Database Migration Service (AWS DMS) console's schema conversion page or IDEs like DBeaver. For data migration errors, you can review the data migration report to address the migration issues.

You are the owner of the code ported by AWS Transform for full-stack Windows modernization. Once the porting of source code is completed, the transformed code is committed to a branch of your choice in your repository. AWS Transform does not store any copy of the transformed code after it has been committed to the branch.

The same ownership principle applies to database schemas transformed using AWS Transform and AWS DMS. You own all converted schemas and can download, modify, and upload them to your target database. AWS Transform does not retain any schema information after job completion.

The AWS Transform .NET agent gets access to your source code through the AWS CodeConnections service, which must be approved by an IT admin for your AWS account before the source code can be accessed. It then analyzes your code to identify inter-project dependencies and private packages used within the projects and recommends a transformation plan. The service is designed to securely and ephemerally clone your .NET solution, allowing you to use customer managed KMS keys for encrypting your code in this environment. Customer managed KMS keys give you full control over your keys, including managing policies, grants, tags, and aliases for accessing data.

Your source code processed by AWS Transform is stored only for the duration of the job and purged after the job is completed. Your trust, and the privacy and security of your content, are our highest priority. We implement appropriate controls, including encryption in transit, to prevent unauthorized access to or disclosure of your content and to ensure that our use complies with our commitments to you.

AWS Transform securely analyzes your database schemas through a database connector, requiring explicit IT admin approval from your AWS account. Similarly, access to source code repositories is managed through AWS CodeConnections service, also requiring IT admin approval.

Database access is secured through AWS secret keys and user credentials that you provide to the AWS Transform agent. During schema conversion, the transformed schemas are deployed directly to your target Aurora PostgreSQL database within your specified AWS account, VPC, and subnet.

AWS Transform maintains strict security protocols throughout the process, never storing database information permanently. All database conversion information is deleted after job completion, and transformed code is committed only to your designated feature branch without any retention after the job is finished. This process ensures your database code and schemas remain secure throughout the transformation process while maintaining complete control within your AWS environment.

Mainframe


AWS Transform for mainframe is an agentic AI-powered service designed to accelerate the modernization of legacy mainframe applications. Customers can define high-level modernization goals and leverage a specialized AI agent to orchestrate the necessary tools and processes. The agent analyzes applications, generates documentation, extracts business logic, decomposes monolithic structures, transforms legacy code, automates testing, and manages modernization tasks, offering human-in-the-loop oversight where desired.

Key capabilities of AWS Transform include flexible, goal-driven planning, classification of application assets, planning and documentation generation with business logic extraction, comprehensive testing capabilities, automated refactoring that converts COBOL-based mainframe workloads into modern, cloud-optimized Java applications, and AI-powered reimagination capabilities.

AWS Transform empowers customers to modernize their critical mainframe applications faster, more cost-effectively, and with confidence that their business-critical logic will be preserved throughout the transformation process.

AWS Transform for mainframe supports both reimagine and refactor modernization patterns, offering flexible pathways to modernize legacy mainframe applications.

Refactoring with AWS Transform automates the transformation of COBOL-based mainframe applications into modern Java applications running on AWS, using agentic AI to analyze codebases, generate documentation, decompose monoliths, plan modernization waves, automate testing functions, and accelerate code refactoring while maintaining functional equivalence to the legacy stack.

Reimagining with AWS Transform enables transformation of mainframe applications to cloud-native architectures, leveraging automated analysis to convert monolithic applications into modern, agile solutions that can fully utilize cloud-native capabilities. Through a chat-centric, flexible agent experience, AWS Transform analyzes code and data, extracting information for technical and business documentation that drive the forward engineering of reimagined workloads.

A key feature of AWS Transform is its ability to break down monolithic mainframe applications into modular, business-aligned domains, and then generate comprehensive modernization waves. Using the business logic extraction in conjunction with the decomposition step helps break down monoliths into logical business domains.

Leveraging automated reasoning and planning capabilities, AWS Transform analyzes your codebase, identifies discrete functional areas, and organizes the application assets accordingly. It then creates detailed, prioritized modernization plans that consider factors like business priorities, technical complexity, and constraints. Through data and activity analysis, AWS Transform can also help identify application components with low utilization or minimal business value, enabling more informed decisions about target architecture.

This domain-driven decomposition and thoughtful planning allows you to tackle the modernization in manageable, iterative steps. By providing this visibility and structure up front, AWS Transform empowers you to focus your efforts, make informed decisions, and execute the modernization quicker.

AWS Transform for mainframe offers testing capabilities designed to reduce the time and effort required for mainframe modernization testing, which typically consumes over 50% of project duration. This includes automated test plan generation, test data collection scripts creation, and test case automation scripts creation. The service also includes a refactored functional test environment with tools for continuous regression testing, data migration, and results validation.

These agentic AI-powered capabilities work together to reduce dependency on scarce mainframe expertise, accelerate testing timelines, and improve accuracy through automation, helping customers modernize their mainframe applications with greater confidence and efficiency.

Yes, AWS Transform for mainframe is modular, allowing you to leverage its capabilities for as many or as few phases of the modernization journey as you choose. For example, when reimagining an application, you might initially focus on analysis across codebase, data structures, and activity, and later layer in documentation to inform the forward engineering of the reimagined application.

Inventory collection encompasses various mainframe components including COBOL programs, copybooks, Job Control Language (JCL), procedures and parameter cards, and DB2 definitions. If available, Customer Information Control System (CICS), Information Management System Transaction Manager (IMS TM), and CSD files should be loaded to determine entry points.

The extraction process begins by downloading source elements through text mode, converting each member into individual source files. Files should be organized in a structured directory system that reflects their origin, language, type, and application/sub-application relationships (for example, C:\Mainframe\APP1\Cobol\Program1.CBL or \Mainframe\APP1\JCL\JCL1.txt). If no file extension is provided, AWS Transform will determine the appropriate extension based on the file contents to classify the member.
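The extension-based classification described above can be sketched as follows. The extension names are common mainframe conventions used for illustration, not an exhaustive or official list, and the content-based fallback is reduced to a placeholder.

```python
from pathlib import PurePosixPath

# Illustrative extension-to-type conventions for a mainframe inventory tree.
EXTENSION_TYPES = {
    ".cbl": "COBOL program",
    ".cpy": "Copybook",
    ".jcl": "JCL",
    ".prc": "Procedure",
}

def classify(path):
    """Classify a member by extension; fall back when none is present."""
    ext = PurePosixPath(path).suffix.lower()
    # AWS Transform inspects file contents when the extension is missing;
    # here that fallback is just a placeholder label.
    return EXTENSION_TYPES.get(ext, "unclassified (inferred from contents)")

inventory = [
    "Mainframe/APP1/Cobol/Program1.CBL",
    "Mainframe/APP1/Copybooks/CUST.cpy",
    "Mainframe/APP1/JCL/NIGHTLY.jcl",
    "Mainframe/APP1/Parm/PARM01",  # no extension
]
types = {p: classify(p) for p in inventory}
```

Keeping the directory layout consistent with language and type, as recommended, makes this kind of classification trivial and reduces the amount of content inference needed.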

The collected inventory is then compressed into a zip file and uploaded to an S3 bucket. The process might be iterative, with an initial upload followed by subsequent iterations of missing components until reaching satisfactory completeness.

After code transformation, you have the option to use pre-built Infrastructure as Code (IaC) templates to deploy your modernized applications. These templates are accessible through the AWS Transform chat interface, helping create the necessary compute resources, databases, storage, and security controls. Templates are available in AWS CloudFormation, AWS CDK, and Terraform formats.

You can also use the Reforge step to enhance your transformed Java code with improved readability and maintainability before deployment. To use this feature, provide your refactored code and a Java class list specifying which service classes to reforge. AWS Transform will generate downloadable files containing the reforged code.

AWS Transform provides the ability to specify files within your source code to generate documentation. You can choose summary overviews of file collections or detailed functional specifications for each file. The detailed specifications include logic flows, input/output processing, and other transactional details.

Once generated, you can access this documentation by viewing files in the AWS Transform interface or downloading them in XML or PDF formats. Additionally, the AWS Transform chat function allows you to query the documentation to better understand your documents, such as asking about specific file purposes or functionality.

The Analyze step, required for all mainframe jobs, examines source code provided in the S3 bucket and generates several key insights. AWS Transform classifies file types and provides metrics including total lines of code, comment lines of code, effective lines of code, and cyclomatic complexity (the number of linearly independent paths through a program's source code). The analysis identifies missing and duplicated files, including files that share the same name or program ID. It also generates dependency mapping between files for use during the decomposition step. This information helps you understand the state of your source code before proceeding with modernization.
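As a crude sketch of the cyclomatic complexity metric, one common approximation is 1 plus the number of decision points. Counting COBOL keywords with a regex, as below, is a deliberate simplification of what a real analyzer computes.

```python
import re

# Decision-point keywords counted for the approximation (not exhaustive).
DECISION_KEYWORDS = r"\b(IF|EVALUATE|WHEN|PERFORM\s+UNTIL)\b"

def cyclomatic_complexity(source):
    """Approximate complexity as 1 + number of decision points."""
    # Drop scope terminators (END-IF, END-EVALUATE, ...) so they are
    # not double-counted as decisions.
    text = re.sub(r"\bEND-\w+", "", source.upper())
    return 1 + len(re.findall(DECISION_KEYWORDS, text))

cobol = """
    IF BALANCE > 0
        PERFORM UNTIL DONE = 'Y'
            EVALUATE TXN-TYPE
                WHEN 'D' ADD AMT TO BALANCE
            END-EVALUATE
        END-PERFORM
    END-IF.
"""
score = cyclomatic_complexity(cobol)
```

A higher score flags programs with more independent paths, which typically need more test cases during modernization.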

AWS Transform makes mainframe modernization more accessible to business stakeholders through automated business logic extraction. This capability extracts business rules, functional groups, and entry points from the source code, helping stakeholders recover lost knowledge about the business logic of their applications. Additionally, developers can leverage these insights to quickly understand legacy system functionality without deep mainframe expertise.

VMware


AWS Transform for VMware provides three key advantages. First, AWS Transform orchestrates your entire migration journey, boosting team productivity. Second, it automates complex and labor-intensive migration tasks, including wave planning and network conversion. This simplification accelerates migrations, reduces errors, and minimizes the need for in-house expertise, fast-tracking your time to value. Finally, AWS Transform customizes your migration journey by understanding your specific migration goals and analyzing your source environment.

Yes, AWS Transform for VMware is designed to migrate your complex VMware workloads and multi-tier applications. Its technology identifies intricate application dependencies and relationships, even in large, complex environments. It then groups related servers into logical application groups that need to be migrated as a single migration wave. For instance, when migrating a 500 VM environment, AWS Transform might identify that 50 VMs need to be migrated as a single unit due to tight coupling. This capability is particularly valuable for customers with interconnected legacy systems or microservices architectures. You can define flexible business rules to group your applications and automatically generate migration waves that suit your needs.
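The grouping idea in the paragraph above can be sketched as finding connected components of the server dependency graph: servers that communicate end up in the same migration wave. Server names and edges below are hypothetical, and the traversal stands in for the service's dependency analysis.

```python
from collections import defaultdict

# Hypothetical observed server-to-server connections.
connections = [("web-01", "app-01"), ("app-01", "db-01"), ("batch-01", "db-02")]

def wave_groups(edges):
    """Group servers into connected components (one group per wave)."""
    graph = defaultdict(set)
    for a, b in edges:
        graph[a].add(b)
        graph[b].add(a)
    seen, groups = set(), []
    for node in list(graph):
        if node in seen:
            continue
        stack, component = [node], set()
        while stack:  # iterative depth-first traversal
            cur = stack.pop()
            if cur in component:
                continue
            component.add(cur)
            stack.extend(graph[cur] - component)
        seen |= component
        groups.append(component)
    return groups

groups = wave_groups(connections)
```

Tightly coupled tiers (web, app, database) land in one group and migrate together, while unrelated workloads form their own waves.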

To get started, sign in to the AWS Transform web application using your current enterprise credentials. If you are a new user, your account administrator must first enable AWS Transform and add you as a user through AWS IAM Identity Center for single sign-on (SSO) access. For VMware migrations, AWS Transform will guide you to upload asset inventory from third-party tools or from the AWS Transform discovery tool. You can start a new project in AWS Transform by specifying your goals.

AWS Transform for VMware is the first agentic AI–powered assistant for large-scale migration of VMware workloads to Amazon Elastic Compute Cloud (Amazon EC2). It simplifies and accelerates your migration by allowing you to specify goals, generate plans to meet those goals, and conduct approved actions on your behalf.

AWS Transform for VMware streamlines the entire migration journey: it analyzes your environment, builds an understanding of your application inventory and dependencies, and proposes logical application groups for migration waves using server and network data. It orchestrates dependency-aware migrations to minimize downtime, recommends right-sized Amazon EC2 instances, and allows for seamless collaboration across teams.

AWS Transform builds the job plan dynamically to meet your specific needs.

AWS Transform supports the following capabilities:

  • Discovery: Perform discovery of your on-premises environment
  • Planning: Generate a wave plan to suit your business needs
  • Network migration: Configures and generates IaC for deployment. AWS Transform can also automate the deployment
  • Rehost: Migrate servers to EC2

AWS Transform will tailor your job plans based on your goals, including any combination of:

  • End-to-end migration: Performs discovery, generates wave plans, configures VPC networks, and migrates servers.
  • Network migration only: Focuses solely on generating and deploying VPC configurations.
  • Network-and-server migration: Configures and deploys VPC networks, then migrates servers without discovery.
  • Discovery and server migration: Performs discovery, generates wave plans, and migrates servers without network configuration.

Yes, AWS Transform analyzes the configuration and utilization data of your source VMs to recommend appropriate EC2 instance types for your migrated workloads. It considers factors like CPU, memory, storage, and network requirements to suggest cost-effective and performance-optimized instances. You can review and adjust these recommendations before migration.
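A toy sketch of the right-sizing idea: pick the smallest instance type that covers a VM's observed vCPU and memory needs. The catalog below is a tiny, hypothetical subset of instance types, not current specifications or pricing, and real recommendations also weigh storage, network, and utilization history.

```python
# (name, vCPUs, memory GiB), ordered smallest first -- illustrative only.
CATALOG = [
    ("m5.large", 2, 8),
    ("m5.xlarge", 4, 16),
    ("m5.2xlarge", 8, 32),
]

def recommend(vcpus_needed, mem_gib_needed):
    """Return the first (smallest) catalog entry that fits both dimensions."""
    for name, vcpus, mem in CATALOG:
        if vcpus >= vcpus_needed and mem >= mem_gib_needed:
            return name
    return "no fit in catalog"

# A VM observed using 3 vCPUs and 12 GiB of memory.
rec = recommend(vcpus_needed=3, mem_gib_needed=12)
```

Because recommendations are driven by observed utilization rather than provisioned capacity, an over-provisioned VM can map to a smaller, cheaper instance than its on-premises footprint suggests.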

AWS Transform for VMware helps you discover source servers using multiple data collection methods. It plans your migration to AWS using the configuration data collected about your source servers and databases, applying machine learning (ML) techniques to build your migration waves. It supports several ways of performing discovery and collecting data about your source servers.

The AWS Transform discovery tool can conduct centralized discovery by deploying the Discovery Collector (OVA file) through your VMware vCenter. The discovery tool can discover VM configuration for business case generation and migration planning, resource utilization for right-sizing recommendations, and database metadata and server-to-server connections for application dependency mapping to enable wave plan creation. You can also use RVTools exports to provide CSV or Excel format files that contain detailed information about your VMware environment, including vSwitches, port groups, and VLANs. Additionally, you can export discovery data from select third-party tools (including Cloudamize, Matilda Cloud, and ModelizeIT) for use in AWS Transform for migration planning.

AWS Transform now supports migration of networks and applications to multiple target accounts, conversion of network configuration from additional data sources (Cisco ACI, Palo Alto, and FortiGate), and management of rehost transitions at both the wave and the server level.

Currently, AWS Transform only supports migrating source VMware environments to Amazon EC2. While AWS Transform does not support automated migration of source VMware environments to Amazon Elastic VMware Service (Amazon EVS), it understands your migration goals and provides guidance on migrating to Amazon EVS by using VMware Hybrid Cloud Extension (HCX) for your use case.

AWS Transform for VMware implements comprehensive encryption for your data both in transit and at rest:

Data in transit:

  • All communications between your environment, AWS Transform for VMware, and AWS services use Transport Layer Security (TLS) 1.2 or higher encryption.
  • Data replication from your source servers to AWS utilizes encrypted connections for secure transfer.
  • API calls between AWS services involved in your migration are automatically encrypted as part of AWS standard security practices.

Data at rest:

  • By default, AWS Transform for VMware encrypts data stored in Amazon S3 buckets using AWS managed encryption keys.
  • You have the option to use your own customer-managed AWS KMS keys for enhanced control and security over the encryption process.
  • Replicated server data stored during migration is encrypted according to AWS Application Migration Service standard encryption practices.
  • Metadata and configuration information stored by AWS Transform for VMware is encrypted using AWS standard encryption mechanisms.

This comprehensive encryption approach helps ensure your migration data remains protected throughout the entire migration process, aligning with security best practices and helping you meet compliance requirements for data protection.

Important Note: AWS Transform creates Amazon S3 buckets on your behalf in your target AWS accounts. These buckets do not have SecureTransport enabled by default. If you want the bucket policy to include SecureTransport, you must update the policy yourself. For more information, see Security best practices for Amazon S3.
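The SecureTransport condition mentioned above follows a standard S3 bucket-policy pattern. The sketch below builds such a policy document in Python; the bucket name is a placeholder, and you should verify the statement against the Amazon S3 security best practices documentation before applying it:

```python
import json

# Placeholder bucket name -- substitute the bucket AWS Transform created for you.
BUCKET = "example-transform-artifacts"

# Standard S3 pattern: deny any request that does not arrive over TLS.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "EnforceSecureTransport",
            "Effect": "Deny",
            "Principal": "*",
            "Action": "s3:*",
            "Resource": [
                f"arn:aws:s3:::{BUCKET}",
                f"arn:aws:s3:::{BUCKET}/*",
            ],
            "Condition": {"Bool": {"aws:SecureTransport": "false"}},
        }
    ],
}

print(json.dumps(policy, indent=2))
```

The printed JSON can then be attached with the S3 `put-bucket-policy` API, for example `aws s3api put-bucket-policy --bucket <bucket> --policy file://policy.json`.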

Yes, AWS Transform for VMware lets you avoid using the public internet for data replication. You can establish private connectivity using AWS Direct Connect for a dedicated, high-bandwidth link or an AWS Site-to-Site VPN for an encrypted tunnel between your data center and AWS. These options keep migration traffic secure and off the public internet while improving performance with more predictable network conditions. When setting up replication, you can configure AWS Transform to use your private connection, making it ideal for large-scale migrations with sensitive or high-volume data.

AWS Transform for VMware stores your migration data in several places:

  • Your AWS accounts: AWS Transform creates S3 buckets in your target accounts to store your migration data, artifacts, and configuration information. You maintain full control over these buckets and can choose the encryption keys used.
  • AWS Transform workspace: Your data is processed in the AWS Region where you created your AWS Transform workspace to generate migration recommendations and plans.
  • Temporary service storage: For certain migration jobs, customer data is securely and temporarily uploaded to an artifact store in the AWS service account in the same region as your source account. This data is used for processing and is automatically deleted if the job or account is deleted.
  • Service metrics storage: Calculated migration metrics and assessment results are stored in AWS service accounts in S3 and CloudWatch for service improvement and operational monitoring.
  • Replication data: Stored in EBS snapshots and volumes in your target AWS account.

While AWS Transform creates S3 buckets with basic security configurations, including encryption at rest, we strongly recommend implementing additional S3 bucket security best practices to fully protect your data, such as enforcing encryption in transit, enabling access logging, and implementing appropriate bucket policies.

AWS Transform for VMware operates in two distinct ways when it comes to Regional availability:

Workspace Regions: These Regions host the AI workspaces where discovery data is processed, assessments are conducted, wave planning occurs, and right-sizing recommendations are generated. Currently, workspace Regions include:

  • US East (N. Virginia)
  • Asia Pacific (Mumbai)
  • Asia Pacific (Seoul)
  • Asia Pacific (Sydney)
  • Asia Pacific (Tokyo)
  • Canada (Central)
  • Europe (Frankfurt)
  • Europe (London)

You should choose your workspace Region based on compliance requirements for data processing. For example, European customers with data residency requirements should select Europe (Frankfurt) to ensure their configuration data remains within Europe during analysis.

Target migration Regions: These are the AWS Regions into which AWS Transform for VMware can migrate your workloads.

Under the shared responsibility model, you are responsible for selecting the appropriate Regions that meet your data residency and compliance requirements. If you choose a target Region that differs from your workspace Region, be aware that data will be transferred across AWS Regions during the migration process, and you'll need to evaluate this against your data governance policies.

For the most up-to-date information on supported Regions, refer to AWS Services by Region.

Custom

Open all

AWS Transform custom uses agentic AI to perform large-scale custom modernization of software, code, libraries, and frameworks to reduce technical debt. It handles diverse scenarios including version upgrades (Java 8 to 17, Python 3.9 to 3.13), runtime migrations (x86 to Graviton), framework upgrades and transitions (Spring Boot upgrades, Angular to React), refactoring (observability instrumentation), and organization-specific transformations.

The service includes ready-to-use, AWS-managed transformations for common use cases, such as Java, Node.js, Python upgrades and AWS SDK upgrades. For other scenarios and organization-specific needs, you can create custom transformations through natural language interactions, documentation, and code samples. Through continual learning, the agent improves from every execution and developer feedback, delivering high-quality, repeatable transformations without requiring specialized automation expertise.

The custom transformation agent has four core components:

  1. Natural language-driven transformation definition: Allows teams to generate organization-specific transformations using natural language interactions, documentation, and code samples. The AI agent generates an initial transformation definition that can be iteratively refined through chat, additional examples, or direct edits.
  2. Transformation execution across codebases: Applies transformation definitions reliably and consistently across multiple codebases. AWS Transform custom uses configurable build commands to build and verify the transformed code. Using the AWS Transform web applications, you can set up a large-scale campaign to transform multiple codebases and track its progress.
  3. Continual learning: Automatically captures feedback and improves over time from every execution to enhance transformation accuracy and effectiveness. AWS Transform custom analyzes all execution data and automatically generates improved versions of transformation definitions, ensuring each subsequent transformation becomes more reliable and efficient.
  4. AWS-managed transformations: Provides ready-to-use, AWS-managed transformations for common upgrade scenarios like Java, Python, and Node.js version upgrades. These transformations are vetted by AWS to be high quality and are ready to use without any additional setup.

AWS Transform custom adapts to your workflow. For large-scale modernization projects, you can apply repeatable transformations across multiple codebases following these phases:

Phase 1: Define transformation (optional): For custom transformations, provide natural language prompts, reference documents, and code samples to the AI agent, which generates an initial transformation definition. You can iteratively refine the definition through chat, additional examples, or direct edits, then test and verify the transformation on sample codebases before publishing it for use across the organization. For AWS-managed transformations, you can skip this phase and use ready-made transformations.

Phase 2: Perform pilot or proof of concept: Validate that the transformation produces the expected results by performing a pilot on a subset of the target code. For custom transformations, this phase is sometimes combined with validation of the transformation definition. Pilots can also be used to estimate the cost of the transformations in terms of time and AWS Transform custom usage.

Phase 3: Scaled execution: After the pilot, the transformation is tweaked based on pilot results. Note that AWS Transform custom continual learning will have improved the quality during the pilot. In scaled execution, teams can set up automated bulk executions where AWS Transform CLI executes transformations in batches and creates resulting code to be reviewed by individual teams, or teams can execute the CLI directly for full control, which is sometimes preferable for more complex transformations.

Phase 4: Monitor and review: Concurrently to scaled executions, you can monitor the execution progress and review and approve learnings extracted by AWS Transform custom continual learning.

AWS Transform custom can be accessed through two interfaces:

AWS Transform CLI (Command Line Interface)

The CLI is used to create new custom transformations interactively and execute transformations on local codebases either interactively or autonomously. It is deployed as a simple, scriptable CLI that can be integrated with any source control system or deployment pipeline. The CLI is intentionally minimal and composable, and can be run on individual developer machines, in a container, or as part of your organization's greater modernization framework.

AWS Transform Web application (optional)

The AWS Transform web application is used to start and monitor large-scale transformation projects across multiple repositories. It allows you to select the transformation you want to execute at scale and track real-time progress updates on transformation execution.

AWS-managed transformations are pre-built, AWS-vetted transformations for common upgrade scenarios that are ready to use without any additional setup:

Currently available:

  • Java 8 to 17 migrations (for both Gradle and Maven)
  • Node.js 12 to 22 upgrades (including Lambda environments)
  • Python runtime updates to 3.11/3.12/3.13 (standard and Lambda)
  • AWS SDK migrations (v1 to v2)

Key characteristics:

  • Validated by AWS: These transformations are vetted by AWS to be high quality
  • Ready to use: No additional setup required
  • Continuously growing: Additional out-of-the-box transformations are continually being added
  • Customizable: Pre-built transformations can be customized by providing additional guidance or requirements specific to your organization's needs (e.g., the Java upgrade transformation can be enhanced with specific rules for handling your internal libraries or coding standards)
  • Experimental support: Some transformations may be marked as experimental as they undergo further testing and refinement

AWS-managed out-of-the-box transformations enable you to get started quickly with common modernization patterns while leveraging AWS's expertise, and then customize them for your organization's specific requirements.

The time to create a transformation varies based on complexity and available data in the form of existing migration guides, documentation, and code examples. The more information provided, the better the initial quality of the transformation. For common upgrades, migrations, and refactoring, initial transformation definition takes 1-2 days, and testing and refinement on sample codebases requires 2-3 days of iteration.

The transformation can be refined by executing it interactively, pausing and providing natural language feedback if required, or providing feedback at the end. The feedback can be natural language, code fixes, or additional before-and-after samples. AWS Transform custom provides guidance on how to improve the transformation quality. It is important to remember that you may need to simplify the transformation, such as decomposing it into multiple steps, to obtain good results. Once working reliably, the transformation can be published for organization-wide use.

AWS Transform custom implements multiple safety measures to ensure transformation quality. It incorporates Amazon Bedrock safety guardrails and breaks code changes down to reasonably-sized chunks for easier review. Transformations use user-defined build and test commands to validate changes and can specify validation criteria that must be met, such as successful test execution or specific code patterns that must be maintained.

When a transformation encounters errors, AWS Transform custom provides detailed logs of what went wrong and where. For build or test failures, it captures the specific error messages and context. If incorrect code is generated, you can provide feedback, which the agent incorporates into its learning system to improve future transformations. Failed transformations can be retried with additional context or broken down into smaller, more manageable changes.

The continual learning system gathers information from each transformation execution through both explicit feedback (comments and code fixes) and implicit observations the agent encounters while transforming and debugging code. This information is processed to create "knowledge items" that improve future transformations. These knowledge items are specific to that transformation and are not shared across different transformations or different customers. The items can be reviewed and managed by transformation owners, who can enable or disable specific learnings. The learning process occurs automatically after transformations are completed, requiring no additional user input.

AWS Transform CLI can be easily embedded in CI/CD pipelines and run directly in your own build infrastructure. This allows you to integrate transformations into your existing development and deployment workflows, enabling automated execution as part of your standard processes.
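As an illustration of embedding the CLI in a pipeline, a build step can invoke the CLI and gate on its exit code. In the sketch below, the command name `aws-transform` and its arguments are placeholders, not the documented CLI syntax; consult the AWS Transform documentation for the real invocation:

```python
import subprocess
import sys

def run_transformation(repo_path: str, transform_cli: str = "aws-transform") -> bool:
    """Run a transformation against a checked-out repository and return
    whether it succeeded, so a CI/CD pipeline can gate on the result.

    The CLI name and flags used here are illustrative placeholders only.
    """
    result = subprocess.run(
        [transform_cli, "run", "--path", repo_path],  # hypothetical arguments
        capture_output=True,
        text=True,
    )
    if result.returncode != 0:
        # Surface the CLI's error output in the pipeline log.
        print(result.stderr, file=sys.stderr)
    return result.returncode == 0
```

In a pipeline, a `False` return would typically fail the build step, so transformed code is never merged without review.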

You need an AWS account and IAM permissions to run the AWS Transform CLI. Access to the AWS Transform web application requires AWS IAM Identity Center, but Identity Center is not required to use the functionality in the CLI.

Privacy

Open all

We may use certain Content from AWS Transform for service improvement. AWS Transform may use this Content, for example, to provide better responses to common questions, fix operational issues, debug the service, or train models.

You can opt out of service improvement at any time through your service settings. For the AWS Transform web console experience, opt out by configuring an AI services opt-out policy in AWS Organizations. For more information, see AI services opt-out policies in the AWS Organizations User Guide. For the IDE, adjust your settings in the IDE to opt out.
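As a sketch of the AWS Organizations opt-out described above, the policy document below opts all AI services out of content use. This mirrors the structure shown in the AWS Organizations User Guide, but verify the current schema there before attaching the policy to your organization:

```python
import json

# AI services opt-out policy (AWS Organizations policy type AISERVICES_OPT_OUT_POLICY).
# Schema based on the AWS Organizations User Guide; verify before use.
opt_out_policy = {
    "services": {
        "default": {
            "opt_out_policy": {
                "@@assign": "optOut"  # opt out of content use for all AI services
            }
        }
    }
}

print(json.dumps(opt_out_policy, indent=2))
```

Once attached to the organization root or an organizational unit, the policy applies to all accounts underneath it.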

Your trust, your privacy, and the security of your data are our highest priority. We implement appropriate and sophisticated technical and physical controls, including encryption at rest and in transit, designed to prevent unauthorized access to or disclosure of your data and to ensure that our use complies with our commitments to you. See the Data Privacy FAQs for more information.

When you provide code that you own to AWS Transform, you retain ownership of the ported version of your code. Once porting is complete, you can review the output and either modify it before deploying to production or use it as-is.

Unless explicitly opted out, content from AWS Transform might also be used for enhancing or improving the quality of Foundation Models (FMs). This data will not be shared with other third-party model providers. Your content will not be used if you use the opt-out mechanism described in the documentation. For more information, see Sharing your data with AWS.