AWS Database Blog

Category: Advanced (300)

Cross-account migration of Amazon RDS for SQL Server with column-level encryption

Organizations running SQL Server workloads on Amazon RDS sometimes need to migrate their databases to different AWS accounts. This migration becomes more complex when mission-critical data requires column-level encryption to meet compliance requirements. In this post, we demonstrate how you can migrate your symmetric key-encrypted database on Amazon RDS for SQL Server to another AWS account without compromising security. The solution we present can also help you implement symmetric key encryption on a new database in Amazon RDS for SQL Server.
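
To give a flavor of what the encrypted side looks like, here is a minimal sketch of column-level encryption with a SQL Server symmetric key, driven from Python with pyodbc. The connection string, key and table names, and passwords are illustrative, not taken from the post:

```python
# Minimal sketch: column-level encryption with a SQL Server symmetric key,
# run through pyodbc. All names, secrets, and endpoints are illustrative.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=my-rds-endpoint,1433;DATABASE=SalesDB;UID=admin;PWD=example-password"
)
cur = conn.cursor()

# One-time setup: database master key, a certificate, and an AES-256
# symmetric key protected by that certificate.
cur.execute("CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'Str0ng!Passw0rd';")
cur.execute("CREATE CERTIFICATE SSNCert WITH SUBJECT = 'Protects SSN column';")
cur.execute(
    "CREATE SYMMETRIC KEY SSNKey WITH ALGORITHM = AES_256 "
    "ENCRYPTION BY CERTIFICATE SSNCert;"
)

# Encrypt a column in place: open the key, call ENCRYPTBYKEY, close the key.
cur.execute("OPEN SYMMETRIC KEY SSNKey DECRYPTION BY CERTIFICATE SSNCert;")
cur.execute(
    "UPDATE dbo.Customers "
    "SET ssn_encrypted = ENCRYPTBYKEY(KEY_GUID('SSNKey'), ssn_plain);"
)
cur.execute("CLOSE SYMMETRIC KEY SSNKey;")
conn.commit()
```

Because the symmetric key sits under a certificate and database master key inside the database, moving the database to another account also means carrying that key hierarchy along, which is the crux of the migration the post walks through.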

Amazon DynamoDB zero-ETL integration with Amazon SageMaker Lakehouse – Part 2

Amazon DynamoDB zero-ETL integration with Amazon SageMaker Lakehouse allows you to run analytics workloads on your DynamoDB data without having to set up and manage extract, transform, and load (ETL) pipelines. In this post, we cover setting up Amazon SageMaker Unified Studio, followed by running data analysis to showcase its capabilities. We illustrate our solution walkthrough with an example of a credit card company that wants to analyze its customer behavior and spending trends.
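
Once the data is flowing, the analysis itself is standard SQL. As a minimal sketch, assuming the integration lands the DynamoDB data in an AWS Glue database named dynamodb_zetl_db with a credit_card_transactions table (both names hypothetical), a spending-trends query could also be run through Amazon Athena with boto3:

```python
# Hypothetical sketch: querying zero-ETL data with Amazon Athena via boto3.
# Database, table, and output location are assumptions for illustration.
import time
import boto3

athena = boto3.client("athena", region_name="us-east-1")

query = """
SELECT customer_id, SUM(amount) AS total_spend
FROM credit_card_transactions
GROUP BY customer_id
ORDER BY total_spend DESC
LIMIT 10
"""

qid = athena.start_query_execution(
    QueryString=query,
    QueryExecutionContext={"Database": "dynamodb_zetl_db"},
    ResultConfiguration={"OutputLocation": "s3://my-athena-results/"},
)["QueryExecutionId"]

# Poll until the query finishes, then print the result rows (skip the header).
while True:
    state = athena.get_query_execution(QueryExecutionId=qid)["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(1)

if state == "SUCCEEDED":
    rows = athena.get_query_results(QueryExecutionId=qid)["ResultSet"]["Rows"]
    for row in rows[1:]:
        print([col.get("VarCharValue") for col in row["Data"]])
```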

Amazon DynamoDB zero-ETL integration with Amazon SageMaker Lakehouse – Part 1

Amazon DynamoDB zero-ETL integration with Amazon SageMaker Lakehouse allows you to run analytics workloads on your DynamoDB data without having to set up and manage extract, transform, and load (ETL) pipelines. In this two-part series, we first walk through the prerequisites and initial setup for the zero-ETL integration. In Part 2, we cover setting up Amazon SageMaker Unified Studio, followed by running data analysis to showcase its capabilities. We illustrate our solution walkthrough with an example of a credit card company that wants to analyze its customer behavior and spending trends.
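
As a taste of the prerequisites, the source DynamoDB table generally needs point-in-time recovery (PITR) enabled before a zero-ETL integration can be created on it. A minimal sketch with boto3, using a hypothetical table name:

```python
# Sketch: enable point-in-time recovery (PITR) on the source DynamoDB table,
# a prerequisite for zero-ETL integrations. The table name is illustrative.
import boto3

ddb = boto3.client("dynamodb", region_name="us-east-1")

ddb.update_continuous_backups(
    TableName="CreditCardTransactions",
    PointInTimeRecoverySpecification={"PointInTimeRecoveryEnabled": True},
)

# Verify that PITR is now active on the table.
desc = ddb.describe_continuous_backups(TableName="CreditCardTransactions")
print(
    desc["ContinuousBackupsDescription"]
        ["PointInTimeRecoveryDescription"]["PointInTimeRecoveryStatus"]
)
```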

Streamline Amazon Aurora database operations at scale: Introducing the AWS Database Acceleration Toolkit

In this post, we introduce the AWS Database Acceleration Toolkit (DAT), an open source database accelerator. DAT is an infrastructure as code (IaC) solution that uses Terraform to simplify and automate the initial setup, provisioning, and ongoing maintenance of Amazon Aurora.

Using the PostgreSQL extension tds_fdw to validate data migration from SQL Server to Amazon Aurora PostgreSQL

Data validation is an important process during data migrations, helping to verify that the migrated data matches the source data. In this post, we present approaches, best practices, and potential solutions for validating tables that lack primary keys, so that your data migration process remains thorough and reliable even without traditional primary key-based validation methods. Specifically, we demonstrate how to perform data validation after a full load migration from SQL Server to Amazon Aurora PostgreSQL-Compatible Edition using the PostgreSQL tds_fdw extension.
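
As a minimal sketch of the idea, with illustrative hosts, credentials, and table names, the SQL Server source can be exposed to Aurora PostgreSQL as a foreign table through tds_fdw and compared against the migrated table:

```python
# Sketch of a primary-key-less validation pass: expose the SQL Server source
# table to Aurora PostgreSQL as a foreign table via tds_fdw, then compare it
# with the migrated table. Hosts, credentials, and names are illustrative.
import psycopg2

conn = psycopg2.connect("host=aurora-endpoint dbname=appdb user=admin password=example")
conn.autocommit = True
cur = conn.cursor()

cur.execute("CREATE EXTENSION IF NOT EXISTS tds_fdw;")
cur.execute("""
    CREATE SERVER sqlserver_src FOREIGN DATA WRAPPER tds_fdw
    OPTIONS (servername 'sqlserver-endpoint', port '1433', database 'appdb');
""")
cur.execute("""
    CREATE USER MAPPING FOR CURRENT_USER SERVER sqlserver_src
    OPTIONS (username 'migration_user', password 'example');
""")
cur.execute("""
    CREATE FOREIGN TABLE orders_src (order_id int, amount numeric, note text)
    SERVER sqlserver_src OPTIONS (schema_name 'dbo', table_name 'orders');
""")

# First-pass check: row counts on the foreign (source) and local (migrated)
# tables; a mismatch flags the table for deeper column-level comparison.
cur.execute("SELECT count(*) FROM orders_src;")
src_count = cur.fetchone()[0]
cur.execute("SELECT count(*) FROM public.orders;")
tgt_count = cur.fetchone()[0]
print(f"source={src_count} target={tgt_count} match={src_count == tgt_count}")
```

Row counts are only a coarse first pass; the same foreign table can feed stricter checks, such as checksums computed over concatenated column values on both sides.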

Real-time Iceberg ingestion with AWS DMS

Etleap is an AWS Advanced Technology Partner with the AWS Data & Analytics Competency and Amazon Redshift Service Ready designation. In this post, we show how Etleap helps you build scalable, near real-time pipelines that stream data from operational SQL databases into Iceberg tables using AWS DMS. You can use AWS DMS as a robust and configurable solution for change data capture (CDC) from major relational databases into AWS.
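
As a sketch of the DMS side of such a pipeline, a CDC-only replication task can be created and started with boto3; the ARNs and schema selection below are placeholders, and the Iceberg-specific target wiring is the part Etleap layers on top:

```python
# Sketch: a CDC-only AWS DMS replication task via boto3. The endpoint and
# replication instance ARNs are placeholders; the mapping selects one schema.
import json
import boto3

dms = boto3.client("dms", region_name="us-east-1")

table_mappings = {
    "rules": [{
        "rule-type": "selection",
        "rule-id": "1",
        "rule-name": "include-sales",
        "object-locator": {"schema-name": "sales", "table-name": "%"},
        "rule-action": "include",
    }]
}

task = dms.create_replication_task(
    ReplicationTaskIdentifier="sales-cdc-to-iceberg",
    SourceEndpointArn="arn:aws:dms:...:endpoint:SRC",       # placeholder
    TargetEndpointArn="arn:aws:dms:...:endpoint:TGT",       # placeholder
    ReplicationInstanceArn="arn:aws:dms:...:rep:INSTANCE",  # placeholder
    MigrationType="cdc",
    TableMappings=json.dumps(table_mappings),
)

dms.start_replication_task(
    ReplicationTaskArn=task["ReplicationTask"]["ReplicationTaskArn"],
    StartReplicationTaskType="start-replication",
)
```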

Migrate Google Cloud SQL for PostgreSQL to Amazon RDS and Amazon Aurora using pglogical

In this post, we provide the steps to migrate a PostgreSQL database from Google Cloud SQL to Amazon RDS for PostgreSQL and Amazon Aurora PostgreSQL-Compatible Edition using the pglogical extension. We also demonstrate the connection attributes required to support the database migration. The pglogical extension works with community PostgreSQL version 9.4 and higher, and is supported on RDS for PostgreSQL and Aurora PostgreSQL version 12 and later.
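
At its core, the migration wires a pglogical provider node on the Cloud SQL source to a subscriber node on the AWS target. A minimal sketch using psycopg2, with illustrative DSNs, credentials, and names (the post covers the full set of connection attributes):

```python
# Sketch of the pglogical flow: provider node on the Cloud SQL source,
# subscriber on the RDS/Aurora target. All DSNs and names are illustrative.
import psycopg2

# On the source (Google Cloud SQL for PostgreSQL): register the provider
# node and add the public schema's tables to the default replication set.
src = psycopg2.connect("host=cloudsql-ip dbname=appdb user=postgres password=example")
src.autocommit = True
with src.cursor() as cur:
    cur.execute("CREATE EXTENSION IF NOT EXISTS pglogical;")
    cur.execute("""
        SELECT pglogical.create_node(
            node_name := 'provider',
            dsn := 'host=cloudsql-ip port=5432 dbname=appdb user=replicator password=example');
    """)
    cur.execute("SELECT pglogical.replication_set_add_all_tables('default', ARRAY['public']);")

# On the target (RDS for PostgreSQL or Aurora PostgreSQL): create the
# subscriber node and subscribe to the provider to start replication.
tgt = psycopg2.connect("host=aurora-endpoint dbname=appdb user=postgres password=example")
tgt.autocommit = True
with tgt.cursor() as cur:
    cur.execute("CREATE EXTENSION IF NOT EXISTS pglogical;")
    cur.execute("""
        SELECT pglogical.create_node(
            node_name := 'subscriber',
            dsn := 'host=aurora-endpoint port=5432 dbname=appdb user=replicator password=example');
    """)
    cur.execute("""
        SELECT pglogical.create_subscription(
            subscription_name := 'gcp_to_aws',
            provider_dsn := 'host=cloudsql-ip port=5432 dbname=appdb user=replicator password=example');
    """)
```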

Streamline code conversion and testing from Microsoft SQL Server and Oracle to PostgreSQL with Amazon Bedrock

Organizations are increasingly seeking to modernize their database infrastructure by migrating from legacy database engines such as Microsoft SQL Server and Oracle to more cost-effective and scalable open source alternatives such as PostgreSQL. This transition not only reduces licensing costs but also unlocks the flexibility and innovation offered by PostgreSQL’s rich feature set. In this post, we demonstrate how to convert and test database code from Microsoft SQL Server and Oracle to PostgreSQL using the generative AI capabilities of Amazon Bedrock.
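
As a minimal sketch of the idea, a T-SQL procedure can be handed to a model on Amazon Bedrock through the Converse API and asked for a PL/pgSQL equivalent; the model ID, prompt, and procedure below are illustrative choices, not the post's exact setup:

```python
# Sketch: asking a model on Amazon Bedrock to convert a T-SQL procedure to
# PL/pgSQL via the Converse API. Model ID and prompt are illustrative.
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

tsql = """
CREATE PROCEDURE dbo.GetTopCustomers @n INT AS
BEGIN
    SELECT TOP (@n) customer_id, SUM(amount) AS total
    FROM dbo.orders GROUP BY customer_id ORDER BY total DESC;
END
"""

response = bedrock.converse(
    modelId="anthropic.claude-3-5-sonnet-20240620-v1:0",  # example model
    messages=[{
        "role": "user",
        "content": [{"text": "Convert this T-SQL stored procedure to an "
                             "equivalent PL/pgSQL function for PostgreSQL:\n" + tsql}],
    }],
)
print(response["output"]["message"]["content"][0]["text"])
```

Generated code still needs testing against the target engine, which is why the post pairs conversion with a testing workflow rather than treating the model output as final.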

Build a multi-Region session store with Amazon ElastiCache for Valkey Global Datastore

As companies expand globally, they must architect highly available, fault-tolerant systems across multiple AWS Regions. At that scale, designing a caching solution that spans a multi-Region infrastructure becomes a real challenge. In this post, we dive deep into how to use Amazon ElastiCache for Valkey, a fully managed in-memory data store compatible with Valkey and Redis OSS, and its Global Datastore feature to build a multi-Region session store.
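
As a sketch of the Global Datastore mechanics, an existing replication group in the primary Region can be promoted into a global datastore and a secondary Region attached as a read replica; the IDs and Regions below are illustrative:

```python
# Sketch: create a Global Datastore from an existing replication group in the
# primary Region, then attach a secondary Region. IDs/Regions are illustrative.
import boto3

primary = boto3.client("elasticache", region_name="us-east-1")

# Promote the primary Region's replication group into a global datastore.
gds = primary.create_global_replication_group(
    GlobalReplicationGroupIdSuffix="sessions",
    PrimaryReplicationGroupId="sessions-use1",
)
gds_id = gds["GlobalReplicationGroup"]["GlobalReplicationGroupId"]

# In the secondary Region, create a replication group that joins the
# global datastore as a read replica for locally served session reads.
secondary = boto3.client("elasticache", region_name="eu-west-1")
secondary.create_replication_group(
    ReplicationGroupId="sessions-euw1",
    ReplicationGroupDescription="Secondary for global session store",
    GlobalReplicationGroupId=gds_id,
)
```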

Automate Amazon RDS for PostgreSQL major or minor version upgrade using AWS Systems Manager and Amazon EC2

In this post, we guide you through setting up automation for pre-upgrade checks and upgrading a fleet of Amazon RDS for PostgreSQL instances. In this solution, we use AWS Systems Manager to automate the Amazon RDS upgrade job.
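
As a minimal sketch, assuming a custom Automation runbook named Custom-RDSPostgresUpgrade (a hypothetical name) that wraps the pre-upgrade checks and the RDS upgrade call, the upgrade could be kicked off per instance with boto3:

```python
# Sketch: start an SSM Automation execution that upgrades one RDS for
# PostgreSQL instance. Runbook name and parameters are assumptions.
import boto3

ssm = boto3.client("ssm", region_name="us-east-1")

execution = ssm.start_automation_execution(
    DocumentName="Custom-RDSPostgresUpgrade",  # hypothetical runbook name
    Parameters={
        "DBInstanceIdentifier": ["app-db-01"],
        "TargetEngineVersion": ["16.3"],
    },
)
print(execution["AutomationExecutionId"])
```

Running the same document across a fleet is then a matter of iterating over instance identifiers, or using Automation's rate controls to fan out with concurrency and error thresholds.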