AWS Database Blog
Cross-account migration of Amazon RDS for SQL Server with column-level encryption
Organizations running SQL Server workloads on Amazon RDS sometimes need to migrate their databases to different AWS accounts. This migration becomes more complex when mission-critical data requires column-level encryption to meet compliance requirements. In this post, we demonstrate how you can migrate your symmetric key-encrypted database on Amazon RDS for SQL Server to another AWS account without compromising security. The solution we present can also help you implement symmetric key encryption on a new database in Amazon RDS for SQL Server.
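As background on the technique that post builds on, the following is a minimal sketch of symmetric key column-level encryption in SQL Server, driven from Python with pyodbc. All object names, the endpoint, and the credentials are illustrative placeholders, not the schema used in the post:

```python
import pyodbc  # assumes a SQL Server ODBC driver is installed

# Connect to the RDS for SQL Server instance (placeholder endpoint and credentials)
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=mydb.example.us-east-1.rds.amazonaws.com;"
    "DATABASE=SalesDb;UID=admin;PWD=example-password"
)
cur = conn.cursor()

# Create a database master key, a certificate, and an AES-256 symmetric key
cur.execute("CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'Str0ngP@ssword!'")
cur.execute("CREATE CERTIFICATE SalesCert WITH SUBJECT = 'Column encryption'")
cur.execute(
    "CREATE SYMMETRIC KEY SalesKey WITH ALGORITHM = AES_256 "
    "ENCRYPTION BY CERTIFICATE SalesCert"
)
cur.execute("CREATE TABLE Customers (Name nvarchar(100), CardNumber varbinary(160))")

# Encrypt on write, decrypt on read; the key must be open in the session
cur.execute("OPEN SYMMETRIC KEY SalesKey DECRYPTION BY CERTIFICATE SalesCert")
cur.execute(
    "INSERT INTO Customers (Name, CardNumber) VALUES "
    "(?, EncryptByKey(Key_GUID('SalesKey'), CONVERT(varchar(32), ?)))",
    ("Alice", "4111111111111111"),
)
cur.execute("SELECT Name, CONVERT(varchar(32), DecryptByKey(CardNumber)) FROM Customers")
print(cur.fetchall())
cur.execute("CLOSE SYMMETRIC KEY SalesKey")
conn.commit()
```

Because the symmetric key is protected by objects inside the database, a cross-account migration has to move or re-create that key material along with the data, which is the problem the post works through.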
Amazon DynamoDB zero-ETL integration with Amazon SageMaker Lakehouse – Part 2
Amazon DynamoDB zero-ETL integration with Amazon SageMaker Lakehouse allows you to run analytics workloads on your DynamoDB data without having to set up and manage extract, transform, and load (ETL) pipelines. In this post, we cover setting up Amazon SageMaker Unified Studio, followed by running data analysis to showcase its capabilities. We illustrate our solution walkthrough with an example of a credit card company that wants to analyze its customer behavior and spending trends.
Amazon DynamoDB zero-ETL integration with Amazon SageMaker Lakehouse – Part 1
Amazon DynamoDB zero-ETL integration with Amazon SageMaker Lakehouse allows you to run analytics workloads on your DynamoDB data without having to set up and manage extract, transform, and load (ETL) pipelines. In this two-part series, we first walk through the prerequisites and initial setup for the zero-ETL integration. In Part 2, we cover setting up Amazon SageMaker Unified Studio, followed by running data analysis to showcase its capabilities. We illustrate our solution walkthrough with an example of a credit card company that wants to analyze its customer behavior and spending trends.
Streamline Amazon Aurora database operations at scale: Introducing the AWS Database Acceleration Toolkit
In this post, we introduce the AWS Database Acceleration Toolkit (DAT), an open source database accelerator. DAT is an infrastructure as code solution that uses Terraform to simplify and automate initial setup, provisioning, and ongoing maintenance activities for Amazon Aurora.
Using the PostgreSQL extension tds_fdw to validate data migration from SQL Server to Amazon Aurora PostgreSQL
Data validation is an important process during data migrations, helping to verify that the migrated data matches the source data. In this post, we present alternatives you can use for data validation when dealing with tables that lack primary keys. We discuss alternative approaches, best practices, and potential solutions to make sure that your data migration process remains thorough and reliable, even in the absence of traditional primary key-based validation methods. Specifically, we demonstrate how to perform data validation after a full load migration from SQL Server to Amazon Aurora PostgreSQL-Compatible Edition using the PostgreSQL tds_fdw extension.
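To make the validation idea concrete, here is a minimal sketch of exposing a SQL Server table on the Aurora PostgreSQL target through tds_fdw and diffing rows without relying on a primary key. The endpoints, credentials, and the orders table are illustrative placeholders, not the post's actual setup:

```python
import psycopg2  # client connection to the Aurora PostgreSQL target

# Connect to the migration target (placeholder endpoint and credentials)
conn = psycopg2.connect(host="aurora-pg.example.us-east-1.rds.amazonaws.com",
                        dbname="targetdb", user="postgres", password="example-password")
conn.autocommit = True
cur = conn.cursor()

# Expose the SQL Server source through the tds_fdw foreign data wrapper
cur.execute("CREATE EXTENSION IF NOT EXISTS tds_fdw")
cur.execute("""CREATE SERVER sqlserver_src FOREIGN DATA WRAPPER tds_fdw
               OPTIONS (servername 'sqlserver.example.us-east-1.rds.amazonaws.com',
                        port '1433', database 'SourceDb')""")
cur.execute("""CREATE USER MAPPING FOR postgres SERVER sqlserver_src
               OPTIONS (username 'admin', password 'example-password')""")
cur.execute("""CREATE FOREIGN TABLE src_orders (order_id int, amount numeric(10,2))
               SERVER sqlserver_src OPTIONS (schema_name 'dbo', table_name 'orders')""")

# Without a primary key, diff whole rows in both directions; EXCEPT ALL
# also catches mismatched duplicate-row counts
cur.execute("""(SELECT * FROM src_orders EXCEPT ALL SELECT * FROM orders)
               UNION ALL
               (SELECT * FROM orders EXCEPT ALL SELECT * FROM src_orders)""")
print(f"{len(cur.fetchall())} mismatched rows")
```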
Real-time Iceberg ingestion with AWS DMS
Etleap is an AWS Advanced Technology Partner with the AWS Data & Analytics Competency and Amazon Redshift Service Ready designation. In this post, we show how Etleap helps you build scalable, near real-time pipelines that stream data from operational SQL databases into Apache Iceberg tables using AWS DMS. You can use AWS DMS as a robust and configurable solution for change data capture (CDC) from all major databases into AWS.
Migrate Google Cloud SQL for PostgreSQL to Amazon RDS and Amazon Aurora using pglogical
In this post, we provide the steps to migrate a PostgreSQL database from Google Cloud SQL to RDS for PostgreSQL and Aurora PostgreSQL using the pglogical extension. We also demonstrate the connection attributes required to support the database migration. The pglogical extension works with community PostgreSQL version 9.4 and higher, and is supported on RDS for PostgreSQL and Aurora PostgreSQL version 12 and higher.
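For orientation, this is a minimal sketch of the provider/subscriber setup that pglogical-based migrations rely on, run from Python with psycopg2. Hosts, database names, and credentials are placeholders, and the post itself covers the required parameter and networking configuration:

```python
import psycopg2

# On the Google Cloud SQL source: register the provider node and pick tables to replicate
src = psycopg2.connect(host="source.example.com", dbname="appdb",
                       user="postgres", password="example-password")
src.autocommit = True
with src.cursor() as cur:
    cur.execute("CREATE EXTENSION IF NOT EXISTS pglogical")
    cur.execute("""SELECT pglogical.create_node(node_name := 'provider',
        dsn := 'host=source.example.com dbname=appdb user=replicator password=example-password')""")
    cur.execute("SELECT pglogical.replication_set_add_all_tables('default', ARRAY['public'])")

# On the RDS for PostgreSQL or Aurora PostgreSQL target: creating the subscription
# copies the existing data, then streams ongoing changes from the source
tgt = psycopg2.connect(host="target.example.us-east-1.rds.amazonaws.com", dbname="appdb",
                       user="postgres", password="example-password")
tgt.autocommit = True
with tgt.cursor() as cur:
    cur.execute("CREATE EXTENSION IF NOT EXISTS pglogical")
    cur.execute("""SELECT pglogical.create_node(node_name := 'subscriber',
        dsn := 'host=target.example.us-east-1.rds.amazonaws.com dbname=appdb user=replicator password=example-password')""")
    cur.execute("""SELECT pglogical.create_subscription(subscription_name := 'migration_sub',
        provider_dsn := 'host=source.example.com dbname=appdb user=replicator password=example-password')""")
```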
Upgrade your Amazon DynamoDB global tables to the current version
Amazon DynamoDB is a fully managed, serverless NoSQL database that delivers single-digit millisecond performance for applications at any scale. DynamoDB global tables is a multi-active database feature that replicates data across AWS Regions, enabling local reads and writes. In this post, we explain why we strongly recommend that all customers use the Current version for all global tables.
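If you're unsure which version your tables use today, a quick inventory is possible from the AWS SDK. This sketch (using boto3) reads the GlobalTableVersion field that DescribeTable returns for replicated tables; for brevity it checks only the first page of table names:

```python
import boto3

dynamodb = boto3.client("dynamodb")

# Report the global table version for each replicated table
# (list_tables returns up to 100 names per call; paginate for larger accounts)
for table_name in dynamodb.list_tables()["TableNames"]:
    table = dynamodb.describe_table(TableName=table_name)["Table"]
    version = table.get("GlobalTableVersion")  # absent for non-replicated tables
    if version:
        print(f"{table_name}: {version}")  # 2017.11.29 (Legacy) vs. 2019.11.21 (Current)
```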
Streamline code conversion and testing from Microsoft SQL Server and Oracle to PostgreSQL with Amazon Bedrock
Organizations are increasingly seeking to modernize their database infrastructure by migrating from legacy database engines such as Microsoft SQL Server and Oracle to more cost-effective and scalable open source alternatives such as PostgreSQL. This transition not only reduces licensing costs but also unlocks the flexibility and innovation offered by PostgreSQL’s rich feature set. In this post, we demonstrate how to convert and test database code from Microsoft SQL Server and Oracle to PostgreSQL using the generative AI capabilities of Amazon Bedrock.
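As a taste of the approach, here is a minimal sketch that sends a T-SQL fragment to a model through the Amazon Bedrock Converse API and asks for a PostgreSQL translation. The model ID is an assumption (use any text model enabled in your account), and the post's actual prompts and testing workflow are more involved:

```python
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

# An illustrative T-SQL fragment with SQL Server-specific constructs
tsql = """
SELECT TOP 10 CustomerId, ISNULL(City, 'n/a') AS City, GETDATE() AS ReportDate
FROM dbo.Customers;
"""

prompt = (
    "Convert the following SQL Server T-SQL to equivalent PostgreSQL "
    "and briefly note any behavioral differences:\n" + tsql
)

# Model ID is an assumption; substitute any model enabled in your account
response = bedrock.converse(
    modelId="anthropic.claude-3-sonnet-20240229-v1:0",
    messages=[{"role": "user", "content": [{"text": prompt}]}],
)
print(response["output"]["message"]["content"][0]["text"])
```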
Implement prescription validation using Amazon Bedrock and Amazon DynamoDB
Healthcare providers manage an ever-growing volume of patient data and medication information to help ensure safe, effective treatment. Although traditional database systems excel at storing patient records, they require complex queries to access information. By adding generative AI capabilities, healthcare providers can now use natural language to search patient records and verify medication safety, rather than writing complex database queries. In this post, I show you a solution that uses Amazon Bedrock and Amazon DynamoDB to create an AI agent that helps healthcare providers quickly identify potential drug interactions by validating new prescriptions against a patient’s current medication records.
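To illustrate the shape of such an agent tool, here is a minimal hypothetical sketch: a function that pulls a patient's current medications from DynamoDB and asks a Bedrock model to flag interactions with a new prescription. The table name, key schema, attribute names, and model ID are all assumptions, and real prescription validation requires clinical safeguards beyond a model check:

```python
import boto3
from boto3.dynamodb.conditions import Key

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("PatientMedications")  # hypothetical table keyed on patient_id
bedrock = boto3.client("bedrock-runtime")

def get_current_medications(patient_id: str) -> list:
    """Fetch the patient's active medications; an agent action group could call this."""
    resp = table.query(KeyConditionExpression=Key("patient_id").eq(patient_id))
    return [item["drug_name"] for item in resp["Items"]]

def check_interactions(patient_id: str, new_drug: str) -> str:
    """Ask the model to flag interactions between current meds and a new prescription."""
    meds = get_current_medications(patient_id)
    prompt = (
        f"A patient currently takes: {', '.join(meds) or 'no medications'}. "
        f"Flag any clinically significant interactions with a new prescription of {new_drug}."
    )
    resp = bedrock.converse(
        modelId="anthropic.claude-3-sonnet-20240229-v1:0",  # assumption: any enabled text model
        messages=[{"role": "user", "content": [{"text": prompt}]}],
    )
    return resp["output"]["message"]["content"][0]["text"]
```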