AWS Database Blog
Assess and convert Teradata database objects to Amazon Redshift using the AWS Schema Conversion Tool CLI
AWS Schema Conversion Tool (AWS SCT) makes self-managed data warehouse migrations predictable by assessing and converting the source database schema and code objects to a format compatible with Amazon Redshift. In this post, we describe how to perform a database assessment and conversion from Teradata to Amazon Redshift. To accomplish this, we use AWS SCT and its CLI, because AWS SCT supports Teradata as a source database, complementing the wide range of assessments handled by AWS Database Migration Service (AWS DMS) Schema Conversion (DMS SC).
Multi-AZ deployment for Amazon RDS Custom for Oracle
In this post, we explore the benefits and features of Multi-AZ for RDS Custom for Oracle and how it helps improve the resilience of your database.
Volatility classification in PostgreSQL
In this post, we discuss different ways you can use volatility classification with functions in PostgreSQL and provide best practices to help you keep your database optimized and develop efficient and reliable database applications.
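For a flavor of what volatility classification looks like in practice, here is a minimal sketch using psycopg2; the connection string, table, and function names are illustrative assumptions, not taken from the post.

```python
import psycopg2

# Hypothetical connection details; adjust for your environment.
conn = psycopg2.connect("dbname=appdb user=app")
cur = conn.cursor()

# Hypothetical lookup table so the STABLE function below validates at creation time.
cur.execute("CREATE TABLE IF NOT EXISTS tax_rates (region text PRIMARY KEY, rate numeric);")

# IMMUTABLE: the result depends only on the arguments, so the planner can
# pre-evaluate calls with constant inputs and use the function in indexes.
cur.execute("""
    CREATE OR REPLACE FUNCTION add_tax(amount numeric)
    RETURNS numeric
    LANGUAGE sql IMMUTABLE
    AS $$ SELECT amount * 1.08 $$;
""")

# STABLE: may read the database, but returns the same result for the same
# arguments within a single statement, so the planner can call it once per value.
cur.execute("""
    CREATE OR REPLACE FUNCTION region_rate(code text)
    RETURNS numeric
    LANGUAGE sql STABLE
    AS $$ SELECT rate FROM tax_rates WHERE region = code $$;
""")

# VOLATILE is the default and is re-evaluated on every call.
conn.commit()
cur.close()
conn.close()
```

Mislabeling a function (for example, marking a table-reading function IMMUTABLE) can produce stale results, so the categories are worth choosing deliberately.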
Configure Amazon RDS for Db2 standby replicas for high availability and faster disaster recovery
In this post, we demonstrate how to configure a standby replica for your RDS for Db2 instance. We also discuss best practices for setting up, monitoring, and managing standby replicas.
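As a rough sketch of the kind of change involved (not the post's exact steps), enabling Multi-AZ on an existing instance with boto3 provisions a synchronous standby replica in a second Availability Zone; the instance identifier below is hypothetical.

```python
import boto3

rds = boto3.client("rds")

# Enable Multi-AZ on a hypothetical RDS for Db2 instance. RDS provisions a
# synchronous standby replica in another Availability Zone and fails over to
# it automatically if the primary becomes unavailable.
rds.modify_db_instance(
    DBInstanceIdentifier="db2-prod",  # hypothetical instance name
    MultiAZ=True,
    ApplyImmediately=True,
)

# Wait until the modification completes and the instance is available again.
waiter = rds.get_waiter("db_instance_available")
waiter.wait(DBInstanceIdentifier="db2-prod")
```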
Amazon Aurora DSQL for gaming use cases
In this post, we show you how Amazon Aurora DSQL powers modern gaming use cases, from real-time multiplayer interactions to globally consistent leaderboards, by delivering seamless scalability, strong consistency, and built-in multi-Region availability.
Evolve your Amazon DynamoDB table’s data model
In this post, we show you how to evolve your DynamoDB table’s data model to meet changing application requirements while maintaining zero downtime in production systems. We explore two main techniques, with examples you can apply to your own applications: adding new attributes and creating new entities.
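To make the "adding new attributes" idea concrete, here is a minimal boto3 sketch (the table, key, and attribute names are hypothetical): because DynamoDB is schemaless for non-key attributes, new attributes can be written lazily as items are touched, with no table change or backfill required.

```python
import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("Orders")  # hypothetical table name

# Write the new attribute on an existing item as part of a normal update.
table.update_item(
    Key={"PK": "CUSTOMER#123", "SK": "ORDER#2025-06-01"},
    UpdateExpression="SET loyaltyTier = :tier",
    ExpressionAttributeValues={":tier": "gold"},
)

# Readers treat the attribute as optional until every relevant item carries it.
item = table.get_item(Key={"PK": "CUSTOMER#123", "SK": "ORDER#2025-06-01"})["Item"]
tier = item.get("loyaltyTier", "standard")
```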
Transform uncompressed Amazon DocumentDB data into compressed collections using AWS DMS
In this post, we discuss handling large collections that are approaching 32 TiB in Amazon DocumentDB. We demonstrate solutions for transitioning from uncompressed to compressed collections using AWS DMS. This migration not only accommodates larger uncompressed data volumes, but also significantly reduces the storage and compute costs associated with Amazon DocumentDB and improves performance.
Introducing Amazon Keyspaces CDC streams
Last week, AWS announced Amazon Keyspaces change data capture (CDC) streams, a new feature that captures real-time data changes in your Amazon Keyspaces tables. In this post, we discuss the architecture of Amazon Keyspaces CDC streams, explore its use cases and benefits, and provide an example demonstrating how to set up CDC streams, stream data, and capture the streamed records.
How Aqua Security automates fast clone orchestration on Amazon Aurora at scale
Aqua Security is a leading provider of cloud-based security solutions, trusted by global enterprises to secure their applications from development to production. In this post, we explore how Aqua Security automates the use of Amazon Aurora fast clones to support read-heavy operations at scale, simplify their data workflows, and maintain operational efficiency.
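For context on the underlying mechanism (a sketch under assumed names, not Aqua Security's actual automation), an Aurora fast clone is created as a copy-on-write point-in-time restore of the source cluster, and then needs at least one instance before it can serve queries.

```python
import boto3

rds = boto3.client("rds")

# Create a copy-on-write clone of a hypothetical Aurora cluster. The clone
# shares storage pages with the source and only copies a page when either
# side writes to it, so creation is fast and initially consumes little space.
rds.restore_db_cluster_to_point_in_time(
    SourceDBClusterIdentifier="aurora-prod",   # hypothetical source cluster
    DBClusterIdentifier="aurora-prod-clone",
    RestoreType="copy-on-write",
    UseLatestRestorableTime=True,
)

# A newly created clone has no instances; add one so read-heavy jobs can run
# against it without touching the production cluster.
rds.create_db_instance(
    DBInstanceIdentifier="aurora-prod-clone-1",
    DBClusterIdentifier="aurora-prod-clone",
    DBInstanceClass="db.r6g.large",            # hypothetical instance class
    Engine="aurora-postgresql",                # must match the source cluster's engine
)
```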
How TalentNeuron optimized data operations, cut costs, and modernized with Amazon Aurora I/O-Optimized
For years, TalentNeuron, a leader in talent intelligence and workforce planning, has been empowering organizations with data-driven insights by collecting and processing vast amounts of job board data. In this post, we share three key benefits that TalentNeuron realized by using Amazon Aurora I/O-Optimized as part of their new data platform: reduced monthly database costs by 29%, improved data validation performance, and accelerated innovation through modernization.
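As a minimal sketch of what switching storage configurations involves (the cluster name is hypothetical and this is not TalentNeuron's actual migration process), Aurora I/O-Optimized can be selected on an existing cluster:

```python
import boto3

rds = boto3.client("rds")

# Switch a hypothetical Aurora cluster to the I/O-Optimized storage
# configuration, which replaces per-I/O charges with predictable pricing for
# storage and compute; "aurora" is the standard configuration.
rds.modify_db_cluster(
    DBClusterIdentifier="talent-data-platform",  # hypothetical cluster name
    StorageType="aurora-iopt1",
)
```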