AWS HPC Blog

Category: AWS Batch

Evaluating next‑generation cloud compute for large‑scale genomic processing

AstraZeneca’s genomic research requires extensive computational resources to analyze DNA sequences for developing life-saving therapies. As cloud infrastructure evolves, customers can adopt newer, more capable instance types to realize performance and efficiency gains. AstraZeneca migrated its genomics workloads to Amazon EC2 F2 instances, boosting performance by 60% and cutting costs by 70%.

Optimize Nextflow Workflows on AWS Batch with Mountpoint for Amazon S3

Are you running genomic workflows with Nextflow on AWS Batch and experiencing bottlenecks when staging large reference files? In this post, we will show you how to optimize your workflow performance by leveraging Mountpoint for Amazon S3 to stream reference data directly into your Nextflow processes, eliminating the need to stage large static files repeatedly.
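As a rough sketch of the pattern (not the configuration from the post), the snippet below mounts a hypothetical reference bucket with the Mountpoint for Amazon S3 CLI (mount-s3) and reads a reference file directly from the mount; a Nextflow process could then point at that path instead of staging the file. The bucket name, mount directory, and file path are all placeholders.

```python
import subprocess
from pathlib import Path

# Placeholder names for illustration only -- substitute your own bucket and paths.
BUCKET = "my-genomics-references"          # hypothetical bucket holding reference genomes
MOUNT_DIR = Path("/mnt/references")        # mount point on the Batch container host

# Mount the bucket read-only with Mountpoint for Amazon S3.
MOUNT_DIR.mkdir(parents=True, exist_ok=True)
subprocess.run(["mount-s3", BUCKET, str(MOUNT_DIR), "--read-only"], check=True)

# A Nextflow process can now reference this path directly, e.g.
#   params.reference = '/mnt/references/GRCh38/genome.fa'
# instead of copying the file into the work directory for every task.
reference = MOUNT_DIR / "GRCh38" / "genome.fa"   # hypothetical reference file
with open(reference, "rb") as fh:
    print(fh.read(80))                     # bytes are fetched from S3 on demand
```

In practice the mount would typically be set up once per host (for example in an EC2 launch template or container entrypoint) rather than inside each task.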

Running NVIDIA Cosmos world foundation models on AWS

Running NVIDIA Cosmos world foundation models on AWS provides powerful physical AI capabilities at scale. This blog covers two production-ready architectures, each optimized for different organizational needs and constraints.

AWS at SC25 - Meet the Advanced Computing team at Booth #2207

Meet the AWS Advanced Computing team at SC25 in St. Louis

We want to empower every scientist and engineer to solve hard problems by giving them access to the compute and analytical tools they need, when they need them. Cloud HPC can be a real catalyst for human progress. If you run large-scale simulations, tune complex models, or support researchers who consistently need more compute, the […]

AWS re:Invent 2025: Your Complete Guide to High Performance Computing Sessions

AWS re:Invent 2025 returns to Las Vegas, Nevada, on December 1, uniting AWS builders, customers, partners, and IT professionals from across the globe. This year’s event offers you exclusive access to compelling customer stories and insights from AWS leadership as they tackle today’s most critical challenges in high-performance computing, from accelerating scientific discovery to optimizing […]

Introducing “default” instance categories for AWS Batch

Today, we are launching a new set of instance family categories for AWS Batch, “default_x86_64” and “default_arm64”. These new categories represent both a clarification and an improvement upon the existing “optimal” instance type category. This blog post gives some background on the new feature and how you can configure your Batch environments to take advantage […]
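As a minimal, hypothetical sketch of what that configuration might look like (the post itself covers the details), the snippet below creates a managed Batch compute environment whose instanceTypes list uses the new “default_arm64” category; the environment name, subnet, security group, and instance role are placeholders.

```python
import boto3

batch = boto3.client("batch")

# Placeholder identifiers -- replace with resources from your own account.
response = batch.create_compute_environment(
    computeEnvironmentName="genomics-default-arm64",
    type="MANAGED",
    computeResources={
        "type": "EC2",
        "allocationStrategy": "BEST_FIT_PROGRESSIVE",
        "minvCpus": 0,
        "maxvCpus": 256,
        # The new category from the post, used in place of the existing
        # "optimal" value so Batch can pick suitable arm64 instance families.
        "instanceTypes": ["default_arm64"],
        "subnets": ["subnet-0123456789abcdef0"],
        "securityGroupIds": ["sg-0123456789abcdef0"],
        "instanceRole": "ecsInstanceRole",
    },
)
print(response["computeEnvironmentArn"])
```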

AI-Enhanced Subsurface Infrastructure Mapping on AWS

Subsurface infrastructure mapping is crucial for industries ranging from oil and gas to environmental protection. Our approach combines advanced magnetic imaging with physics-informed AI to reveal hidden structures where traditional methods fall short. Explore how this fusion of cloud computing and AI opens new possibilities for subsurface exploration and management.