AWS Public Sector Blog
Category: Amazon EventBridge
Deep dive into FedRAMP 20x Key Security Indicators: Decoding the 63 KSIs
In this post, we break down every KSI theme, categorize each indicator by validation approach, and provide a practical gap analysis framework so you can begin preparing your cloud service offering (CSO) for FedRAMP 20x authorization on Amazon Web Services (AWS).
How The Coupon Bureau built a real-time universal digital coupon platform on AWS
The Coupon Bureau (TCB) is modernizing how coupons work in the US with the first universal digital coupon standard: AI (8112). To make this possible, TCB needed to build a system that could support billions of coupon transactions in real time, integrate seamlessly with hundreds of providers and retailers, and deliver mission-critical reliability. By using AWS services, TCB designed a cloud-based architecture that processes high-volume events, delivers real-time webhook notifications, provides data recovery during outages, and scales seamlessly as adoption grows.
Building a machine learning operations framework with Amazon SageMaker: Technical Safety BC’s journey
Technical Safety BC (TSBC) regulates the safe installation and operation of technical systems (electrical, gas, boiler, elevator, and more) in British Columbia. This post showcases how TSBC built a machine learning operations (MLOps) solution using AWS to streamline production model training and management so it can process public safety inquiries more efficiently.
AWS Professional Services collaborates with NOAA and GAMA-1 Technologies to expand NESDIS Common Cloud Framework capabilities
AWS Professional Services has partnered with the National Environmental Satellite, Data, and Information Service (NESDIS), part of the National Oceanic and Atmospheric Administration (NOAA), alongside GAMA-1 Technologies to broaden the scope and capabilities of the NESDIS Common Cloud Framework (NCCF). This work supports NOAA’s ability to ingest, manage, process, and disseminate critical environmental data. Read this post to learn more.
How NIH scientists unlocked cardiovascular disease insights using AWS
Scientists at the National Institutes of Health (NIH) recently uncovered how a structure known as low-density lipoprotein (LDL), which transports “bad” cholesterol through the bloodstream, interacts with its receptor molecule to enter cells—information that has eluded researchers for decades. The findings could lead to more personalized treatments for cardiovascular disease and were enabled by cutting-edge high performance computing (HPC) infrastructure from AWS. Read this post to learn more.
Leverage generative AI for biocuration using Amazon Bedrock and Amazon Nova foundation models
Personalized therapy for diseases such as cancer utilizes an individual’s unique genomic profile to guide treatment decisions. However, the effect and clinical significance of most genetic variants are uncertain. Accurate classification of the clinical significance of novel genetic variants requires extensive curation of peer-reviewed biomedical literature. In recent years, generative AI has demonstrated promising results in information extraction and text summarization. In this post, we explore how various AWS-native solutions can be used to create a secure, retrieval-augmented, and cost-effective biomedical chatbot designed to facilitate biocuration.
How Fair Trade USA uses AWS to improve working conditions for farmers
Fair Trade USA™ is a nonprofit organization committed to eliminating poverty by promoting sustainable development through ethical trade. The organization works to ensure fair compensation and safe working conditions for farmers and workers, as well as sustainable farming practices. In this post, you’ll learn how Fair Trade USA uses Amazon Web Services (AWS) to improve working conditions for farmers and producers around the world.
Amazon EC2 Spot Instances for scientific workflows: Using generative AI to assess availability
In recent years, public sector organizations have found success running their scientific data processing workloads on Amazon Web Services (AWS). As the number of workloads grows alongside massive data volumes and complex scientific simulations, organizations are looking for ways to optimize cost while maintaining research momentum. Amazon Elastic Compute Cloud (Amazon EC2) Spot Instances present a compelling option: they run on spare Amazon EC2 capacity at discounts of up to 90 percent compared to On-Demand prices. However, the interruptible nature of Spot Instances requires careful consideration, especially for time-sensitive, mission-critical workloads. In this post, we discuss how organizations can use Amazon Q Business to identify opportunities for Spot Instances and develop an enhanced Spot Instance availability analysis.
Test and integrate ground segment with AWS Ground Station digital twin
Amazon Web Services (AWS) customers building software-defined ground segment solutions with AWS Ground Station can now work with more confidence: they can integrate their DevOps practices with the AWS Ground Station digital twin feature, which became generally available in August. The digital twin helps both aspiring and existing AWS Ground Station customers achieve faster outcomes without applying for satellite licensing, and more cost-effectively than scheduling a production satellite contact. Read this post to learn more.
Securely running AI algorithms for 100,000 users on private data
This post explores the architectural design and security concepts employed by Radboud University Medical Center Nijmegen (Radboudumc) to build a secure artificial intelligence (AI) runtime environment on Amazon Web Services (AWS). Business leaders working with sensitive or regulated data will find this post valuable because it demonstrates a proven approach to harnessing the power of AI while maintaining strict data privacy and security standards.