AWS Public Sector Blog
Tag: technical how-to
How public authorities can improve the freedom of information request process using Amazon Bedrock
Many public sector agencies consist of multiple departments, each with its own functions. This can introduce administrative delays when processing incoming requests, because the ever-growing volume of paperwork must be manually routed to the correct destination. This blog explores how Amazon Bedrock can address these challenges by classifying documents based on their key topics and distributing them appropriately. In particular, we focus on improving the efficiency of the freedom of information (FOI) request process, but this solution can be applied to various public sector use cases.
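As a minimal sketch of the classification step this post describes, the snippet below asks an Amazon Bedrock model (via the Converse API in boto3) to assign an incoming FOI request to one of several departments. The department names, model ID, and prompt wording are illustrative assumptions rather than details from the post.

```python
import boto3

# Hypothetical department names; a real deployment would use the agency's own routing taxonomy.
DEPARTMENTS = ["Housing", "Transport", "Education", "Public Health"]

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

def classify_foi_request(document_text: str) -> str:
    """Ask a Bedrock model to pick the best-matching department for an FOI request."""
    prompt = (
        "Classify the following freedom of information request into exactly one of these "
        f"departments: {', '.join(DEPARTMENTS)}. Reply with the department name only.\n\n"
        f"Request:\n{document_text}"
    )
    response = bedrock.converse(
        modelId="anthropic.claude-3-haiku-20240307-v1:0",  # example model ID
        messages=[{"role": "user", "content": [{"text": prompt}]}],
        inferenceConfig={"maxTokens": 50, "temperature": 0},
    )
    return response["output"]["message"]["content"][0]["text"].strip()

print(classify_foi_request("Please provide all correspondence about the new bus route consultation."))
```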
No-code AI development: Using Amazon SageMaker AI and Amazon Kendra for smart search chatbots
In this post, we walk through creating a Retrieval Augmented Generation (RAG)–powered chat assistant using Amazon SageMaker AI and Amazon Kendra to query donor data on AWS.
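A minimal sketch of the retrieval-and-generation loop behind such a chat assistant is shown below: Amazon Kendra's Retrieve API supplies relevant passages, which are then passed to an LLM hosted on a SageMaker real-time endpoint. The index ID, endpoint name, and request payload format are assumptions that depend on the model you deploy.

```python
import json
import boto3

kendra = boto3.client("kendra")
smr = boto3.client("sagemaker-runtime")

def rag_answer(question: str, index_id: str, endpoint_name: str) -> str:
    """Retrieve passages with Amazon Kendra, then generate an answer with a SageMaker-hosted LLM."""
    # Retrieve the most relevant passages for the question.
    results = kendra.retrieve(IndexId=index_id, QueryText=question)
    context = "\n\n".join(item["Content"] for item in results["ResultItems"][:5])

    # The prompt and payload format depend on the model deployed behind the endpoint; this is an assumption.
    payload = {"inputs": f"Context:\n{context}\n\nQuestion: {question}\nAnswer:"}
    response = smr.invoke_endpoint(
        EndpointName=endpoint_name,
        ContentType="application/json",
        Body=json.dumps(payload),
    )
    return response["Body"].read().decode("utf-8")

# Example usage with placeholder identifiers.
# print(rag_answer("Which donors gave more than $10,000 last year?", "my-kendra-index-id", "my-llm-endpoint"))
```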
Detect and investigate Amazon EC2 malware with Amazon GuardDuty and Amazon Detective
In this post, we demonstrate how to use the advanced malware detection features of Amazon GuardDuty to uncover malicious and suspicious files compromising your Amazon Elastic Compute Cloud (Amazon EC2) instances. We use the investigative capabilities of Amazon Detective to gain deeper insights into the security event. After the key questions about the security event are addressed, we outline steps to remediate the potentially compromised EC2 instance.
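As a hedged illustration of the detection side, the sketch below starts an on-demand GuardDuty malware scan of an EC2 instance's attached volumes and then lists malware-related findings for follow-up investigation in Detective. The detector ID, instance ARN, and finding type filter are example values.

```python
import boto3

guardduty = boto3.client("guardduty")

# Placeholder identifiers; substitute your own detector ID and instance ARN.
DETECTOR_ID = "example-detector-id"
INSTANCE_ARN = "arn:aws:ec2:us-east-1:111122223333:instance/i-0123456789abcdef0"

# Start an on-demand malware scan of the EBS volumes attached to the instance.
scan = guardduty.start_malware_scan(ResourceArn=INSTANCE_ARN)
print("Started scan:", scan["ScanId"])

# Later, list malware-related findings for triage and deeper investigation in Detective.
findings = guardduty.list_findings(
    DetectorId=DETECTOR_ID,
    FindingCriteria={
        "Criterion": {
            "type": {"Eq": ["Execution:EC2/MaliciousFile"]}  # example finding type
        }
    },
)
print("Finding IDs:", findings["FindingIds"])
```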
Build a secure AWS foundation in under 60 minutes: A guide for public sector organizations
In this blog, we will guide you through the process of setting up a secure multi-account AWS environment using AWS Control Tower, AWS IAM Identity Center, and AWS Organizations, and will show you how to secure your environment using AWS Config, AWS Security Hub, and Amazon GuardDuty.
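Control Tower itself is set up through the console, but as a rough sketch of the per-account security baseline mentioned above, the following snippet shows how Amazon GuardDuty and AWS Security Hub can be enabled programmatically in a single account; the region and standards choices are assumptions.

```python
import boto3

region = "us-east-1"

# Enable Amazon GuardDuty threat detection in this account and Region.
guardduty = boto3.client("guardduty", region_name=region)
detector = guardduty.create_detector(Enable=True)
print("GuardDuty detector:", detector["DetectorId"])

# Enable AWS Security Hub with its default standards (for example, AWS Foundational Security Best Practices).
securityhub = boto3.client("securityhub", region_name=region)
securityhub.enable_security_hub(EnableDefaultStandards=True)
print("Security Hub enabled")
```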
Customizing isolated JupyterLab environments in Amazon SageMaker Studio
This post demonstrates how to enhance security and compliance in an isolated SageMaker JupyterLab environment by implementing two key customizations: configuring download restrictions and implementing secure Python package installation through AWS CodeArtifact.
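A minimal sketch of the CodeArtifact piece is shown below: it fetches a short-lived authorization token and points pip at the private repository so package installs stay inside your AWS environment. The domain, repository, account ID, and Region are placeholder values.

```python
import subprocess
import boto3

# Placeholder CodeArtifact domain and repository details.
DOMAIN = "my-domain"
DOMAIN_OWNER = "111122223333"
REPOSITORY = "pypi-store"
REGION = "us-east-1"

codeartifact = boto3.client("codeartifact", region_name=REGION)

# Fetch a short-lived authorization token for the private repository.
token = codeartifact.get_authorization_token(
    domain=DOMAIN, domainOwner=DOMAIN_OWNER
)["authorizationToken"]

# Point pip at the CodeArtifact PyPI endpoint instead of the public index.
index_url = (
    f"https://aws:{token}@{DOMAIN}-{DOMAIN_OWNER}.d.codeartifact."
    f"{REGION}.amazonaws.com/pypi/{REPOSITORY}/simple/"
)
subprocess.run(["pip", "config", "set", "global.index-url", index_url], check=True)
```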
Efficient large-scale serverless data processing for slow downstream systems
This post demonstrates how to build serverless workflows on AWS that process large volumes of data using AWS Step Functions and integrate with downstream systems that have concurrency limitations.
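As an illustrative sketch (not the post's exact workflow), the snippet below creates a Step Functions state machine whose Map state caps fan-out with MaxConcurrency, so a slow downstream system never receives more than a fixed number of concurrent calls. The Lambda function ARN, role ARN, and concurrency limit are placeholders.

```python
import json
import boto3

# Amazon States Language definition: a Map state that throttles fan-out so the
# downstream system is never called by more than 5 workers at a time.
definition = {
    "StartAt": "ProcessItems",
    "States": {
        "ProcessItems": {
            "Type": "Map",
            "MaxConcurrency": 5,
            "ItemsPath": "$.items",
            "ItemProcessor": {
                "ProcessorConfig": {"Mode": "INLINE"},
                "StartAt": "CallDownstream",
                "States": {
                    "CallDownstream": {
                        "Type": "Task",
                        "Resource": "arn:aws:lambda:us-east-1:111122223333:function:call-downstream",
                        "End": True,
                    }
                },
            },
            "End": True,
        }
    },
}

sfn = boto3.client("stepfunctions")
sfn.create_state_machine(
    name="throttled-processing",
    definition=json.dumps(definition),
    roleArn="arn:aws:iam::111122223333:role/StepFunctionsExecutionRole",
)
```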
Chaos engineering made clear: Generate AWS FIS experiments using natural language through Amazon Bedrock
In this post, we demonstrate how to use the generative AI capabilities of Amazon Bedrock to streamline the creation of AWS Fault Injection Service (FIS) experiments.
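A rough sketch of the idea, under assumed model and role identifiers, is shown below: a natural-language description is sent to a Bedrock model, which drafts the targets, actions, and stop conditions for an FIS experiment template, and the result is registered with the FIS API after review.

```python
import json
import uuid
import boto3

bedrock = boto3.client("bedrock-runtime")
fis = boto3.client("fis")

description = "Stop 50 percent of EC2 instances tagged Environment=test for 5 minutes."

# Ask a Bedrock model to draft the experiment template body as JSON only.
response = bedrock.converse(
    modelId="anthropic.claude-3-sonnet-20240229-v1:0",  # example model ID
    messages=[{
        "role": "user",
        "content": [{"text": (
            "Produce only a JSON object with 'targets', 'actions', and 'stopConditions' "
            f"for an AWS FIS experiment template that does the following: {description}"
        )}],
    }],
    inferenceConfig={"temperature": 0},
)
template = json.loads(response["output"]["message"]["content"][0]["text"])

# Review the generated template before creating it, then register it with FIS.
fis.create_experiment_template(
    clientToken=str(uuid.uuid4()),
    description=description,
    roleArn="arn:aws:iam::111122223333:role/FisExperimentRole",
    targets=template["targets"],
    actions=template["actions"],
    stopConditions=template["stopConditions"],
)
```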
Integrate AI-powered coding assistance in secure environments using Continue and Amazon Bedrock
Organizations adopting modern software development practices continue to embrace the advantages of AI and large language models (LLMs) to maximize developer productivity. Amazon Q Developer provides an AI coding companion that developers can access directly within their integrated development environment (IDE). In this post, we walk you through an example that uses the power of Amazon Bedrock to provide a coding assistant in your IDE.
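The post itself configures the Continue extension; as a simple, hedged illustration of the underlying call such an assistant makes on your behalf, the snippet below sends a coding request to an Amazon Bedrock model with the Converse API. The model ID and prompt are example values.

```python
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

# The kind of request an IDE coding assistant issues on the developer's behalf.
response = bedrock.converse(
    modelId="anthropic.claude-3-5-sonnet-20240620-v1:0",  # example model ID
    messages=[{
        "role": "user",
        "content": [{"text": "Write a Python function that validates an email address."}],
    }],
    inferenceConfig={"maxTokens": 512, "temperature": 0.2},
)
print(response["output"]["message"]["content"][0]["text"])
```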
Get started quickly with Wickr Enterprise Embedded Cluster
In this blog, we provide deployment guidance that is intended to help you get started quickly with Wickr Enterprise for capability testing.
How to use data from the AWS Open Data program in Amazon Bedrock
Many government agencies, like the National Oceanic and Atmospheric Administration (NOAA), participate in the AWS Open Data Sponsorship Program. In this post, we discuss how to use NOAA datasets from the Registry of Open Data on AWS with Amazon Bedrock Knowledge Bases.
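As a minimal sketch of querying such a knowledge base once the NOAA data has been ingested, the snippet below uses the Bedrock Knowledge Bases RetrieveAndGenerate API; the knowledge base ID, model ARN, and question are placeholders.

```python
import boto3

agent_runtime = boto3.client("bedrock-agent-runtime")

# Placeholder identifiers for a knowledge base indexed over NOAA open data.
KNOWLEDGE_BASE_ID = "EXAMPLEKBID"
MODEL_ARN = "arn:aws:bedrock:us-east-1::foundation-model/anthropic.claude-3-sonnet-20240229-v1:0"

# Retrieve relevant passages from the knowledge base and generate a grounded answer.
response = agent_runtime.retrieve_and_generate(
    input={"text": "What was the highest recorded sea surface temperature in 2023?"},
    retrieveAndGenerateConfiguration={
        "type": "KNOWLEDGE_BASE",
        "knowledgeBaseConfiguration": {
            "knowledgeBaseId": KNOWLEDGE_BASE_ID,
            "modelArn": MODEL_ARN,
        },
    },
)
print(response["output"]["text"])
```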