Migration & Modernization
Enabling AI agents to interact with existing applications
When I talk to organizations looking to adopt AI-driven workflows, one question keeps coming up: how can we make the applications we’ve already invested in usable by AI agents? The answer lies in a pattern that bridges modern AI with traditional business systems: exposing your existing applications as tools via the Model Context Protocol (MCP). MCP is an open standard that enables AI models to interact with external tools, data sources, and applications, effectively serving as a bridge between AI systems and the broader digital ecosystem.
In this blog, I will walk through a 4-step process for modernizing existing REST APIs and making their capabilities available to an AI agent, using MCP and AWS-native tools like Amazon Elastic Container Service (Amazon ECS) and AWS Fargate, along with the open source Strands Agents SDK. Oh, and I will do this with fewer than 100 lines of new code and absolutely no code changes to the existing application.
Step 1: Understanding Your Application’s API Surface
The key to success with this approach is having a well-documented REST API with an OpenAPI specification. For this example, I will use the Swagger Petstore, a widely used sample API that demonstrates OpenAPI documentation and testing. It presents a simplified e-commerce scenario where users interact with a pet store through various REST endpoints. This pattern, however, works with any application that exposes an OpenAPI specification. From legacy CRM systems to modern microservices driving single-page applications through REST endpoints, the pattern is the same.
The kinds of functionality that AI agents can leverage include:
- Creating, querying and updating records based on natural language requests
- Performing complex workflows that span multiple API calls
- Generating insights from existing data combined with the underlying model’s general knowledge
The documentation within the API specification is critical as it explains to the AI Agent what each API endpoint does and how to interact with it.
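For example, a well-documented operation in the Petstore’s OpenAPI specification looks roughly like the following (abridged; the exact wording varies by spec version). It is these summary, description, and parameter fields that the agent reads when deciding whether and how to call an endpoint:

```yaml
paths:
  /pet/findByStatus:
    get:
      summary: Finds Pets by status
      description: Multiple status values can be provided with comma separated strings
      operationId: findPetsByStatus
      parameters:
        - name: status
          in: query
          description: Status values that need to be considered for filter
          schema:
            type: string
            enum: [available, pending, sold]
```

The richer these descriptions are, the better the agent can match natural language requests to the right endpoint.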
Step 2: Wrap Your API with FastMCP
To make any REST API usable by AI agents, we need to expose it as tools via the Model Context Protocol (MCP). FastMCP is a lightweight server that takes an OpenAPI specification and automatically converts API endpoints into tools that AI agents can understand and call.
Here’s how we create an MCP server that wraps any existing API:
import os

import httpx
from fastmcp import FastMCP
from fastmcp.server.openapi import RouteMap, MCPType

# Get the API location from environment variables or use defaults
API_BASE_URL = os.environ.get("API_BASE_URL", "https://petstore3.swagger.io")
API_PATH = os.environ.get("API_PATH", "/api/v3/openapi.json")

# Create an HTTP client for your API.
# Note: base_url must be the API root that the spec's paths resolve
# against, not the URL of the OpenAPI spec document itself.
client = httpx.AsyncClient(base_url=API_BASE_URL)

# Load your OpenAPI spec
openapi_spec = httpx.get(f"{API_BASE_URL}{API_PATH}").json()

# Create the MCP server
mcp = FastMCP.from_openapi(
    openapi_spec=openapi_spec,
    route_maps=[
        RouteMap(mcp_type=MCPType.TOOL)
    ],
    client=client,
    name="Pet Store API Server",
    port=8000,
)

if __name__ == "__main__":
    mcp.run(transport="streamable-http", host="0.0.0.0")
This approach keeps the MCP server stateless and configuration driven. Simply point it at any API with an OpenAPI specification, and it automatically creates the tool interface that AI agents need.
*Note: The RouteMap section is required because, by default, GET endpoints without arguments map to MCP resources rather than tools, and resources are, at the time of this writing, not supported by the Strands Agents SDK. The RouteMap forces every endpoint to be exposed as a tool.
Step 3: Deploying to Amazon Elastic Container Service and AWS Fargate
To deploy both your existing application and the MCP server together, I will use Amazon Elastic Container Service (Amazon ECS) and AWS Fargate with a sidecar pattern. A container sidecar is a pattern where a helper container runs next to your main application container, sharing the same task and network space. This allows your legacy application and the MCP server to run in the same task definition using awsvpc networking, ensuring low-latency communication over localhost while maintaining separation of concerns.
A Dockerfile for the MCP server is used to build the container image, which is then pushed to Amazon Elastic Container Registry (Amazon ECR).
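As an illustration, a minimal Dockerfile for the MCP server might look like the following sketch. The server.py filename and the unpinned dependency versions are assumptions for this example, not details from the original deployment:

```dockerfile
FROM python:3.12-slim

WORKDIR /app

# Install the MCP server dependencies
RUN pip install --no-cache-dir fastmcp httpx

# server.py is the FastMCP wrapper shown in Step 2
COPY server.py .

EXPOSE 8000

CMD ["python", "server.py"]
```

In production you would typically pin dependency versions and run as a non-root user.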
Here’s the key CloudFormation snippet that defines the container setup:
TaskDefinition:
  Type: AWS::ECS::TaskDefinition
  Properties:
    Family: !Sub ${EnvironmentName}-task
    Cpu: '2048'
    Memory: '4096'
    NetworkMode: awsvpc
    RequiresCompatibilities:
      - FARGATE
    ExecutionRoleArn: !Ref ECSTaskExecutionRole
    TaskRoleArn: !Ref ECSTaskRole
    ContainerDefinitions:
      - Name: swagger-petstore
        Image: my-ecr-repo/swagger-petstore:latest
        PortMappings:
          - ContainerPort: 8080
        HealthCheck:
          Command:
            - CMD-SHELL
            - "curl -f http://localhost:8080/api/v3/openapi.json || exit 1"
          StartPeriod: 60
      - Name: fast-mcp
        Image: my-ecr-repo/fast-mcp:latest
        DependsOn:
          - ContainerName: swagger-petstore
            Condition: HEALTHY
        Environment:
          - Name: API_BASE_URL
            Value: "http://localhost:8080"
        PortMappings:
          - ContainerPort: 8000

ECSService:
  Type: AWS::ECS::Service
  DependsOn:
    - ALBHTTPListener
    - FastMCPListenerRule
  Properties:
    ServiceName: !Sub ${EnvironmentName}-service
    Cluster: !Ref ECSCluster
    TaskDefinition: !Ref TaskDefinition
    DesiredCount: 1
    LaunchType: FARGATE
    NetworkConfiguration:
      AwsvpcConfiguration:
        AssignPublicIp: DISABLED
        SecurityGroups:
          - !Ref ECSSecurityGroup
        Subnets:
          - !Ref PrivateSubnet1
          - !Ref PrivateSubnet2
    LoadBalancers:
      - ContainerName: swagger-petstore
        ContainerPort: 8080
        TargetGroupArn: !Ref PetstoreTargetGroup
      - ContainerName: fast-mcp
        ContainerPort: 8000
        TargetGroupArn: !Ref FastMCPTargetGroup
    HealthCheckGracePeriodSeconds: 300
This CloudFormation configuration defines both containers in a single ECS task with awsvpc networking. They share the same network interface and can communicate via localhost internally, while the MCP server is exposed through an Application Load Balancer for external access.
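The template above references a FastMCPListenerRule. A path-based listener rule along the following lines (a sketch; the priority value and the /mcp/* path pattern are assumptions consistent with the agent configuration later in this post) forwards MCP traffic on the ALB to the fast-mcp target group:

```yaml
FastMCPListenerRule:
  Type: AWS::ElasticLoadBalancingV2::ListenerRule
  Properties:
    ListenerArn: !Ref ALBHTTPListener
    Priority: 10
    Conditions:
      # Route anything under /mcp/ to the MCP sidecar container
      - Field: path-pattern
        Values:
          - /mcp/*
    Actions:
      - Type: forward
        TargetGroupArn: !Ref FastMCPTargetGroup
```

All other paths fall through to the default listener action, which targets the Petstore container.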
Step 4: Building the AI Agent with Strands Agents SDK
Now we’ll create an AI agent using Strands Agents SDK that can interact with your legacy application through the MCP server. Strands provides a clean interface for connecting to MCP servers and integrating with Amazon Bedrock:
import os

from dotenv import load_dotenv
from mcp.client.streamable_http import streamablehttp_client
from strands import Agent
from strands.models import BedrockModel
from strands.tools.mcp import MCPClient


def main():
    load_dotenv()
    mcp_server_url = os.environ.get("MCP_SERVER_URL", "http://myalb.elb.amazonaws.com/mcp/")

    bedrock_model = BedrockModel(
        model_id="us.amazon.nova-micro-v1:0",
        temperature=0.3,
        streaming=True,
    )

    print(f"\nConnecting to MCP Server at {mcp_server_url}...")
    mcp_client = MCPClient(lambda: streamablehttp_client(mcp_server_url))

    with mcp_client:
        # Discover the tools exposed by the MCP server
        tools = mcp_client.list_tools_sync()
        agent = Agent(model=bedrock_model, tools=tools)

        print("🧠 Pet Store Agent CLI - Type 'exit' to quit")
        while True:
            prompt = input("🗣️ You: ")
            if prompt.lower() in {"exit", "quit"}:
                print("👋 Bye!")
                break
            response = agent(prompt)
            print("🤖 Agent:", response)


if __name__ == "__main__":
    main()
This agent automatically discovers all available tools from your MCP server and makes them available to the AI model. The agent can now understand natural language requests and translate them into appropriate API calls.
Example Intelligent Interaction
Here’s what a conversation with your modernized application might look like when running the agent code above:
So, what is going on here?
The user is interacting with the AI agent created with the Strands Agents SDK, which is using the Amazon Nova model via Amazon Bedrock. When the user asks about adopting a pet, instead of relying on its general knowledge and directing the user to public pet adoption websites, the agent is aware of the tools that have been made available through the MCP server. Based on the OpenAPI documentation, the agent determines that it can find pets available for adoption, so it calls the existing application’s REST endpoints and retrieves a list of pets. The agent then combines data from the legacy application with the model’s general knowledge to describe the dog breed’s characteristics.
Making It Work with Your Applications
To adapt this pattern for your existing applications:
- Identify APIs with OpenAPI specs – Look for applications that already expose REST APIs with documentation
- Evaluate which APIs agents should access – You may not want to expose all functionality to agents
- Configure the MCP server – Point the FastMCP server at your API’s OpenAPI specification or a modified specification that limits which APIs are to be used by agents
- Handle authentication & authorization – Configure appropriate authentication mechanisms for your API. This is critical as you do not want to allow an agent to circumvent existing authentication & authorization.
- Deploy using the sidecar pattern – Use the CloudFormation template snippet above as a starting point
- Test and iterate – Start with simple interactions and gradually expand the agent’s capabilities
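One way to limit which endpoints agents can use, as suggested in the list above, is to filter the OpenAPI specification before handing it to FastMCP. The helper below sketches that idea; the function name and the allow-list approach are my own illustration, not part of FastMCP:

```python
def filter_openapi_paths(spec: dict, allowed_paths: set) -> dict:
    """Return a copy of an OpenAPI spec containing only the allowed paths."""
    filtered = dict(spec)
    filtered["paths"] = {
        path: operations
        for path, operations in spec.get("paths", {}).items()
        if path in allowed_paths
    }
    return filtered

# Example: only expose read-oriented pet endpoints to the agent
spec = {
    "openapi": "3.0.0",
    "info": {"title": "Pet Store", "version": "1.0"},
    "paths": {
        "/pet/findByStatus": {"get": {"summary": "Finds pets by status"}},
        "/pet/{petId}": {"get": {"summary": "Find pet by ID"}},
        "/store/order": {"post": {"summary": "Place an order"}},
    },
}
limited = filter_openapi_paths(spec, {"/pet/findByStatus", "/pet/{petId}"})
print(sorted(limited["paths"]))  # ['/pet/findByStatus', '/pet/{petId}']
```

The filtered dictionary can then be passed as the openapi_spec argument to FastMCP.from_openapi from Step 2, so the agent never sees the excluded endpoints.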
Benefits of This Approach
This pattern offers several key advantages:
- Minimal changes to existing applications – Your legacy systems continue running unchanged
- Incremental modernization – Add AI capabilities without a complete rewrite
- Scalable architecture – The sidecar pattern allows independent scaling and updates
- Secure by design – Authentication and authorization remain with your existing systems
- Cost-effective – Leverage existing investments while adding new capabilities
Next Steps
To get started with modernizing your own applications:
- Review the Strands Agents SDK documentation for detailed implementation guidance
- Explore Amazon Bedrock for available AI models and regional availability
- Identify applications in your organization with well-documented REST APIs
- Start with a proof of concept using a non-critical application
- Consider authentication and security requirements for production deployment
- Consider limiting which API endpoints the agent has access to through the FastMCP configuration options
Conclusion
By combining MCP and the Strands Agents SDK, you can breathe new life into existing applications without major architectural changes. This approach allows you to modernize gradually while preserving your investments in proven business systems.
The pattern I’ve demonstrated here provides a foundation that scales from simple proof-of-concepts to enterprise-grade deployments. As AI capabilities continue to evolve, having this integration layer in place positions your organization to quickly adopt new features and capabilities.
Remember that this is just the beginning. Once you have AI agents that can interact with your existing applications, you can explore more sophisticated workflows, multi-system integrations, and autonomous business processes that were previously impossible or prohibitively expensive to implement.