Migration & Modernization

Enabling AI agents to interact with existing applications

When I talk to organizations looking to adopt AI-driven workflows, one question keeps coming up: how can we take the applications we’ve already invested in and make them usable by AI agents? The answer lies in a pattern that bridges modern AI and traditional business systems: expose your existing applications as tools via the Model Context Protocol (MCP). MCP is an open standard that enables AI models to interact with external tools, data sources, and applications, effectively serving as a bridge between AI systems and the broader digital ecosystem.

In this post, I will walk through a four-step process for modernizing existing REST APIs and making their capabilities available to an AI agent using MCP, AWS-native services like Amazon Elastic Container Service (Amazon ECS) and AWS Fargate, and the open source Strands Agents SDK. Oh, and I will do this with fewer than 100 lines of new code and absolutely no code changes to the existing application.

Step 1: Understanding Your Application’s API Surface

The key to success with this approach is having a well-documented REST API with an OpenAPI specification. For this example, I will use the Swagger Petstore, a widely used sample API built to demonstrate OpenAPI documentation and testing. It presents a simplified e-commerce scenario where users can interact with a pet store through various REST endpoints. However, this pattern works with any application that exposes an OpenAPI specification. From legacy CRM systems to modern microservices driving single-page applications through REST endpoints, the pattern is the same.

The kinds of functionality that AI agents can leverage include:

  • Creating, querying, and updating records based on natural language requests
  • Performing complex workflows that span multiple API calls
  • Generating insights by combining existing data with the underlying model’s general knowledge

The documentation within the API specification is critical: it tells the AI agent what each API endpoint does and how to interact with it.
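
To make this concrete, here is a minimal sketch (standard library only, with a hypothetical fragment modeled on the Petstore spec) of the per-endpoint metadata an agent-facing tool is derived from:

```python
import json

# Hypothetical fragment in the shape of an OpenAPI spec; a real spec
# would be fetched from the application (e.g. /api/v3/openapi.json)
SPEC = json.loads("""
{
  "paths": {
    "/pet/findByStatus": {
      "get": {
        "operationId": "findPetsByStatus",
        "summary": "Finds Pets by status",
        "description": "Multiple status values can be provided"
      }
    }
  }
}
""")

def describe_operations(spec):
    """Collect the documentation an AI agent relies on for each endpoint."""
    ops = {}
    for path, methods in spec.get("paths", {}).items():
        for method, op in methods.items():
            ops[op["operationId"]] = {
                "method": method.upper(),
                "path": path,
                "doc": op.get("summary", "") or op.get("description", ""),
            }
    return ops

print(describe_operations(SPEC))
```

The richer the summary and description fields are, the better the agent can decide when and how to call each endpoint.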

Step 2: Wrap Your API with FastMCP

To make any REST API usable by AI agents, we need to expose it as tools via the Model Context Protocol (MCP). FastMCP is a lightweight server that takes an OpenAPI specification and automatically converts API endpoints into tools that AI agents can understand and call.

Here’s how we create an MCP server that wraps any existing API:

import httpx
import os
from fastmcp import FastMCP
from fastmcp.server.openapi import RouteMap, MCPType

# Get the API URL from environment variables or use defaults
API_BASE_URL = os.environ.get("API_BASE_URL", "https://petstore3.swagger.io")
API_PATH = os.environ.get("API_PATH", "/api/v3/openapi.json")

# Load your OpenAPI spec
openapi_spec = httpx.get(f"{API_BASE_URL}{API_PATH}").json()

# Resolve the API's base path from the spec's servers entry (e.g. "/api/v3")
server_path = openapi_spec.get("servers", [{}])[0].get("url", "")

# Create an HTTP client pointed at the API itself, not at the spec document
client = httpx.AsyncClient(base_url=f"{API_BASE_URL}{server_path}")

# Create the MCP server
mcp = FastMCP.from_openapi(
    openapi_spec=openapi_spec,
    route_maps=[
        RouteMap(mcp_type=MCPType.TOOL)
    ],
    client=client,
    name="Pet Store API Server"
)

if __name__ == "__main__":
    mcp.run(transport='streamable-http', host='0.0.0.0', port=8000)

This approach keeps the MCP server stateless and configuration-driven. Simply point it at any API with an OpenAPI specification, and it automatically creates the tool interface that AI agents need.

Note: The RouteMap section is required because, by default, GET endpoints without arguments map to MCP resources rather than tools, and resources are, at the time of this writing, not supported by the Strands Agents SDK.

Step 3: Deploying to Amazon Elastic Container Service and AWS Fargate

To deploy both your existing application and the MCP server together, I will use Amazon Elastic Container Service (Amazon ECS) and AWS Fargate with a sidecar pattern. A container sidecar is a pattern where a helper container runs alongside your main application container, sharing the same task and network namespace. This allows your legacy application and the MCP server to run in the same task definition using awsvpc networking, ensuring low-latency communication while maintaining separation of concerns.

Here’s the Dockerfile for the MCP server. It is used to build the container image, which is then pushed to Amazon Elastic Container Registry (Amazon ECR):

FROM python:3.13-slim

WORKDIR /app

# Copy requirements file
COPY requirements.txt .

# Install dependencies
RUN pip install --no-cache-dir -r requirements.txt

# Copy application code
COPY my_mcp_server.py .

# Expose the port the MCP server listens on
EXPOSE 8000

# Command to run the application
CMD ["python", "my_mcp_server.py"]
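
The requirements.txt copied above is not shown in the post; for the server code in Step 2, a minimal version would plausibly be just:

```text
fastmcp
httpx
```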

Here’s the key CloudFormation snippet that defines the container setup:

  TaskDefinition:
    Type: AWS::ECS::TaskDefinition
    Properties:
      Family: !Sub ${EnvironmentName}-task
      Cpu: '2048'
      Memory: '4096'
      NetworkMode: awsvpc
      RequiresCompatibilities:
        - FARGATE
      ExecutionRoleArn: !Ref ECSTaskExecutionRole
      TaskRoleArn: !Ref ECSTaskRole
      ContainerDefinitions:
        - Name: swagger-petstore
          Image: my-ecr-repo/swagger-petstore:latest
          PortMappings:
            - ContainerPort: 8080
          HealthCheck:
            Command:
              - CMD-SHELL
              - "curl -f http://localhost:8080/api/v3/openapi.json || exit 1"
            StartPeriod: 60
        - Name: fast-mcp
          Image: my-ecr-repo/fast-mcp:latest
          DependsOn:
            - ContainerName: swagger-petstore
              Condition: HEALTHY
          Environment:
            - Name: API_BASE_URL
              Value: "http://localhost:8080"
          PortMappings:
            - ContainerPort: 8000

  ECSService:
    Type: AWS::ECS::Service
    DependsOn:
      - ALBHTTPListener
      - FastMCPListenerRule
    Properties:
      ServiceName: !Sub ${EnvironmentName}-service
      Cluster: !Ref ECSCluster
      TaskDefinition: !Ref TaskDefinition
      DesiredCount: 1
      LaunchType: FARGATE
      NetworkConfiguration:
        AwsvpcConfiguration:
          AssignPublicIp: DISABLED
          SecurityGroups:
            - !Ref ECSSecurityGroup
          Subnets:
            - !Ref PrivateSubnet1
            - !Ref PrivateSubnet2
      LoadBalancers:
        - ContainerName: swagger-petstore
          ContainerPort: 8080
          TargetGroupArn: !Ref PetstoreTargetGroup
        - ContainerName: fast-mcp
          ContainerPort: 8000
          TargetGroupArn: !Ref FastMCPTargetGroup
      HealthCheckGracePeriodSeconds: 300

This CloudFormation configuration defines both containers in a single ECS task with awsvpc networking. They share the same network interface and can communicate via localhost internally, while the MCP server is exposed through an Application Load Balancer for external access.
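
The FastMCPListenerRule the service depends on is not shown above; a plausible sketch, assuming a path-based rule on the shared ALB listener that routes /mcp traffic to the MCP server’s target group (the priority value is arbitrary):

```yaml
  FastMCPListenerRule:
    Type: AWS::ElasticLoadBalancingV2::ListenerRule
    Properties:
      ListenerArn: !Ref ALBHTTPListener
      Priority: 10
      Conditions:
        - Field: path-pattern
          PathPatternConfig:
            Values:
              - /mcp*
      Actions:
        - Type: forward
          TargetGroupArn: !Ref FastMCPTargetGroup
```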

Step 4: Building the AI Agent with Strands Agents SDK

Now we’ll create an AI agent using Strands Agents SDK that can interact with your legacy application through the MCP server. Strands provides a clean interface for connecting to MCP servers and integrating with Amazon Bedrock:

from strands import Agent
from strands.tools.mcp import MCPClient
from mcp.client.streamable_http import streamablehttp_client
from strands.types.exceptions import MCPClientInitializationError
from strands.models import BedrockModel
import os
from dotenv import load_dotenv

def main():
    load_dotenv()
    
    mcp_server_url = os.environ.get("MCP_SERVER_URL", "http://myalb.elb.amazonaws.com/mcp/")

    bedrock_model = BedrockModel(
      model_id="us.amazon.nova-micro-v1:0",
      temperature=0.3,
      streaming=True, 
    )

    print(f"\nConnecting to MCP Server at {mcp_server_url}...")

    mcp_client = MCPClient(lambda: streamablehttp_client(mcp_server_url))

    with mcp_client:
        tools = mcp_client.list_tools_sync()
        agent = Agent(model=bedrock_model, tools=tools)
    
        print("🧠 Pet Store Agent CLI - Type 'exit' to quit")
        while True:
            prompt = input("🗣️  You: ")
            if prompt.lower() in {"exit", "quit"}:
                print("👋 Bye!"); break
            response = agent(prompt)
            print("🤖 Agent:", response)

if __name__ == "__main__":
    main()

This agent automatically discovers all available tools from your MCP server and makes them available to the AI model. The agent can now understand natural language requests and translate them into appropriate API calls.

Example Intelligent Interaction

Here’s what a conversation with your modernized application might look like when running the agent code above:

🧠 Pet Store Agent CLI - Type 'exit' to quit
🗣️  You: I would like to adopt a dog that is good with children.
I'd be happy to help you find a dog that's good with children. To find available dogs in the pet store system, I can search for pets with the "available" status, which would show dogs that are ready for adoption.

Let me check for available pets:
Tool #1: findPetsByStatus
I found several available pets, and among them, I found a dog named Biscuit (ID: 4) that's specifically tagged as "Good with children"! This seems like a perfect match for what you're looking for.

Here are the details about Biscuit:
- Name: Biscuit
- Status: Available for adoption
- Tags: 
  - Goldendoodle (breed)
  - 4 years old
  - Good with children

Would you like to get more specific information about Biscuit? I can help you with the adoption process if you're interested in this dog.
🗣️  You: Yes, please get me detailed information about Biscuit as well as any information on the Goldendoodle breed.

🤖 Agent: Here's the detailed information about Biscuit:

**Biscuit's Profile:**
- **Name:** Biscuit
- **ID:** 4
- **Age:** 4 years old
- **Status:** Available for adoption
- **Photos:** Two photos available (url1, url2)
- **Special qualities:** Good with children

**About Goldendoodles:**
Goldendoodles are a popular mixed breed - a cross between a Golden Retriever and a Poodle. They are known for several desirable traits:

1. **Family-friendly:** Goldendoodles typically inherit the gentle, friendly nature of Golden Retrievers, making them excellent family pets and great with children.

2. **Intelligence:** They get their smarts from both parent breeds, making them highly trainable and responsive.

3. **Low-shedding coat:** Many Goldendoodles inherit the Poodle's low-shedding coat, which makes them a good choice for people with mild allergies (though no dog is truly 100% hypoallergenic).

4. **Adaptable size:** Depending on the Poodle parent (standard, miniature, or toy), Goldendoodles can range from medium to large size.

5. **Energy level:** They typically have a moderate to high energy level and enjoy activities with their family. They need regular exercise and mental stimulation.

6. **Lifespan:** Goldendoodles generally live between 10-15 years.

Biscuit, at 4 years old, is in the prime of her life and has likely already developed her adult personality and behavior patterns, which includes being good with children. This is a great age because she's likely past the more challenging puppy phase but still has many years of companionship to offer.

Would you like to proceed with the adoption process for Biscuit or do you have any other questions about her? 

So, what is going on here?

The user is interacting with an AI agent built with the Strands Agents SDK, using an Amazon Nova model through Amazon Bedrock. When the user asks about adopting a pet, instead of relying on general knowledge and pointing to public pet adoption websites, the agent is aware of the tools made available through the MCP server. Based on the OpenAPI documentation, it determines that it can find pets available for adoption, calls the existing application’s REST endpoints, and retrieves a list of pets. The agent then combines data from the legacy application with the model’s general knowledge to describe the dog breed’s characteristics.

Making It Work with Your Applications

To adapt this pattern for your existing applications:

  1. Identify APIs with OpenAPI specs – Look for applications that already expose documented REST APIs
  2. Evaluate which capabilities to expose – You may not want agents to have access to all functionality
  3. Configure the MCP server – Point FastMCP at your API’s OpenAPI specification, or at a modified specification that limits which endpoints agents may use
  4. Handle authentication and authorization – Configure appropriate authentication mechanisms for your API. This is critical: an agent must not be able to circumvent your existing authentication and authorization controls.
  5. Deploy using the sidecar pattern – Use the CloudFormation template snippet above as a starting point
  6. Test and iterate – Start with simple interactions and gradually expand the agent’s capabilities
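
One way to carry out the second and third steps above is to trim the specification before handing it to FastMCP. A minimal sketch (standard library only; the spec fragment and allowlist are hypothetical):

```python
def restrict_spec(spec, allowed_operation_ids):
    """Return a copy of an OpenAPI spec containing only allowlisted operations."""
    trimmed = dict(spec, paths={})
    for path, methods in spec.get("paths", {}).items():
        kept = {
            method: op
            for method, op in methods.items()
            if op.get("operationId") in allowed_operation_ids
        }
        if kept:
            trimmed["paths"][path] = kept
    return trimmed

# Hypothetical spec fragment: expose the read operations, hide the delete endpoint
spec = {
    "openapi": "3.0.0",
    "paths": {
        "/pet/findByStatus": {"get": {"operationId": "findPetsByStatus"}},
        "/pet/{petId}": {
            "get": {"operationId": "getPetById"},
            "delete": {"operationId": "deletePet"},
        },
    },
}

safe_spec = restrict_spec(spec, {"findPetsByStatus", "getPetById"})
```

The trimmed spec can then be passed to FastMCP.from_openapi in place of the full one, so destructive operations never become tools at all.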

Benefits of This Approach

This pattern offers several key advantages:

  • Minimal changes to existing applications – Your legacy systems continue running unchanged
  • Incremental modernization – Add AI capabilities without a complete rewrite
  • Scalable architecture – The sidecar pattern allows independent scaling and updates
  • Secure by design – Authentication and authorization remain with your existing systems
  • Cost-effective – Leverage existing investments while adding new capabilities

Next Steps

To get started with modernizing your own applications:

  1. Review the Strands Agents SDK documentation for detailed implementation guidance
  2. Explore Amazon Bedrock for available AI models and regional availability
  3. Identify applications in your organization with well-documented REST APIs
  4. Start with a proof of concept using a non-critical application
  5. Consider authentication and security requirements for production deployment
  6. Consider limiting which API endpoints the agent has access to through the FastMCP configuration options

Conclusion

By combining MCP and the Strands Agents SDK, you can breathe new life into existing applications without major architectural changes. This approach allows you to modernize gradually while preserving your investments in proven business systems.

The pattern I’ve demonstrated here provides a foundation that scales from simple proof-of-concepts to enterprise-grade deployments. As AI capabilities continue to evolve, having this integration layer in place positions your organization to quickly adopt new features and capabilities.

Remember that this is just the beginning. Once you have AI agents that can interact with your existing applications, you can explore more sophisticated workflows, multi-system integrations, and autonomous business processes that were previously impossible or prohibitively expensive to implement.