AWS for M&E Blog
Build an intelligent video streaming monitoring solution with CMCD and MCP
Are you struggling to diagnose video streaming issues across your content delivery network (CDN)? Video streaming operators often spend hours analyzing complex log data to identify and resolve playback problems that impact the user experience. We will show you how to build an intelligent monitoring solution that combines Common Media Client Data (CMCD) with the power of large language models (LLMs) using Model Context Protocol (MCP), making it quicker to analyze streaming performance using natural language queries.
Introduction
CMCD provides video streaming operators with a way to monitor streams at scale by using Amazon CloudFront real-time logs. As end users watch a video, each request from the video player to the CDN for a video chunk carries a set of data describing the end-user experience. This data typically includes information such as the encoded bitrate of the video, the viewer's current bandwidth, and the length of the playback buffer. The data is extracted from the logs in real time and surfaced to video operators through dashboards for monitoring streams.
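For context, a player attaches CMCD data to each segment request, either as a CMCD query argument or as CMCD-* request headers. The short sketch below (the CloudFront domain, segment path, and values are illustrative only) shows how key/value pairs defined in the CMCD specification, such as br (encoded bitrate in kbps), bl (buffer length in ms), and mtp (measured throughput in kbps), ride along on a request and can be parsed:

```python
# A minimal sketch of what CMCD data looks like on a segment request.
# The key names (br, bl, mtp, sid) come from the CMCD specification;
# the URL and values below are illustrative only.
from urllib.parse import urlparse, parse_qs

segment_url = (
    "https://d111111abcdef8.cloudfront.net/video/seg_42.m4s"
    "?CMCD=br%3D3200%2Cbl%3D11300%2Cmtp%3D25400%2Csid%3D%22a5b6c7d8%22"
)

# Extract the CMCD query argument and split it into key/value pairs
cmcd_blob = parse_qs(urlparse(segment_url).query)["CMCD"][0]
cmcd = dict(
    item.split("=", 1) if "=" in item else (item, "true")
    for item in cmcd_blob.split(",")
)

print(cmcd)  # {'br': '3200', 'bl': '11300', 'mtp': '25400', 'sid': '"a5b6c7d8"'}
```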
Video streaming operators rely heavily on the monitoring dashboards that visualize essential performance indicators (such as buffer ratios, startup times, and quality switches). These dashboards serve as the first line of defense, highlighting system-wide trends and potential anomalies across the streaming platform. However, when troubleshooting complex streaming issues that affect specific user segments or geographic regions, operators need to dive deeper than aggregated metrics. They need to analyze the detailed CMCD event logs that capture individual viewer experiences. This granular investigation requires sophisticated data queries to sift through millions of viewing sessions, which can be challenging to construct and optimize.
Large language models have emerged as powerful tools that operators can use to formulate complex queries using natural language, significantly streamlining the troubleshooting process. MCP takes this capability further by establishing a standardized framework for applications to communicate context to LLMs.
Solution overview
Through our CMCD MCP server implementation, we’ve created a specialized interface that manages the interaction between LLMs and streaming telemetry data. This approach eliminates the need for operators to manually provide database schemas and CMCD specifications with each query, so they can focus on resolving streaming issues.
This integration demonstrates how generative AI can transform streaming media operations, making sophisticated data analysis more accessible and efficient for streaming operators. The solution is available for download in our GitHub repository. Be sure to follow the instructions to deploy and configure the infrastructure properly.
Video is served to the end user using CloudFront, which is configured to deliver real-time logs to Amazon Kinesis Data Streams. The logs are processed in real time by an AWS Lambda function that writes the video streaming data to Amazon Timestream for InfluxDB. As a security best practice, the InfluxDB instance is deployed in a private subnet and is only accessible through a bastion host in the public subnet. Amazon Q Developer CLI (the MCP client) uses the MCP server to respond to user queries about the CMCD data.
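The following is a minimal sketch of that Lambda processing step, assuming the Kinesis record payload is a tab-separated CloudFront real-time log line that includes the request query string; the field position, measurement name, and environment variables are illustrative and may differ from the repository's implementation:

```python
# Sketch of the real-time log processing Lambda, under the assumptions noted above.
import base64
import os
from urllib.parse import parse_qs

from influxdb_client import InfluxDBClient, Point
from influxdb_client.client.write_api import SYNCHRONOUS

# Connection details for the Timestream for InfluxDB instance (placeholder env vars)
client = InfluxDBClient(
    url=os.environ["INFLUXDB_URL"],
    token=os.environ["INFLUXDB_TOKEN"],
    org=os.environ["INFLUXDB_ORG"],
)
write_api = client.write_api(write_options=SYNCHRONOUS)


def parse_cmcd(uri_query: str) -> dict:
    """Extract CMCD key/value pairs from a request query string."""
    blob = parse_qs(uri_query).get("CMCD", [""])[0]
    return dict(kv.split("=", 1) for kv in blob.split(",") if "=" in kv)


def handler(event, context):
    points = []
    for record in event["Records"]:
        # CloudFront real-time log lines arrive base64 encoded in the Kinesis record
        line = base64.b64decode(record["kinesis"]["data"]).decode("utf-8").strip()
        fields = line.split("\t")

        # The position of the query-string field depends on the log configuration
        cmcd = parse_cmcd(fields[2])
        if not cmcd:
            continue

        points.append(
            Point("cmcd_metrics")
            .tag("session_id", cmcd.get("sid", "unknown").strip('"'))
            .field("bitrate_kbps", float(cmcd.get("br", 0)))
            .field("buffer_ms", float(cmcd.get("bl", 0)))
            .field("throughput_kbps", float(cmcd.get("mtp", 0)))
        )

    if points:
        write_api.write(bucket=os.environ["INFLUXDB_BUCKET"], record=points)
    return {"records_written": len(points)}
```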
The MCP server contains a set of tools that signal its capabilities to Amazon Q Developer CLI. In our code sample, we expose a small set of tools (get_average_bitrate, get_session_details, analyze_buffer_events, and identify_playback_errors) that Amazon Q Developer uses to answer queries.
When a user asks a question about the video streams in Amazon Q Developer, it queries the available MCP servers and receives a list of tools. Amazon Q Developer then passes the prompt and the list of tools to an LLM to determine the best tool for answering the input prompt. Once the appropriate tool is identified, Amazon Q Developer calls the MCP server to execute the tool, obtains the results, and asks the LLM to summarize the answer before responding to the end user.
The MCP server signals to Amazon Q Developer about its capabilities when the server is initialized, as shown in the following code block:
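The sketch below uses the FastMCP class from the MCP Python SDK; the server name is a placeholder rather than the repository's exact value.

```python
# Minimal sketch of MCP server initialization using the MCP Python SDK (FastMCP).
# The server name is a placeholder, not necessarily the value used in the repository.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("cmcd-analytics")

if __name__ == "__main__":
    # stdio transport lets local MCP clients such as Amazon Q Developer CLI connect
    mcp.run(transport="stdio")
```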
Tools are identified by the @mcp.tool annotation in the MCP server. The annotation also carries the tool's description, which signals the tool's capability to the LLM.
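For illustration, a tool such as get_average_bitrate might be declared as shown below, continuing the server sketch above; the Flux query, bucket name, and return format are assumptions and may differ from the repository's implementation:

```python
# Hedged sketch of a tool declaration, continuing the FastMCP server sketch above.
# The Flux query, bucket name, and environment variables are illustrative.
import os
from influxdb_client import InfluxDBClient

influx = InfluxDBClient(
    url=os.environ["INFLUXDB_URL"],
    token=os.environ["INFLUXDB_TOKEN"],
    org=os.environ["INFLUXDB_ORG"],
)


@mcp.tool()
def get_average_bitrate(session_id: str) -> str:
    """Return the average encoded bitrate (kbps) reported by CMCD for a session."""
    query = f'''
        from(bucket: "cmcd")
          |> range(start: -1h)
          |> filter(fn: (r) => r._measurement == "cmcd_metrics")
          |> filter(fn: (r) => r.session_id == "{session_id}")
          |> filter(fn: (r) => r._field == "bitrate_kbps")
          |> mean()
    '''
    tables = influx.query_api().query(query)
    values = [record.get_value() for table in tables for record in table.records]
    if not values:
        return f"No CMCD data found for session {session_id}"
    return f"Average bitrate for session {session_id}: {values[0]:.0f} kbps"
```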
While this implementation uses Amazon Q Developer CLI as the primary interface for troubleshooting streaming issues, MCP’s flexibility enables broader integration possibilities. Developers can create custom applications using the Strands SDK, LangChain, or any tool that supports MCP to build specialized interfaces for their streaming analytics needs. You can also write your own MCP client for fine-grained control over how the data is consumed and which LLM is used for processing queries and calling tools. The GitHub repository includes an example of an MCP client that uses Amazon Nova foundation models to process user prompts and list the available tools.
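As a rough illustration of what a custom client involves (not the repository's Amazon Nova example), the sketch below uses the MCP Python SDK to connect to the server over stdio, list its tools, and call one of them; the server command, script name, and session id are placeholders:

```python
# Hedged sketch of a standalone MCP client using the MCP Python SDK.
# The server command and script path are placeholders; the repository's example
# additionally routes prompts through an Amazon Nova foundation model, omitted here.
import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main():
    server = StdioServerParameters(command="python", args=["cmcd_mcp_server.py"])

    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Discover the tools the CMCD MCP server exposes
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

            # Call one of the tools directly (session id is illustrative)
            result = await session.call_tool(
                "get_average_bitrate", {"session_id": "a5b6c7d8"}
            )
            print(result.content)


if __name__ == "__main__":
    asyncio.run(main())
```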
Through natural language queries, you can ask things such as “what is the average bitrate for my sessions” and “list sessions”. This can be followed by detailed analysis requests such as “analyze session <xxxx>”. The MCP server acts as a bridge between your data source and the LLM, enabling quick and intuitive CMCD data analysis. With broad client compatibility, you can use any MCP-compatible interface to query your streaming data, whether through the Q Developer CLI, Strands SDK-based applications, or custom implementations.
Conclusion
We discussed how to build and deploy an MCP server for consuming Common Media Client Data metrics. By leveraging MCP, customers can analyze CMCD data using natural language with any large language model.
In our next blog, we’ll add multi-agentic capabilities using Strands SDK and MCP servers to automatically diagnose end-user issues across multiple systems.
Contact an AWS Representative to learn how we can help accelerate your business.