Overview
This product builds a process-aware contextualization layer (a knowledge graph) based on P&IDs to enable traceability and process diagnostics. Contextualized data includes SAP PM and MM records, time-series parameters, manuals (operating manuals, maintenance manuals, SOPs, design sheets), and other sources.
Highlights
- Domain Contextualization
- Knowledge Graph
- EDA (Exploratory Data Analysis)
Details
Unlock automation with AI agent solutions

Pricing
- $30,000.00/month
Vendor refund policy
Refund Terms: Refund requests must be made within 7 days of purchase. Refunds are not applicable for custom deployments or enterprise agreements. Buyers must submit a refund request via email or support portal with proof of purchase. Refunds will be processed within 30 business days after approval. Refunds will be issued via the original payment method. For refund requests or support, please contact us at shubham.hembade@tridiagonal.ai
Legal
Vendor terms and conditions
Content disclaimer
Delivery details
CloudFormation Deployment
- Amazon Bedrock AgentCore - Preview
Container image
Containers are lightweight, portable execution environments that wrap server application software in a filesystem that includes everything it needs to run. Container applications run on supported container runtimes and orchestration services, such as Amazon Elastic Container Service (Amazon ECS) or Amazon Elastic Kubernetes Service (Amazon EKS). Both eliminate the need for you to install and operate your own container orchestration software by managing and scheduling containers on a scalable cluster of virtual machines.
Version release notes
This release introduces the Knowledge Accelerator, a data contextualization engine designed to convert raw industrial and enterprise data into an enriched Knowledge Graph. The system ingests information from SAP systems, Operating Manuals, SOPs, Root Cause Analyses (RCA), and Maintenance Manuals, enabling intelligent asset insights and semantic search capabilities.
Additional details
Usage instructions
Prerequisites: an AWS account with ECR and CloudFormation permissions, and one IAM role with VPC, EC2, S3, and Secrets Manager permissions.
Step 1: Pull the CloudFormation template from Git
Pull the CloudFormation template from Git using the repository link below: https://github.com/medt-pds/AIAgentForManufacturingDataContextualization_Cloudformation_Template.git
Copy the CloudFormation template to an S3 bucket.
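The two actions above can be sketched as follows; the bucket name and template filename are placeholders (the actual filename comes from the cloned repository):

```shell
# Clone the repository containing the CloudFormation template
git clone https://github.com/medt-pds/AIAgentForManufacturingDataContextualization_Cloudformation_Template.git
cd AIAgentForManufacturingDataContextualization_Cloudformation_Template

# Copy the template to an S3 bucket so CloudFormation can reference it by S3 URL.
# <template-file> and <your-bucket> are placeholders.
aws s3 cp <template-file>.yaml s3://<your-bucket>/cloudformation/
```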
Step 2: Deploy the CloudFormation template
Log in to the AWS console.
Go to the CloudFormation console.
Click Create stack and select With new resources (standard).
Under Specify template, choose Amazon S3 URL and enter the S3 URL of the template.
Configure stack details: enter a stack name.
Enter values for the following parameters: InstanceType, KeyName, AvailabilityZone1 (e.g. us-east-1a), AvailabilityZone2 (e.g. us-east-1b), EnvironmentName, Neo4jEC2AMIID, RegionName, LLMRegionName.
Configure rollback triggers to revert changes in case of errors.
Set stack options: tags, permissions, and stack policy.
Select both options available under the Capabilities section.
Review and deploy
Monitor stack creation.
Verify that the following resources have been provisioned: 1 VPC, 1 EC2 instance, 1 S3 bucket, and 1 Secrets Manager secret.
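As an alternative to the console steps above, the deployment can be sketched with the AWS CLI; the stack name, template URL, and parameter values below are placeholders, and only a subset of the stack parameters is shown:

```shell
# Hypothetical values: replace the stack name, template URL, and parameters with your own.
# Remaining parameters (EnvironmentName, Neo4jEC2AMIID, RegionName, LLMRegionName) omitted for brevity.
aws cloudformation create-stack \
  --stack-name data-contextualization \
  --template-url https://<your-bucket>.s3.amazonaws.com/cloudformation/<template-file>.yaml \
  --parameters ParameterKey=InstanceType,ParameterValue=t3.large \
               ParameterKey=KeyName,ParameterValue=<your-key-pair> \
               ParameterKey=AvailabilityZone1,ParameterValue=us-east-1a \
               ParameterKey=AvailabilityZone2,ParameterValue=us-east-1b \
  --capabilities CAPABILITY_IAM CAPABILITY_NAMED_IAM

# Wait for completion, then list the provisioned resources
# (expect the VPC, EC2 instance, S3 bucket, and Secrets Manager secret)
aws cloudformation wait stack-create-complete --stack-name data-contextualization
aws cloudformation describe-stack-resources --stack-name data-contextualization \
  --query 'StackResources[].[ResourceType,ResourceStatus]' --output table
```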
Step 3: S3 bucket structure for raw data
Before running the Docker image, ensure that the raw data is available in an S3 bucket following the required structure. You can download the sample S3 bucket structure from the following Git repository: https://github.com/medt-pds/AIAgentForManufacturingDataContextualization_S3_Structure.git
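Once the folder layout from the sample repository has been replicated locally and populated with your raw data, it can be uploaded with a sync; the local path and bucket name are placeholders:

```shell
# Hypothetical paths: mirror the folder structure from the sample repository,
# then sync the local raw-data directory to the deployment's S3 bucket.
aws s3 sync ./raw-data s3://<your-raw-data-bucket>/

# Verify the uploaded structure matches the required layout
aws s3 ls s3://<your-raw-data-bucket>/ --recursive
```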
Step 4: Subscribe and Pull Image
Subscribe to "AI Agent For Manufacturing Data Contextualization" on AWS Marketplace
Authenticate Docker to ECR: aws ecr get-login-password --region us-east-1 | docker login --username AWS --password-stdin <marketplace-ecr-uri>
Pull the image: docker pull 709825985650.dkr.ecr.us-east-1.amazonaws.com/domain_aware_data_contextualization_agent_repo:domain_aware_data_contextualization_agent
Step 5: Deploy Container
Run the container: docker run -e AWS_ACCESS_KEY_ID=<access_key> -e AWS_SECRET_ACCESS_KEY=<secret_access_key> -e REGION_NAME=<region_name> -p 8080:8080 --name <container_name> <image_id>
Step 6: Verify Deployment
Check that the container is running by calling the health endpoint (e.g. with Postman or curl): - Request: GET http://localhost:8080/ping - Response: {"status":"ok", "timestamp": <current time in epoch format>}
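The health check can be scripted from the command line; this sketch assumes the container is listening on localhost:8080 as configured in Step 5:

```shell
# Call the ping endpoint of the running container
RESPONSE=$(curl -s http://localhost:8080/ping)
echo "$RESPONSE"

# The documented healthy response contains "status":"ok"
echo "$RESPONSE" | grep -q '"status":"ok"' && echo "container healthy" || echo "container not responding"
```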
Check that the knowledge graph has been created in Neo4j hosted on the EC2 instance by opening the Neo4j Browser at: - http://<public-ip-of-EC2-instance>:7474
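Beyond the browser, the graph can be queried headlessly via Neo4j's HTTP API (v4+); the credentials and the default database name ("neo4j") are assumptions, so substitute the values configured for your instance:

```shell
# Count nodes in the graph to confirm it is populated.
# <password> and <public-ip-of-EC2-instance> are placeholders; "neo4j" is the
# assumed default database name.
curl -s -u neo4j:<password> \
  -H "Content-Type: application/json" \
  -d '{"statements":[{"statement":"MATCH (n) RETURN count(n) AS nodes"}]}' \
  http://<public-ip-of-EC2-instance>:7474/db/neo4j/tx/commit
```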
AWS Bedrock AgentCore:
aws bedrock-agentcore-control create-agent-runtime \
  --agent-runtime-name "DataContextualization" \
  --agent-runtime-artifact '{"containerConfiguration": {"containerUri": "<your-ecr-uri>"}}' \
  --environment-variables '{"AWS_ACCESS_KEY_ID": "", "AWS_SECRET_ACCESS_KEY": "", "REGION_NAME": "***"}'
Note: Before running the Docker image, initial configuration is required based on the type of input data. This configuration depends on the specific data sources such as SAP, Maintenance Manuals, SOPs, Operating Manuals, P&IDs, and RCAs.
Support
Vendor support
AWS infrastructure support
AWS Support is a one-on-one, fast-response support channel that is staffed 24x7x365 with experienced and technical support engineers. The service helps customers of all sizes and technical abilities to successfully utilize the products and features provided by Amazon Web Services.