Overview

LangGraph Platform is a purpose-built infrastructure and management layer for deploying and scaling long-running, stateful agents -- offering:
- 1-click deployment to go live in minutes
- 30 API endpoints for designing custom user experiences that fit any interaction pattern
- Horizontal scaling to handle bursty, long-running traffic
- A persistence layer to support memory, conversational history, and async collaboration with human-in-the-loop or multi-agent workflows (see the sketch after this list)
- Native LangGraph Studio, the agent IDE, for easy debugging, visibility, and iteration.
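For concreteness, here is a minimal sketch of the open-source LangGraph primitives that this persistence and human-in-the-loop model builds on. The state schema, node, thread ID, and in-memory checkpointer are illustrative choices, not details from this listing; a deployed service would use the platform's managed persistence rather than MemorySaver.

```python
# Minimal sketch: a LangGraph graph compiled with a checkpointer so each
# thread keeps its own state across invocations (the persistence model the
# platform's managed persistence layer builds on).
from typing import TypedDict

from langgraph.graph import StateGraph, START, END
from langgraph.checkpoint.memory import MemorySaver  # illustrative; use a managed checkpointer in production


class State(TypedDict):
    question: str
    answer: str


def answer_node(state: State) -> dict:
    # Placeholder for an LLM call; returns a partial state update.
    return {"answer": f"echo: {state['question']}"}


builder = StateGraph(State)
builder.add_node("answer", answer_node)
builder.add_edge(START, "answer")
builder.add_edge("answer", END)

# interrupt_before pauses the run before the named node so a human can
# review or edit state, then resume -- one way to wire human-in-the-loop.
graph = builder.compile(checkpointer=MemorySaver(), interrupt_before=["answer"])

config = {"configurable": {"thread_id": "demo-thread"}}
graph.invoke({"question": "What is LangGraph Platform?"}, config)
# Resume the interrupted thread; state is restored from the checkpointer.
print(graph.invoke(None, config))
```

Because the graph is compiled with interrupt_before, the first invoke pauses before the answer node; invoking again with None on the same thread_id resumes from the saved checkpoint.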
LangSmith is a unified observability & evals platform where teams can debug, test, and monitor AI app performance - whether building with LangChain or not.
Find failures fast with agent observability. Quickly debug and understand non-deterministic LLM app behavior with tracing. See what your agent is doing step by step, then fix issues to improve latency and response quality.
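As a hedged example of what that looks like in code: with the langsmith Python SDK, decorating functions with @traceable emits nested runs you can inspect step by step. The environment variable values, project name, and functions below are placeholders.

```python
# Minimal tracing sketch using the langsmith Python SDK's @traceable decorator.
# Set credentials before running (values shown are placeholders):
#   export LANGSMITH_TRACING=true
#   export LANGSMITH_API_KEY=<your-api-key>
#   export LANGSMITH_PROJECT=my-agent        # optional project name
from langsmith import traceable


@traceable(run_type="chain", name="plan_step")
def plan_step(question: str) -> str:
    # Placeholder for an LLM or tool call; each call appears as a traced run.
    return f"plan for: {question}"


@traceable(run_type="chain", name="agent")
def agent(question: str) -> str:
    # Nested @traceable calls show up as child runs, giving the
    # step-by-step view described above.
    return plan_step(question)


if __name__ == "__main__":
    print(agent("Summarize yesterday's support tickets"))
```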
Evaluate your agent's performance. Evaluate your app by saving production traces to datasets, then score performance with LLM-as-Judge evaluators. Gather human feedback from subject-matter experts to assess response relevance, correctness, harmfulness, and other criteria.
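A minimal sketch of that workflow with the langsmith SDK is shown below. The dataset name and the toy exact-match evaluator are stand-ins (in practice the evaluator would be an LLM-as-Judge grader), and it assumes the dataset has already been populated, for example from saved production traces.

```python
# Minimal evaluation sketch: run a target function over a saved dataset and
# score each output. Requires LANGSMITH_API_KEY in the environment.
from langsmith.evaluation import evaluate


def target(inputs: dict) -> dict:
    # Placeholder for the app under test (an agent, chain, or API call).
    return {"answer": f"echo: {inputs['question']}"}


def correctness(run, example) -> dict:
    # Toy evaluator; in practice this would be an LLM-as-Judge grader
    # comparing run.outputs against example.outputs.
    expected = (example.outputs or {}).get("answer", "")
    actual = (run.outputs or {}).get("answer", "")
    return {"key": "correctness", "score": float(expected == actual)}


results = evaluate(
    target,
    data="support-tickets-dataset",   # illustrative dataset name
    evaluators=[correctness],
    experiment_prefix="baseline",
)
```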
Experiment with models and prompts in the Playground, and compare outputs across different prompt versions. Any teammate can use the Prompt Canvas UI to directly recommend and improve prompts.
Track business-critical metrics like costs, latency, and response quality with live dashboards, then get alerted when problems arise and drill into root cause.
Highlights
- Please note: there is a minimum $50k annual usage commitment to access this package. To discuss enterprise pricing, or to activate your commitment and obtain your license key after signup, please contact us at https://www.langchain.com/contact-sales. Alternatively, our self-serve, cloud-based products are available at https://www.langchain.com
- Accelerate agent development. Build, debug, and iterate visually with LangGraph Studio, the agent IDE. With AI Observability, you can see topics, response quality, and agent trajectory with little setup.
- Evaluate your agent's performance. Save production traces to datasets, score them with LLM-as-Judge evaluators, and gather human feedback from subject-matter experts on response relevance, correctness, harmfulness, and other criteria.
Details
Unlock automation with AI agent solutions

Pricing
| Dimension | Description | Cost/unit |
| --- | --- | --- |
| Unit for LangSmith | Unit for LangSmith | $0.005 |
| Unit for LangGraph Platform | Unit for LangGraph Platform | $0.001 |
| Metered Usage Amount | Metered Usage Amount | $0.01 |
| Minimum Annual Commitment | Minimum annual usage commitment, billed in advance | $50,000.00 |
Vendor refund policy
Fees paid are non-refundable. See the LangChain Terms of Service: https://www.langchain.com/terms-of-service#:~:text=Customer%20will%20pay%20LangChain%20all,Fees%20paid%20are%20non%2Drefundable
Legal
Vendor terms and conditions
Content disclaimer
Delivery details
LangSmith Helm Deployment
- Amazon EKS
Helm chart
Helm charts are Kubernetes YAML manifests combined into a single package that can be installed on Kubernetes clusters. The containerized application is deployed on a cluster by running a single Helm install command to install the seller-provided Helm chart.
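As a hedged sketch of what that single-command install typically looks like on an Amazon EKS cluster (the chart repository URL, chart name, release name, namespace, and values file below are assumptions to confirm against the vendor's usage instructions and license key setup):

```bash
# Illustrative Helm install flow on an existing Amazon EKS cluster.
# Repository URL, chart name, release name, namespace, and values file are
# placeholders -- confirm them against the vendor's usage instructions.
helm repo add langchain https://charts.smith.langchain.com   # assumed repo URL
helm repo update

# Install (or upgrade) the chart with your license key and configuration.
helm upgrade --install langsmith langchain/langsmith \
  --namespace langsmith --create-namespace \
  --values langsmith_config.yaml

# Check that the release's pods come up.
kubectl get pods --namespace langsmith
```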
Version release notes
This release brings alerting, UI-driven experiment workflows, end-to-end OpenTelemetry support, and a host of new capabilities, alongside several bug fixes. Under the hood, it also rolls out a beta Self-Hosted LangGraph Cloud Control Plane and full-text search, plus query-performance and ingestion optimizations.
Additional details
Usage instructions
To use your instance, follow these instructions:
Resources
Vendor resources
Support
Vendor support
AWS infrastructure support
AWS Support is a one-on-one, fast-response support channel that is staffed 24x7x365 with experienced technical support engineers. The service helps customers of all sizes and technical abilities to successfully utilize the products and features provided by Amazon Web Services.