
Overview

Product video
Comet's machine learning platform integrates with your existing infrastructure and tools so you can manage, visualize, and optimize models, from training runs to production monitoring.
Add two lines of code to your notebook or script and automatically start tracking code, hyperparameters, metrics, and more, so you can compare and reproduce training runs.
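For example, the quick-start pattern looks roughly like this, as a minimal sketch: `my-project` is a placeholder, and the API key is assumed to be set in the `COMET_API_KEY` environment variable.

```python
# Minimal sketch of the two-line setup; "my-project" is a placeholder project name.
from comet_ml import Experiment

# Reads the API key from the COMET_API_KEY environment variable and begins
# auto-logging code, installed packages, and hardware metrics for this run.
experiment = Experiment(project_name="my-project")
```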
Comet helps ML teams:
- Track and share training run results in real time.
- Build their own tailored, interactive visualizations.
- Track and version datasets and artifacts.
- Manage their models and trigger deployments.
- Monitor their models in production.
Comet's platform supports some of the world's most innovative enterprise teams deploying deep learning at scale and is used by ML teams at Uber, Zappos, Shopify, Affirm, Etsy, Ancestry.com and ML leaders across all industries.
For custom pricing, MSA, or a private contract, please contact AWS-Marketplace@comet.com for a private offer.
Highlights
- Track and share training run results in real time: Comet's ML platform gives you visibility into training runs and models so you can iterate faster.
- Manage your models and trigger deployments: Comet Model Registry allows you to keep track of your models ready for deployment. Thanks to the tight integration with Comet Experiment Management, you will have full lineage from training to production.
- Monitor your models in production: The performance of models deployed to production degrades over time, due to drift or data quality issues. Use Comet's machine learning platform to identify drift and track accuracy metrics using baselines automatically pulled from training runs. (A brief code sketch follows this list.)
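As a rough, hedged illustration of how these highlights connect in code: the project name, file path, and metric values below are hypothetical, and `register_model` is assumed here to promote a previously logged model to the Model Registry.

```python
# Hedged sketch: log a training run, attach the model file, and promote it
# to the Model Registry so monitoring can use the run's metrics as baselines.
from comet_ml import Experiment

experiment = Experiment(project_name="fraud-detection")  # hypothetical project

experiment.log_parameters({"lr": 1e-3, "epochs": 10})
for epoch in range(10):
    val_accuracy = 0.80 + 0.01 * epoch  # stand-in for a real validation score
    experiment.log_metric("val_accuracy", val_accuracy, epoch=epoch)

experiment.log_model("fraud-model", "./model.pkl")  # "./model.pkl" is a hypothetical path
experiment.register_model("fraud-model")            # assumed registry promotion call
experiment.end()
```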
Details
Pricing
| Dimension | Description | Cost/12 months |
|---|---|---|
| Advanced Package | Experiment Management, Model Registry, Monitoring | $4,500.00 |
Vendor refund policy
Non-Refundable. Unless otherwise expressly provided for in this agreement or the applicable Order Form, (i) all fees are based on services purchased and not on actual use; and (ii) all fees paid under this agreement are non-refundable.
Legal
Vendor terms and conditions
Content disclaimer
Delivery details
Software as a Service (SaaS)
SaaS delivers cloud-based software applications directly to customers over the internet. You can access these applications through a subscription model. You will pay recurring monthly usage fees through your AWS bill, while AWS handles deployment and infrastructure management, ensuring scalability, reliability, and seamless integration with other AWS services.
Resources
Vendor resources
Support
AWS infrastructure support
AWS Support is a one-on-one, fast-response support channel that is staffed 24x7x365 with experienced technical support engineers. The service helps customers of all sizes and technical abilities to successfully utilize the products and features provided by Amazon Web Services.

Standard contract
Customer reviews
Streamlined experiment tracking has improved collaboration and accelerated data workflows
What is our primary use case?
In my current organization, we are using Comet for monitoring and automation purposes. We use Comet to monitor our data pipelines and automated workflows in real time, and it alerts us when a scheduled job fails or when performance drops below a threshold. This allows the team to quickly investigate the logs, identify the root cause, and trigger corrective actions without manual intervention.
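The logging side of such a setup might look roughly like the sketch below; the pipeline function, project name, and metric names are hypothetical, and alerting on threshold breaches is assumed to be configured on top of the reported metrics.

```python
# Hedged sketch: each scheduled pipeline run reports its outcome and latency
# to Comet; threshold-based alerting is assumed to be configured separately.
import time

from comet_ml import Experiment

def run_pipeline() -> bool:
    """Stand-in for the real data pipeline; returns True on success."""
    return True

experiment = Experiment(project_name="pipeline-monitoring")  # hypothetical project
start = time.time()
succeeded = run_pipeline()
experiment.log_metric("pipeline_success", 1 if succeeded else 0)
experiment.log_metric("pipeline_latency_seconds", time.time() - start)
experiment.end()
```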
How has it helped my organization?
Comet has had a very positive impact on my organization, mainly by bringing structure, visibility, and consistency to our workflow. The key improvement I have seen so far is faster experiment cycles: the team spends less time tracking results manually and more time improving models, which has significantly reduced iteration time. Reproducibility and reliability have also improved, because every experiment is well documented, making it easy to reproduce results and avoid machine-specific issues. The shared dashboard and experiment history give everyone on my team a single source of truth, reducing back-and-forth communication and misalignment between teams. Additionally, visual comparisons and tracked metrics help my team and me confidently choose which models or approaches to move forward with.
What is most valuable?
The Comet feature I use most is experiment tracking: metrics, code versions, artifacts, and outputs are captured automatically, which makes it easy to compare runs and reproduce results. The second feature I appreciate most is the visual dashboards and charts: they provide interactive charts and graphs to visualize training curves, metrics over time, and parameter effects, and they help us quickly spot trends, anomalies, or performance regressions.
Another valuable feature is the collaboration and sharing capability. The team can share experiments, dashboards, and results with links or permissions, which encourages transparency and faster iterations for our product.
Experiment tracking is central to our workflow because it automatically captures parameters, metrics, code versions, and outputs for everyone. This makes it easy to compare experiments, understand why one model performed better than another, and reproduce results without manual logging. It also helps catch errors and saves a lot of time when iterating quickly or handling handoffs between team members. Visual dashboards and collaboration are valuable, but experiment tracking is the foundation that everything else builds on.
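That automatic capture might look roughly like this sketch; the project name, parameter values, and dataset path are hypothetical.

```python
# Hedged sketch: parameters, metrics, and a versioned dataset artifact logged
# against one run; code version and environment are captured automatically.
from comet_ml import Artifact, Experiment

experiment = Experiment(project_name="demo")  # hypothetical project

experiment.log_parameters({"batch_size": 64, "optimizer": "adam"})
experiment.log_metric("train_loss", 0.42, step=100)

artifact = Artifact("training-data", artifact_type="dataset")
artifact.add("./data/train.csv")  # hypothetical local file
experiment.log_artifact(artifact)
experiment.end()
```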
One small but really helpful thing is how easy it is to add context to experiments. Features such as tags, notes, and metadata might seem minor, but they make a significant difference over time. Being able to tag runs or add quick notes about why a change was made helps me tremendously when I revisit experiments weeks or months later. This also makes onboarding new team members much smoother.
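Adding that context is roughly a one-liner per item; a minimal sketch with made-up tag and note values:

```python
# Minimal sketch: a tag for filtering, plus a free-form note stored as metadata.
from comet_ml import Experiment

experiment = Experiment(project_name="demo")  # hypothetical project
experiment.add_tag("baseline")                       # filterable tag
experiment.log_other("note", "retry with lower LR")  # quick context for later
experiment.end()
```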
What needs improvement?
Overall, Comet has been very strong for us, though a few areas could make it even better. The first is simpler onboarding for new users. While it is a powerful tool, some advanced features, such as custom logging and alerts, have a learning curve, and more guided walkthroughs or product tips would help users become productive faster. Additionally, pre-built dashboards, alert rules, or experiment templates for common use cases would reduce setup time, especially for smaller teams.
There is also room for improvement in large-scale experiment navigation: as the number of experiments grows, filtering and organizing runs could be even more powerful, especially for long-running projects. More customization in visualizations would also help; the dashboards are very useful, but additional fine-grained options would make it easier to tailor views for different stakeholders.
For how long have I used the solution?
I have been working in my current field for the last five years.
What do I think about the stability of the solution?
In our experience, Comet has been very stable and reliable. We have not faced any significant downtime that impacted our workflow. The platform handles concurrent experiments with large data volumes without major issues. Logging, dashboards, and artifact storage have been consistently responsive, even during high-load periods. Occasionally, retrieving very large experiment histories or artifacts can take a few extra seconds, but this has not affected productivity or caused failures for us. When minor issues do arise, Comet's support team responds promptly, which helps maintain reliability.
What do I think about the scalability of the solution?
Comet's scalability has been very effective for our organization. We can run hundreds of experiments concurrently without noticeable slowdowns, and it handles large datasets and models well. As team members onboard, project-level permissions, shared dashboards, and collaboration features scale effectively, keeping everyone aligned. Comet's cloud-based architecture scales automatically with use, so we have not needed to worry about provisioning or capacity limits.
How are customer service and support?
In my experience with Comet customer support, they have been positive and helpful. When we have reached out with questions or issues, the support team has responded in a timely manner. They have been effective at guiding us through troubleshooting, especially for configuration questions or edge case behaviors. The support representatives seem knowledgeable about the product and able to provide actionable guidance rather than generic responses. In a few cases, support helped clarify gaps in the documentation and even pointed us to resources we had not discovered.
How would you rate customer service and support?
Positive
Which solution did I use previously and why did I switch?
Before Comet, we were using a combination of manual spreadsheets, local logging scripts, and some basic experiment tracking tools. We switched to Comet because our previous setup scattered metrics, artifacts, and code across multiple tools and folders, making reproducibility and collaboration difficult. It was hard to compare experiments or track progress across the team, leading to slower iteration cycles. Logging results, visualizing metrics, and sharing updates consumed a lot of our time, and in our experience, sharing insights with teammates and onboarding new members was cumbersome and error-prone.
What was our ROI?
We have definitely seen a return on investment from using Comet, in both tangible and intangible ways. Automating experiment tracking, logging, and reporting has freed up 30 to 40 percent of the time that would otherwise be spent on manual documentation and comparisons. Teams can converge on optimal models more quickly, reducing overall project timelines by roughly 20 to 25 percent. Additionally, shared dashboards and notes reduce miscommunication, leading to smoother handoffs and less duplicated work. It also gives us decision confidence: data-driven insights from Comet allow us to make faster, more reliable decisions on model selection and deployment, which indirectly impacts project success and revenue.
Which other solutions did I evaluate?
Before choosing Comet, we evaluated a few other options in the market, including W&B, which is popular for experiment tracking and collaboration with strong visualization features. We also considered MLflow, an open-source platform for tracking experiments, models, and deployments that is flexible but requires more setup. We also experimented with Neptune.ai, which focuses on experiment logging and team collaboration and is lightweight and easy to use. We chose Comet because it offered a good balance between ease of use and advanced features, including experiment tracking, dashboards, and artifact management. It also has strong collaboration and access-control capabilities for teams and workflows. Additionally, it integrates reliably with our existing tech stack and major machine learning frameworks, which made it the preferred choice over the others.
What other advice do I have?
Collaboration in Comet is one of the strongest aspects for my team. Everyone on my team can access the same experiment dashboards and visualizations, providing a single source of truth. Team members can leave context or explanations directly on runs, which helps avoid miscommunication and preserves knowledge over time. Tagging experiments makes it easier for multiple people to filter and find relevant runs quickly. Role-based permissions allow us to control who can add experiments versus who can only view them, which keeps collaboration secure.
Comet's documentation and learning resources are quite helpful and generally well organized. The step-by-step guides and quick-start tutorials made onboarding straightforward, especially for integrating with popular machine learning frameworks. The detailed documentation for the Python SDK, REST APIs, and CLI makes it easier to implement custom logging, metrics, and artifact tracking. The knowledge base and community forums provide practical solutions for common issues, which helps reduce downtime.
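Programmatic access through the Python SDK's API wrapper might look roughly like this; the workspace and project names are placeholders.

```python
# Hedged sketch: query past experiments through the Python API wrapper.
from comet_ml.api import API

api = API()  # reads COMET_API_KEY from the environment
for exp in api.get_experiments("my-workspace", project_name="my-project"):
    print(exp.name, exp.get_metrics_summary())  # summary of logged metrics
```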
Comet handles version control for models, code, and data in a way that is very useful for my team. Every experiment run captures the code version, dataset version, and model checkpoints automatically, making it easy to reproduce results later. Models, datasets, and other artifacts are stored with clear lineage, so we can trace exactly which inputs produced a given output. We can compare different versions of models or experiments and, if required, roll back to previous stable versions without confusion. Additionally, Comet can link to Git commits, making code tracking seamless alongside experiments.
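Pulling a specific artifact version into a new run, so that run's lineage records exactly which dataset was used, might look like this sketch; the artifact name and version are hypothetical.

```python
# Hedged sketch: fetch a pinned dataset version; the retrieval itself is
# recorded in this run's lineage.
from comet_ml import Experiment

experiment = Experiment(project_name="demo")  # hypothetical project
logged = experiment.get_artifact("training-data", version_or_alias="2.0.0")
logged.download("./data")  # materialize that exact version locally
experiment.end()
```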
The advice I would offer to others looking into Comet is to start with the basics and then expand. Organizing experiments with tags, metadata, and inline notes from the start saves a lot of time and makes collaboration much smoother. Connect Comet with existing code repositories, CI/CD pipelines, and collaboration tools to get the most value, and use the shared dashboards so the whole team benefits from each other's results. If you expect many experiments or large datasets, structure projects and metadata thoughtfully to maintain performance and organization as usage grows. I would rate this product an 8 out of 10.
Experiment and asset tracking enhance model development and ease of on-prem maintenance
What is our primary use case?
I use Comet for experiment and asset tracking during model development, as well as to support model reproducibility and transparency. I also appreciate the ability to perform an on-prem installation without the need to maintain the installation.
How has it helped my organization?
Previously, we had an on-prem installation that required frequent re-deployment due to internal security standards, which could cause downtime during model development. Using Comet within SageMaker streamlined the deployment process to require zero maintenance and also simplified billing.
What is most valuable?
Model metric tracking and comparison has been extremely beneficial. Comet's customer service has also been excellent. Any issue we've had, they have been able to help us resolve.
What needs improvement?
SageMaker itself has a cumbersome interface, which makes launching Comet somewhat of a hassle.
For how long have I used the solution?
I have used the solution for 3 months.