
    Apache Kafka® & Apache Flink® on Confluent Cloud™ - Annual Commits

    Sold by: Confluent 
    Deployed on AWS
    Empower your business with our real-time data streaming platform, driven by Apache Kafka and Flink. Engineered for elastic scaling and robust security, it simplifies AI and brings stream processing to workloads everywhere. Achieve smarter decisions with unmatched flexibility and speed, ensuring you stay ahead in an increasingly data-driven world. Confluent Cloud is a resilient, scalable data streaming service based on Apache Kafka, delivered as a fully managed service. Set your data in motion and enhance your business infrastructure with Confluent on Amazon Web Services (AWS).

    Overview

    Confluent is a cloud-native data streaming platform that enables you to stream, connect, process, and govern all of your data, allowing you to streamline AI workloads and accelerate data-driven decisions.

    Confluent has re-architected Apache Kafka for AWS, offering a fully managed, elastic, and globally available service that seamlessly connects with AWS services, security, management, and billing, while allowing you to cost-effectively retain data at any scale with Infinite Storage. With Kora, the cloud-native Kafka engine, you can achieve 10x better performance at 60% lower cost, along with unmatched scalability and reliability.
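
    For illustration, the sketch below shows how an application might publish events to a fully managed Kafka cluster on Confluent Cloud using the open-source confluent-kafka Python client; the bootstrap endpoint, API key and secret, and topic name are placeholders rather than values from this listing.

        # Minimal sketch: producing to a Confluent Cloud Kafka topic with the
        # open-source confluent-kafka Python client. Endpoint, credentials, and
        # topic are placeholders -- substitute your own cluster details.
        from confluent_kafka import Producer

        producer = Producer({
            "bootstrap.servers": "<bootstrap-endpoint>:9092",  # from your cluster settings
            "security.protocol": "SASL_SSL",
            "sasl.mechanisms": "PLAIN",
            "sasl.username": "<api-key>",
            "sasl.password": "<api-secret>",
        })

        # Send one JSON event; delivery is confirmed asynchronously via on_delivery.
        producer.produce(
            "clickstream",  # hypothetical topic name
            key="user-42",
            value='{"page": "/checkout"}',
            on_delivery=lambda err, msg: print(err or "delivered to %s [%d]" % (msg.topic(), msg.partition())),
        )
        producer.flush()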

    Our serverless Apache Flink service lets you easily leverage stream processing to filter, join, aggregate, and enrich your streams in real time before sharing the data broadly with downstream systems and applications.
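
    As a rough illustration of that filter/aggregate style, the sketch below uses open-source PyFlink (Apache Flink's Python Table API) against a hypothetical orders topic and schema; on Confluent Cloud, the equivalent Flink SQL can be run directly against your topics without managing any Flink infrastructure.

        # Minimal sketch using open-source PyFlink: filter a Kafka-backed stream
        # and aggregate it in one-minute windows. Topic, schema, and broker
        # address are hypothetical; the Flink Kafka SQL connector jar must be on
        # the classpath for this to run locally.
        from pyflink.table import EnvironmentSettings, TableEnvironment

        t_env = TableEnvironment.create(EnvironmentSettings.in_streaming_mode())

        # Register a Kafka-backed table for the (hypothetical) "orders" stream.
        t_env.execute_sql("""
            CREATE TABLE orders (
                order_id STRING,
                amount   DOUBLE,
                currency STRING,
                ts       TIMESTAMP(3),
                WATERMARK FOR ts AS ts - INTERVAL '5' SECOND
            ) WITH (
                'connector' = 'kafka',
                'topic' = 'orders',
                'properties.bootstrap.servers' = 'localhost:9092',
                'format' = 'json',
                'scan.startup.mode' = 'earliest-offset'
            )
        """)

        # Keep only positive amounts and compute per-currency revenue per minute.
        revenue = t_env.sql_query("""
            SELECT currency,
                   TUMBLE_START(ts, INTERVAL '1' MINUTE) AS window_start,
                   SUM(amount) AS revenue
            FROM orders
            WHERE amount > 0
            GROUP BY currency, TUMBLE(ts, INTERVAL '1' MINUTE)
        """)
        revenue.execute().print()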

    We make it easy for you to connect and migrate on-premises, multi-cloud, and edge data to AWS. With 120+ pre-built connectors, you can build real-time streaming pipelines with just a few clicks to Amazon S3, Amazon Redshift, Amazon Kinesis, Amazon Relational Database Service (RDS), Amazon DynamoDB, and more. Deeper integrations with our Connect with Confluent (CwC) partners, such as AWS Lambda, can also be found directly within the AWS console, making it easier for customers to use data streams.

    Commits allow you to purchase Confluent Cloud via annual commitments with usage discounts. If you are working with a Confluent sales representative, please contact them prior to placing an order on this page. There will be no refunds once an order is placed. To request a custom private quote, please contact awsteam@confluent.io. Go to our Pay As You Go page to get started without commitments.

    Highlights

    • Cloud-Native: Confluent's cloud-native, complete data streaming platform, powered by the Kora engine, is re-architected for the cloud, delivering elastic, resilient, cost-efficient, and performant event streaming capabilities.
    • Complete: Go above and beyond Kafka to build real-time apps quickly, reliably, and securely with pre-built and fully managed connectors, stream governance, serverless stream processing with Apache Flink, built-in management & monitoring, and enterprise-grade security.
    • Everywhere: Connect and migrate on-prem, multicloud, and edge data to AWS, and seamlessly link everything together in real time to create a consistent data layer across your business with Cluster Linking.

    Details

    Delivery method

    Deployed on AWS

    Features and programs

    Trust Center

    Access real-time vendor security and compliance information through Confluent's Trust Center, powered by Drata. Review certifications and security standards before purchase.

    Buyer guide

    Gain valuable insights from real users who purchased this product, powered by PeerSpot.

    Financing for AWS Marketplace purchases

    AWS Marketplace now accepts line of credit payments through the PNC Vendor Finance program. This program is available to select AWS customers in the US, excluding NV, NC, ND, TN, & VT.

    AWS PrivateLink

    Get next-level security. Connect VPCs and AWS services without exposing data to the internet.

    Vendor Insights

    Skip the manual risk assessment. Get verified and regularly updated security info on this product with Vendor Insights.
    Security credentials achieved: 6

    Pricing

    Apache Kafka® & Apache Flink® on Confluent Cloud™ - Annual Commits

    Pricing is based on the duration and terms of your contract with the vendor, and additional usage. You pay upfront or in installments according to your contract terms with the vendor. This entitles you to a specified quantity of use for the contract duration. Usage-based pricing is in effect for overages or additional usage not covered in the contract. These charges are applied on top of the contract price. If you choose not to renew or replace your contract before the contract end date, access to your entitlements will expire.
    Additional AWS infrastructure costs may apply. Use the AWS Pricing Calculator to estimate your infrastructure costs.

    12-month contract (1)

    Dimension    Description                  Cost/12 months
    Commit       Your Total Contract Value    $25,000.00

    Additional usage costs (1)

    The following dimensions are not included in the contract terms and will be charged based on your usage.

    Dimension                     Cost/unit
    Confluent Consumption Unit    $0.01

    AI Insights

    Dimensions summary

    Confluent's pricing on AWS Marketplace is structured around two key components: a committed spend through a 12-month contract, and usage-based billing in Confluent Consumption Units (CCUs). The Commit dimension represents the total contract value a customer agrees to spend over the contract period. A CCU represents a translation of total dollar spend, net of discounts, on various Confluent Cloud resources such as CKUs, throughput, and storage, as well as other services like connector tasks, CFUs, and schema charges. The total dollar spend net of discounts is converted to CCUs using the rate of $0.01 per CCU listed on this page. Once the committed amount is fully consumed, any additional usage is converted and reported to the AWS account in CCUs.
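
    For illustration, assuming a hypothetical $30,000 of annual consumption against the $25,000 commitment shown on this page, the conversion works as sketched below (only the $0.01-per-CCU rate comes from this listing).

        # Illustrative only: translating Confluent Cloud spend into CCUs at the
        # listed rate of $0.01 per CCU. The consumption figure is hypothetical.
        CCU_RATE_USD = 0.01  # listed price per Confluent Consumption Unit

        def spend_to_ccus(spend_net_of_discounts_usd: float) -> float:
            """Convert dollar spend (net of discounts) into CCUs."""
            return spend_net_of_discounts_usd / CCU_RATE_USD

        # $30,000 of consumption against a $25,000 commit: the first $25,000
        # draws down the commitment, and the remaining $5,000 of overage is
        # reported to the AWS account as CCUs.
        overage_usd = 30_000 - 25_000
        print(spend_to_ccus(overage_usd))  # 500000.0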

    Top-of-mind questions for buyers like you

    What is a Confluent Consumption Unit (CCU) and how is it measured?
    A CCU represents a translation of total dollar spend, net of discounts, on various Confluent Cloud resources such as CKUs, throughput, and storage, as well as other services like connector tasks, CFUs, and schema charges. The total dollar spend net of discounts is converted to CCUs using the rate of $0.01 per CCU listed on this page.
    How does the 12-month commit pricing work with Confluent?
    The 12-month commit represents a predetermined spending amount that customers agree to over the contract period. This commitment provides access to Confluent's full platform capabilities, with the flexibility to consume resources up to the committed amount, while any usage beyond the commitment is billed at the standard CCU rate.
    Are there any additional costs to consider beyond the commit and overages?
    All Confluent Cloud-related charges are billed in CCUs, including the commitment amount and any overages beyond it. Confluent's charges include both Confluent's software charges and the underlying AWS resources used to run the service. However, customers may incur additional AWS infrastructure costs for services or resources that they run in relation to Confluent; these charges will appear on their AWS Cost and Usage Report.

    Vendor refund policy

    Please contact us at awsteam@confluent.io.

    Custom pricing options

    Request a private offer to receive a custom quote.

    How can we make this page better?

    We'd like to hear your feedback and ideas on how to improve this page.

    Legal

    Vendor terms and conditions

    Upon subscribing to this product, you must acknowledge and agree to the terms and conditions outlined in the vendor's End User License Agreement (EULA).

    Content disclaimer

    Vendors are responsible for their product descriptions and other product content. AWS does not warrant that vendors' product descriptions or other product content are accurate, complete, reliable, current, or error-free.

    Usage information

    Delivery details

    Software as a Service (SaaS)

    SaaS delivers cloud-based software applications directly to customers over the internet. You can access these applications through a subscription model. You will pay recurring monthly usage fees through your AWS bill, while AWS handles deployment and infrastructure management, ensuring scalability, reliability, and seamless integration with other AWS services.

    Support

    Vendor support

    To learn more about our support offerings, please visit confluent.io/confluent-cloud/support/. A paid support plan provides technical assistance from the foremost Apache Kafka experts, with over 1 million hours of expertise. Support plans can be added to your subscription directly from the Confluent Cloud web UI, and the support portal is accessible within the Confluent Cloud web UI.

    AWS infrastructure support

    AWS Support is a one-on-one, fast-response support channel that is staffed 24x7x365 with experienced and technical support engineers. The service helps customers of all sizes and technical abilities to successfully utilize the products and features provided by Amazon Web Services.

    Product comparison

    Updated weekly

    Accolades

    Top 10 in Data Integration, Streaming solutions, Analytic Platforms
    Top 10 in Storage, Streaming solutions
    Top 10 in Streaming solutions, Data Warehouses

    Overview

    AI generated from product descriptions
    Data Streaming Platform: Cloud-native data streaming platform powered by the Kora engine for event streaming capabilities
    Stream Processing: Serverless Apache Flink service for real-time stream filtering, joining, aggregation, and data enrichment
    Data Connectivity: Supports 120+ pre-built connectors for seamless integration with AWS services like S3, Redshift, Kinesis, RDS, and DynamoDB
    Cloud Architecture: Fully managed, elastic, and globally available service with enhanced performance and scalability
    Multi-Environment Integration: Enables connection and migration of on-premises, multi-cloud, and edge data with a consistent data layer across environments
    Data Streaming Compatibility: Fully compatible with Kafka APIs, supporting extensive pre-built connectors for real-time data streaming
    Cluster Management: Automated cluster operations including zero-downtime upgrades, data balancing, and partition management
    Connector Ecosystem: Built-in fully managed connectors supporting integration with popular data systems like Snowflake, MongoDB, S3, MySQL, and PostgreSQL
    Performance Optimization: Designed to maximize hardware performance potential, delivering high-throughput and low-latency streaming capabilities
    Storage Management: Tiered storage architecture automatically migrates data from brokers to object storage for efficient long-term data retention
    Multi-Technology Support: Comprehensive platform supporting multiple open-source data technologies including PostgreSQL, Apache Kafka, OpenSearch, Apache Flink, Cassandra, ClickHouse, MySQL, and Redis
    High Availability Infrastructure: Highly available, self-healing platform with 99.99% uptime SLA and near-zero downtime during scaling and upgrading
    Security Compliance: Supports multiple security standards including ISO 27001, SOC 2, PCI-DSS, GDPR, and HIPAA compliance
    Data Protection Mechanism: Data encrypted in transit and at rest with Virtual Private Cloud (VPC) peering capabilities
    Infrastructure Management: Supports infrastructure-as-code tools like Terraform and provides deployment through console, CLI, REST API, and provider interfaces

    Security credentials

    Validated by AWS Marketplace
    FedRAMP
    GDPR
    HIPAA
    ISO/IEC 27001
    PCI DSS
    SOC 2 Type 2

    Contract

    Standard contract: No

    Customer reviews

    Ratings and reviews

    4 out of 5 stars, 3 ratings (5 star: 0%, 4 star: 100%, 3 star: 0%, 2 star: 0%, 1 star: 0%)
    3 AWS reviews | 125 external reviews
    Star ratings include only reviews from verified AWS customers. External reviews can also include a star rating, but star ratings from external reviews are not averaged in with the AWS customer star ratings.
    reviewer2711799

    Enhancement of message distribution and security through diverse connection support

    Reviewed on May 28, 2025
    Review from a verified AWS customer

    What is our primary use case?

    My main use case for Apache Kafka is a big project where Apache Kafka is one of the main components for sending and receiving messages.

    What is most valuable?

    I appreciate that Apache Kafka is fast and secure thanks to implementing it with AWS, allowing me to secure it at a high level. It's fast with a good connection, and the different types of connections are a good thing for me, helping our team that uses it, which is very helpful.

    The impact of Apache Kafka's scalability features on my organization and data processing capabilities depends on how many messages each company wants to receive. With a high throughput, it helps to have more brokers and partitions. If you are a company that doesn't need that many messages, I won't say it will help you a lot, but on the other hand, it can change significantly.

    What needs improvement?

    I don't actually think about anything they could improve about Apache Kafka, as our use cases using it are more or less on the basic level, so I didn't think about any kind of improvement.

    As a personal preference, since we use Managed Kafka in AWS, I would appreciate having some kind of UI integrated into Apache Kafka for connecting to it, because using code to connect to it is basic, but we could use a UI.

    For how long have I used the solution?

    I have been using Apache Kafka for about six months.

    What do I think about the stability of the solution?

    I use Apache Kafka's topic partitioning feature for system stability.

    This feature of Apache Kafka has helped enhance our system stability when handling high volume data because we have thousands of messages in a small amount of time, so partitioning helps us distribute all the messages that we receive between all partitions, which helps us to be stable.

    What was our ROI?

    I have seen some ROI from Apache Kafka, although I can't recall specifics.

    Which other solutions did I evaluate?

    I am saying that Apache Kafka has better security than other options, even though I don't know about them because we didn't explore them. We simply knew that Apache Kafka is the base where you should use it, so we went with that.

    What other advice do I have?

    At this point, I don't have any specific examples to share. I don't actually remember if I have used Apache Kafka Connect for integrating various data sources and sinks within my organization. I currently don't have any examples of how it has benefited my organization.

    On a scale of one to ten, I would rate Apache Kafka an eight.

    Which deployment model are you using for this solution?

    Public Cloud

    If public cloud, private cloud, or hybrid cloud, which cloud provider do you use?

    Amazon Web Services (AWS)
    NakulBali

    Effective real-time data streaming but benefits from improved user interface

    Reviewed on Mar 31, 2025
    Review provided by PeerSpot

    What is our primary use case?

    We used Apache Kafka to receive information in streams for Telco projects. As customers bought airtime, we received that information via Apache Kafka streams.

    What is most valuable?

    There is a thin line between a normal queuing system, such as ActiveMQ or RabbitMQ, and a streaming system like Apache Kafka. Apache Kafka is effective when dealing with large volumes of data flowing at high speeds, requiring real-time processing. It was useful for us in receiving constant recharge information for customers. Queuing systems, however, excel at providing acknowledgment of messages, which is not a feature of streaming systems.

    What needs improvement?

    I haven't explored its features extensively enough to suggest improvements. A more user-friendly interface and better management consoles with improved documentation could be beneficial.

    For how long have I used the solution?

    We used Apache Kafka for about a year. The project is on hold now, so we are not actively using it.

    What was my experience with deployment of the solution?

    We did not face significant challenges while integrating Apache Kafka with our existing systems. The implementation was straightforward.

    How are customer service and support?

    I have never had to use the technical support as Apache Kafka is a very open-source system. There is plenty of community support available online.

    How would you rate customer service and support?

    Which solution did I use previously and why did I switch?

    We switched from using API gateways to an in-house integration platform. However, before Apache Kafka, we used RabbitMQ.

    How was the initial setup?

    The initial setup took approximately a couple of weeks. The system installation was already existing, and we just had to tie into a specific topic.

    What was our ROI?

    We haven't seen a return on investment with Apache Kafka. It's used for a specific use case rather than cost reduction.

    What's my experience with pricing, setup cost, and licensing?

    Its pricing is reasonable. It's not always about cost, but about meeting specific needs.

    Which other solutions did I evaluate?

    RabbitMQ is not a direct competitor to Apache Kafka, as both have their specific domains and use cases.

    What other advice do I have?

    It is crucial to understand your use case before deciding on a solution. While I can replace RabbitMQ with Apache Kafka, acknowledgment of messages in queuing systems makes them preferable for certain applications like building a payment gateway system. I would rate Apache Kafka seven or eight out of ten.

    If public cloud, private cloud, or hybrid cloud, which cloud provider do you use?

    Other
    Snehasish Das

    Data streaming transforms real-time data movement with impressive scalability

    Reviewed on Dec 26, 2024
    Review provided by PeerSpot

    What is our primary use case?

    I worked with Apache Kafka for customers in the financial industry and OTT platforms. They use Kafka particularly for data streaming. Companies offering movie and entertainment as a service, similar to Netflix, use Kafka.

    What is most valuable?

    Apache Kafka offers unique data streaming. It allows the use of data in motion, enabling data to propagate from one source to another while it is in motion. This is valuable when data is not simply residing in a database.

    What needs improvement?

    In the data sharing space, the performance of Apache Kafka could be improved. The performance angle is critical, and while it works in milliseconds, the goal is to move towards microseconds.

    For how long have I used the solution?

    I started working with Kafka about five years ago while at a financial company.

    What do I think about the stability of the solution?

    Apache Kafka is stable. Even though enterprises often use the open-source version, there are minimal issues after configuration.

    What do I think about the scalability of the solution?

    Apache Kafka is very scalable. I would rate its scalability as nine out of ten. Customers have not faced issues with user growth or data streaming needs.

    How are customer service and support?

    The Apache community provides support for the open-source version. Despite being open-source, extensive documentation is available to resolve issues.

    How would you rate customer service and support?

    Positive

    How was the initial setup?

    The initial setup of Apache Kafka is straightforward, around an eight on a scale from one to ten. The deployment process involves configuring the publisher, subscriber, and other parameters. SaaS can be deployed from the cloud in a couple of hours.

    What about the implementation team?

    Since I work with the open-source version of Kafka, solutions are managed internally with the Apache documentation.

    What's my experience with pricing, setup cost, and licensing?

    The open-source version of Apache Kafka results in minimal costs, mainly linked to accessing documentation and limited support. Enterprises usually opt for the more cost-effective open-source edition.

    What other advice do I have?

    For critical business components, it is advisable to use Confluent-managed services for Kafka. However, for non-critical functions, the open-source version is sufficient. 

    Overall, I rate Apache Kafka as nine out of ten for its scalability and stability.

    Which deployment model are you using for this solution?

    Public Cloud

    If public cloud, private cloud, or hybrid cloud, which cloud provider do you use?

    Other
    Kemal Duman

    Achieves real-time data management with fast and fault-tolerant solutions

    Reviewed on Dec 02, 2024
    Review provided by PeerSpot

    What is our primary use case?

    We are always using Apache Kafka for our real-time scenarios. It helps us detect anomalies and attacks on our website through machine learning models.

    What is most valuable?

    We are managing our data by topics. Splitting topics is more effective for us. Apache Kafka is very fast and stable. It offers scalability with ease and also integrates well with our tools. Fault tolerance is a good feature, and it also has high throughput rates.

    What needs improvement?

    Config management can be better. We are always trying to find the best configs, which is a challenge.

    For how long have I used the solution?

    I have been working with Apache Kafka for more than four years. It has been used since the beginning of our department, maybe six years.

    What do I think about the stability of the solution?

    It is very stable and meets our needs consistently.

    What do I think about the scalability of the solution?

    If there is latency, our Kubernetes admin includes our Kafka nodes to increase scalability. Kafka provides flexibility and integrates easily with Kubernetes.

    Which solution did I use previously and why did I switch?

    Before Apache Airflow, I used crontab. However, Apache Airflow makes it easy to follow and manage tasks, and data science departments can easily build their models or pipelines using it.

    What other advice do I have?

    I would rate Apache Kafka nine out of ten.

    Which deployment model are you using for this solution?

    On-premises
    Rotem Fogel

    Transforms data with efficient real-time analytics and has robust streaming capabilities

    Reviewed on Nov 18, 2024
    Review provided by PeerSpot

    What is our primary use case?

    Currently, I work for an observability company. We stream customer data into our cloud, digest the information, enrich it, transform it, save it, and use on-the-fly aggregation with Kafka. Previously, I worked for a security company doing normal detection using streaming with Kafka. 

    I also worked for a company with a data platform based on Kafka, where we ingested clickstream data and enriched it before streaming.

    What is most valuable?

    The most valuable feature of Kafka is the Kafka Streams client. Unlike other systems like Flink or Spark Streaming, you don't need a separate engine to do real-time transformations and analytics. The amount of data that can be streamed into the platform and the scalability are also significant benefits.

    What needs improvement?

    Kafka requires fine-tuning to find the best architecture, number of nodes, and partitions for your use case. It’s a trial-and-error process with no one-size-fits-all solution. Issues may arise until it’s appropriately tuned. 

    While it can scale out efficiently, scaling down is more challenging, making deleting data or reducing activity harder.

    For how long have I used the solution?

    I have been working with the Kafka product for more than ten years.

    What do I think about the stability of the solution?

    Since Kafka is written in Java, it's not as stable as it should be on the JVM. The stability depends on fine-tuning the system to find the best architecture for your use case. However, the replication factor helps avoid data loss despite the stability issues.

    What do I think about the scalability of the solution?

    Kafka's architecture allows for scalability by adding nodes and partitions to topics. However, it's not as effective in scaling in, making reducing activity and deleting data harder. 

    Scalability can be managed both manually and automatically to meet demands.

    Which solution did I use previously and why did I switch?

    I used to work with Spark Streaming and Flink, though not in the past year.

    How was the initial setup?

    If you are unfamiliar with Kafka, setting up the cluster can be quite difficult. You need to understand the architecture and components and compute the data volume upfront. For experienced individuals, the setup is less difficult yet still requires preparation.

    What was our ROI?

    From a time-saving perspective, onboarding new customers is straightforward, requiring them merely to stream their data into our platform.

    What's my experience with pricing, setup cost, and licensing?

    We use Apache Kafka, which is open-source, so we don't have fees. I can't comment on ownership costs as I am not responsible for that domain.

    Which other solutions did I evaluate?

    Apart from Kafka, I have experience working with Spark Streaming and Flink.

    What other advice do I have?

    When implementing Kafka, it's important to plan the cluster size upfront to ensure easy scalability. Adding or removing nodes can disrupt the clusters, so proper sizing and planning are key. 

    As a solution, I would rate Kafka a nine.
