Confluent

Confluent provides a commercial platform for managing continuous data streams using Apache Kafka. Its certifications validate skills in administering Kafka infrastructure and in developing applications that produce or consume real-time data.

The Business of Data in Motion

In 2014, the original creators of Apache Kafka left LinkedIn to build a commercial platform around their open-source project. They founded Confluent to help enterprises manage continuous data streams rather than static batches.

The company grew by solving a specific enterprise problem: raw Apache Kafka is notoriously difficult to scale, secure, and maintain. Confluent provided the management layer and cloud-native infrastructure that large organizations required.

By late 2025, Confluent generated over $1.1 billion in annual subscription revenue and entered into an $11 billion acquisition agreement with IBM. Over 1,500 enterprises spend more than $100,000 annually on Confluent software. These organizations rely on technical staff who understand how to configure, deploy, and build applications upon this specific infrastructure.

Confluent Certification Paths

Unlike major cloud providers with multi-tiered credential systems, Confluent keeps its certification program narrow. The vendor offers two primary exams targeting the two main ways IT professionals interact with Kafka: administration and development.

Both exams run for 90 minutes and contain roughly 60 multiple-choice and multiple-select questions. Confluent does not officially publish the required passing score, though successful candidates generally report needing to answer around 70 to 75 percent of the questions correctly.

For Infrastructure Professionals: CCAAK

The CCAAK (Confluent Certified Administrator for Apache Kafka) proves you can build and maintain the underlying data streaming infrastructure.

Kafka clusters require precise tuning to prevent bottlenecks. This exam tests your knowledge of broker configuration, deployment architecture, and cluster scaling. You must demonstrate competence in securing the environment through authentication and encryption, as well as setting up observability metrics to catch latency issues before they cause system failures.
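As a sketch of the kind of hardening and durability tuning the exam covers, a broker's server.properties might combine authenticated, encrypted listeners with replication guardrails. The hostnames, file paths, and values below are placeholders for illustration, not recommendations:

```properties
# Illustrative broker settings (server.properties); all values are placeholders.
listeners=SASL_SSL://broker1.example.com:9093
security.inter.broker.protocol=SASL_SSL
sasl.enabled.mechanisms=SCRAM-SHA-512
ssl.keystore.location=/etc/kafka/secrets/broker.keystore.jks

# Durability guardrails: three copies of each partition, and producers
# using acks=all must reach at least two in-sync replicas to succeed.
default.replication.factor=3
min.insync.replicas=2
```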

Candidates also need to understand the evolution of Kafka's architecture. Recent versions of Kafka replaced ZooKeeper with KRaft (Kafka Raft) for cluster metadata management. The exam evaluates your understanding of these consensus protocols and how they affect cluster stability.
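A minimal KRaft-mode configuration illustrates the shift: broker and controller roles are declared directly in server.properties, with no ZooKeeper connection string. Node IDs, ports, and hostnames below are placeholders:

```properties
# Minimal KRaft-mode settings (illustrative; IDs and hosts are placeholders).
process.roles=broker,controller
node.id=1
controller.quorum.voters=1@controller1.example.com:9093
controller.listener.names=CONTROLLER
listeners=PLAINTEXT://:9092,CONTROLLER://:9093
listener.security.protocol.map=CONTROLLER:PLAINTEXT,PLAINTEXT:PLAINTEXT
```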

For Software Engineers: CCDAK

The CCDAK (Confluent Certified Developer for Apache Kafka) targets software engineers building applications that produce or consume real-time data.

Passing this exam requires a deep understanding of Kafka's core mechanics. You need to know how producers serialize and partition data, and how consumers read from topics using offset management and consumer groups.
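Partitioning matters because per-key ordering depends on it: records with the same key must land in the same partition. A simplified Python sketch of the idea follows; the real Java client hashes keys with murmur2, so the hash function here is a deliberately simple stand-in for illustration:

```python
# Simplified sketch of Kafka's keyed partitioning (illustrative only).
# The actual Java client computes murmur2(key) & 0x7FFFFFFF % numPartitions;
# we substitute a basic deterministic hash to show the principle.

def partition_for_key(key: bytes, num_partitions: int) -> int:
    """Map a record key to a partition index; same key -> same partition."""
    h = 0
    for b in key:
        h = (h * 31 + b) & 0x7FFFFFFF  # keep the hash non-negative
    return h % num_partitions

# The same key always maps to the same partition, which is what
# preserves per-key ordering across producer sends.
assert partition_for_key(b"user-42", 6) == partition_for_key(b"user-42", 6)
```

Consumers in the same consumer group then divide these partitions among themselves, each tracking its own committed offset per partition.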

The test also covers the broader ecosystem. You will answer questions about Kafka Streams for real-time data processing and Kafka Connect for integrating with external databases. Employers look for this credential when hiring engineers to design data pipelines that cannot drop messages or process them out of order.
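For context on what a Connect integration looks like in practice, a sink connector is typically defined as a JSON document submitted to the Connect REST API. The connector class below is Confluent's JDBC sink; the connector name, topic, and connection URL are hypothetical:

```json
{
  "name": "orders-jdbc-sink",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
    "tasks.max": "2",
    "topics": "orders",
    "connection.url": "jdbc:postgresql://db.example.com:5432/analytics",
    "auto.create": "true"
  }
}
```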

Market Value and Practical Application

Kafka skills carry a premium in the job market, but large enterprises rarely run raw open-source Kafka on their own hardware anymore; they run Confluent Platform on-premises or Confluent Cloud.

Holding a CCAAK or CCDAK signals to hiring managers that you understand enterprise-grade data streaming. A developer with a CCDAK knows how to write efficient producer applications that will not overwhelm the brokers with network requests. An administrator with a CCAAK knows how to configure partition replication so a sudden hardware failure does not result in permanent data loss.
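The producer efficiency the CCDAK tests often comes down to a handful of client settings that trade a little latency for batching and compression while keeping delivery safe. The values below are illustrative, not tuned recommendations:

```properties
# Illustrative producer settings (client-side); values are examples only.
acks=all                  # wait for the full in-sync replica set
enable.idempotence=true   # prevent duplicates on retry
linger.ms=20              # wait briefly so records batch together
batch.size=65536          # larger batches mean fewer network requests
compression.type=lz4      # shrink payloads before they hit the wire
```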

As organizations increasingly feed real-time data into artificial intelligence models, the underlying streaming infrastructure becomes a critical point of failure. A model generating responses based on outdated information holds little value. Companies investing millions in real-time AI context rely on certified professionals to keep those data pipelines flowing without interruption.