
Event Streaming with Kafka

Level: Associate

Learn to build real-time systems that react instantly to streaming data with Apache Kafka

Course Duration: 4.22 Hours

Raghunandan Sanur

Senior Data Platform Engineer

Imagine building systems that react instantly to streams of data: fraud alerts triggered within milliseconds, dashboards pulsing with live updates, and digital platforms scaling to millions of users in real time. In this hands-on course on Apache Kafka, you’ll learn how to bring these capabilities to life. Through a dynamic mix of real-world examples, interactive labs, and practical demonstrations, you’ll progress from the basics of event streaming to deploying advanced, production-ready architectures.

Course Outline:

Foundations of Event Streaming

Begin with a solid understanding of event streaming concepts and event-driven architecture, exploring how Apache Kafka powers real-time data flows in finance and other industries. You’ll examine critical use cases, deploy your own Kafka cluster and user interface using Docker, and create your first Kafka topic.
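For a taste of the lab, here’s a minimal sketch of creating that first topic programmatically. It assumes a broker reachable at localhost:9092 (for example, the Docker deployment from this module) and the kafka-python package; the topic name "orders" is purely illustrative.

```python
# A minimal sketch, assuming a local broker at localhost:9092 (e.g. the
# Docker lab deployment) and kafka-python (pip install kafka-python).
from kafka.admin import KafkaAdminClient, NewTopic

admin = KafkaAdminClient(bootstrap_servers="localhost:9092")

# Create a topic named "orders" with 3 partitions and a single replica,
# which is appropriate for a one-broker local cluster.
admin.create_topics([NewTopic(name="orders", num_partitions=3, replication_factor=1)])
admin.close()
```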

Building Blocks of Kafka

Dive into Kafka’s core architecture, including brokers, topics, partitions, and replication, learning how these elements work together to store, organize, and reliably deliver streaming data.
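As a preview, the sketch below (same assumed local broker and kafka-python package as above) inspects a topic’s metadata to see how its partitions and replicas are spread across brokers.

```python
# A sketch of inspecting partitions and replication, assuming the same
# local broker and kafka-python package; field names follow kafka-python's
# topic metadata response.
from kafka.admin import KafkaAdminClient

admin = KafkaAdminClient(bootstrap_servers="localhost:9092")

# Each partition has a leader broker that serves reads/writes, plus a
# set of replica brokers holding copies of its data for fault tolerance.
for topic in admin.describe_topics(["orders"]):
    for p in topic["partitions"]:
        print(f"partition={p['partition']} leader={p['leader']} replicas={p['replicas']}")
admin.close()
```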

Kafka Producers & Consumers: The Message Flow

Understand how producers and consumers form the backbone of Kafka's event pipeline. Learn message serialization, the significance of message keys, and reliability strategies like acknowledgments and consumer groups. Reinforce concepts by building and configuring producers and consumers, exploring advanced operations like rebalancing.
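Here’s a minimal sketch of that message flow, under the same local-broker assumption: keys steer records to partitions, acks="all" trades latency for durability, and a shared group_id makes consumers split the partitions between them.

```python
# A producer/consumer sketch, assuming a local broker and kafka-python;
# the topic and group names are illustrative.
import json
from kafka import KafkaProducer, KafkaConsumer

# Producer: records with the same key land on the same partition (so they
# stay ordered), and acks="all" waits for all in-sync replicas to confirm.
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    key_serializer=str.encode,
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
    acks="all",
)
producer.send("orders", key="customer-42", value={"amount": 19.99})
producer.flush()

# Consumer: every consumer sharing a group_id gets a slice of the topic's
# partitions; Kafka rebalances the assignment when members join or leave.
consumer = KafkaConsumer(
    "orders",
    bootstrap_servers="localhost:9092",
    group_id="billing-service",
    auto_offset_reset="earliest",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)
for record in consumer:
    print(record.key, record.value)
```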

Deep Dive into Kafka: Beyond the Basics

Advance your skills with a closer look at Kafka’s offset management, error handling (including poison-pill scenarios), and the roles of ZooKeeper and KRaft in cluster management.
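For instance, a hedged sketch of manual offset commits with poison-pill handling might look like the following; process() is a hypothetical stand-in for your business logic.

```python
# A sketch of manual offset management and poison-pill handling, assuming
# the same local setup. Deserialization happens inside the loop so one bad
# record can be caught and skipped instead of crashing the consumer.
import json
from kafka import KafkaConsumer

def process(event):
    # Hypothetical business-logic placeholder.
    print("processed", event)

consumer = KafkaConsumer(
    "orders",
    bootstrap_servers="localhost:9092",
    group_id="billing-service",
    enable_auto_commit=False,  # we decide when an offset counts as "done"
)

for record in consumer:
    try:
        event = json.loads(record.value.decode("utf-8"))
        process(event)
    except (ValueError, UnicodeDecodeError):
        # Poison pill: log and skip rather than retrying forever.
        print(f"skipping bad record at {record.topic}[{record.partition}]@{record.offset}")
    # Commit only after the record is handled (or deliberately skipped),
    # so a crash before this point causes a replay, not data loss.
    consumer.commit()
```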

Confluent Kafka and Its Offerings

Explore the operational complexities of running Kafka at scale, and discover how Confluent Cloud simplifies management and deployment for cloud-native event streaming.
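Connecting a client to Confluent Cloud is largely a configuration change. The sketch below uses kafka-python with a placeholder bootstrap URL and placeholder API credentials standing in for values from your own Confluent Cloud account.

```python
# A sketch of connecting to a Confluent Cloud cluster with kafka-python;
# the bootstrap address and API key/secret below are placeholders.
from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="pkc-xxxxx.us-east-1.aws.confluent.cloud:9092",  # placeholder
    security_protocol="SASL_SSL",     # Confluent Cloud requires TLS + SASL
    sasl_mechanism="PLAIN",
    sasl_plain_username="<API_KEY>",     # placeholder credential
    sasl_plain_password="<API_SECRET>",  # placeholder credential
)
producer.send("orders", b"hello from the cloud")
producer.flush()
```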

Kafka Connect: Effortless Data Pipelines

Learn how Kafka Connect enables seamless streaming of data to and from external systems. Build your own pipeline to stream Kafka data to Amazon S3, gaining practical insights into connector configuration and deployment.
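As an illustration, registering an S3 sink connector is a single call to the Kafka Connect REST API. This sketch assumes a Connect worker at localhost:8083 with the Confluent S3 sink plugin installed; the bucket name is a placeholder.

```python
# A sketch of registering an S3 sink connector via the Kafka Connect REST
# API; assumes Connect at localhost:8083 with the Confluent S3 sink plugin.
import requests

connector = {
    "name": "orders-to-s3",
    "config": {
        "connector.class": "io.confluent.connect.s3.S3SinkConnector",
        "topics": "orders",
        "s3.bucket.name": "my-kafka-archive",  # placeholder bucket
        "s3.region": "us-east-1",
        "storage.class": "io.confluent.connect.s3.storage.S3Storage",
        "format.class": "io.confluent.connect.s3.format.json.JsonFormat",
        "flush.size": "100",  # records written per S3 object
    },
}

resp = requests.post("http://localhost:8083/connectors", json=connector)
print(resp.status_code, resp.json())
```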

Building an Event-Driven System

Design and implement a complete event-driven architecture utilizing Kafka, from infrastructure setup on AWS EC2 to frontend and backend integration. Consolidate your skills with a capstone lab, bringing together all course concepts for real-world application in an interactive environment.
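To give a flavor of the capstone, here is a heavily condensed sketch with hypothetical service names: the backend publishes an event, and a separate service reacts to it, with the two decoupled entirely through Kafka.

```python
# A condensed, hypothetical sketch of the capstone's event flow,
# assuming the same local broker.
import json
from kafka import KafkaProducer, KafkaConsumer

def place_order(producer, order_id, amount):
    # Backend write path: publish the fact to Kafka instead of calling
    # downstream services directly, so the services stay decoupled.
    producer.send("orders", json.dumps({"order_id": order_id, "amount": amount}).encode("utf-8"))
    producer.flush()

def run_notification_service():
    # Independent service: subscribes to the same topic and reacts at its
    # own pace; scaling out is just adding members to the consumer group.
    consumer = KafkaConsumer("orders", bootstrap_servers="localhost:9092",
                             group_id="notifications")
    for record in consumer:
        order = json.loads(record.value.decode("utf-8"))
        print(f"notify customer: order {order['order_id']} received")

if __name__ == "__main__":
    place_order(KafkaProducer(bootstrap_servers="localhost:9092"), "o-1001", 19.99)
```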

Target Audience:

This course is ideal for software developers, DevOps engineers, data engineers, and technical leads eager to design, deploy, and manage real-time, event-driven systems using industry-leading technologies.

By the end of this course, you’ll be fully equipped to leverage Kafka and related tools for real-time data streaming, unlocking new levels of innovation and efficiency for your projects and organization.

Our students work at:

VMware, Microsoft, Google, Dell, Apple, Pivotal, Amazon

About the instructor

Raghunandan Sanur

Senior Data Platform Engineer

Raghunandan has 8+ years of IT experience in data platforms and data engineering. He works closely with the DevOps tech stack and likes to bridge the gap between the data and DevOps worlds. He has worked across different sectors and startup ecosystems, and he brings 7+ years of training experience focused on teaching concepts through tech and business use cases, giving learners a clear understanding of why a tool or tech stack is worth learning.
