Learn Apache Kafka 2021 – Best Apache Kafka courses & Best Apache Kafka tutorials

Best Apache Kafka Courses 2021

[ufwp id="1075642,1294188,1141696" template="grid"]

Best Apache Kafka Tutorials 2021

Apache Kafka Series – Learn Apache Kafka for Beginners v2

[ufwp id="1075642"]

Welcome to the Apache Kafka series! Join a community of over 20,000 students learning Kafka. Apache Kafka has become the leading enterprise big data technology for distributed data streaming. Kafka is used in production by over 33% of Fortune 500 companies such as Netflix, Airbnb, Uber, Walmart, and LinkedIn.

To learn Kafka easily, step by step, you have come to the right place! No prior knowledge of Kafka is required.

If you look at the documentation, you can see that Apache Kafka is not easy to learn on your own…

Thanks to my many years of experience in Kafka and Big Data, I wanted to make learning Kafka accessible to everyone.

We will take a step-by-step approach to learning all the fundamentals of Apache Kafka.
By the end of this course, you will be productive and know the following:

The architecture of the Apache Kafka ecosystem

The basic concepts of Kafka: topics, partitions, brokers, replicas, producers, consumers and more!

Launch your own Kafka cluster in no time using native Kafka binaries – Windows / macOS / Linux

Learn and practice using the Kafka Command Line Interface (CLI)

Code producers and consumers using the Java API

Real world project using Twitter as a source of data for a producer and Elasticsearch as a sink for our consumer

Overview of advanced APIs (Kafka Connect, Kafka Streams)

Real-world case studies and big data use cases

Introducing Advanced Kafka for Administrators

Advanced topic configurations

Appendices (starting a Kafka cluster locally, using Docker, etc.)
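The CLI topics listed above boil down to a handful of commands. As a rough sketch (this assumes a broker already running on localhost:9092, and the topic and group names are purely illustrative; on Windows the scripts end in .bat instead of .sh):

```shell
# Create a topic with 3 partitions and a replication factor of 1
kafka-topics.sh --bootstrap-server localhost:9092 \
  --create --topic first_topic --partitions 3 --replication-factor 1

# Produce messages typed on stdin into the topic
kafka-console-producer.sh --bootstrap-server localhost:9092 --topic first_topic

# Consume the topic from the beginning as part of a consumer group
kafka-console-consumer.sh --bootstrap-server localhost:9092 \
  --topic first_topic --from-beginning --group my-first-group

# Inspect the group's committed offsets and lag
kafka-consumer-groups.sh --bootstrap-server localhost:9092 \
  --describe --group my-first-group
```

Note that very old Kafka releases used a `--zookeeper` flag for `kafka-topics` instead of `--bootstrap-server`; the course covers the modern form.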

Note: The hands-on sections are based on Java, which is Kafka’s native programming language. But good news! What you learn in Java will be fully applicable to other programming languages, such as Python, C#, Node.js or Scala, and to Big Data frameworks such as Spark, NiFi or Akka.

You will learn:

Understand the ecosystem, architecture, basic concepts and operations of Apache Kafka
Basic concepts such as topics, partitions, brokers, producers, consumers
Start a personal Kafka development environment
Learn the main CLIs: kafka-topics, kafka-console-producer, kafka-console-consumer, kafka-consumer-groups, kafka-configs
Create your producers and consumers in Java to interact with Kafka
Program a real-world Twitter producer and Elasticsearch consumer
Extensive overview of APIs (Kafka Connect, Kafka Streams), case studies and Big Data architecture
Practice and understand log compaction
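To give a feel for the Java API mentioned above, here is a minimal producer sketch. It is not the course's own code: it assumes the `org.apache.kafka:kafka-clients` dependency on the classpath and a broker running on localhost:9092, and the topic name is illustrative.

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class SimpleProducer {
    public static void main(String[] args) {
        // Minimal producer configuration; the broker address is illustrative
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        // try-with-resources closes the producer, which flushes pending sends
        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // send() is asynchronous; the record goes to a partition of first_topic
            producer.send(new ProducerRecord<>("first_topic", "hello Kafka"));
        }
    }
}
```

A consumer follows the same pattern with `KafkaConsumer`, a `group.id`, `subscribe()`, and a `poll()` loop.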

Apache Kafka Series – Kafka Streams for Data Processing

[ufwp id="1294188"]

Discover the Kafka Streams data processing library, for Apache Kafka. Join hundreds of savvy students to learn one of the most promising data processing libraries on Apache Kafka. Kafka Streams is the easiest way to write your apps on Kafka:

> The easiest way to transform your data using high level DSL
> Support for Exactly Once Semantics out of the box!
> Deploy and upgrade your Kafka Streams application without a dedicated processing cluster!
> Perform aggregations, joins, and whatever else you can think of using just a few lines of code!
> Built on Kafka, for fault tolerance, scalability and resilience
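To show what "a few lines of code" with the high-level DSL looks like, here is a sketch of the classic WordCount topology. This is an illustration, not the course's exact code: it assumes the `org.apache.kafka:kafka-streams` dependency, a broker on localhost:9092, and hypothetical topic names.

```java
import java.util.Arrays;
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.KTable;
import org.apache.kafka.streams.kstream.Produced;

public class WordCountApp {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "wordcount-app");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> lines = builder.stream("word-count-input");
        KTable<String, Long> counts = lines
            .mapValues(v -> v.toLowerCase())                     // normalize case
            .flatMapValues(v -> Arrays.asList(v.split("\\W+")))  // split lines into words
            .selectKey((key, word) -> word)                      // re-key each record by word
            .groupByKey()                                        // group records per word
            .count();                                            // stateful aggregation

        counts.toStream().to("word-count-output",
                Produced.with(Serdes.String(), Serdes.Long()));

        new KafkaStreams(builder.build(), props).start();
    }
}
```

Note that the stateless steps (`mapValues`, `flatMapValues`, `selectKey`) and the stateful ones (`groupByKey`, `count`) correspond to the simple and advanced KStream/KTable operations covered in the sections below.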

Note: This course is based on Java 8 and includes one example in Scala. Kafka Streams is a Java library, so it is not suitable for other programming languages.

Each section is either theoretical or practical.

> In the practical sections, you will be challenged to write your own Kafka Streams app. The solutions are explained in detail, and you will pick up tips on the best ways to use Kafka Streams.

> In the theoretical sections, you will learn about all the available APIs, the inner workings of the library, and exciting concepts such as the Exactly Once Semantics!

This course is the first and only Kafka Streams course available on the web. Get it now to become a Kafka expert!

Section plan:

Kafka Streams – First Look: Let’s launch Kafka and run your first Kafka Streams application, WordCount

End-to-end Kafka Streams app: Write code for WordCount, import dependencies, build and package your app, and learn how to scale it. This is a complete end-to-end example

KStream and KTable Simple Operations: Discover all the stateless operations available for KStream and KTable API

Practice Exercise – Favorite Color: Practice your newly learned skills by writing your own Kafka Streams application, the Favorite Color app. It will be difficult! Includes a Scala version of the example

KStream and KTable Advanced Operations: Discover all the stateful operations available for KStream and KTable API

Exactly Once Semantics – Theory: Find out what EOS (Exactly Once Semantics) is, how Kafka 0.11 enables it, and how to enable it in Kafka Streams

Exactly Once – Practice Exercise – Bank Balance: Put your new knowledge into practice by writing your own Exactly Once Kafka Streams application, computing a running bank balance for your customers

Testing your Kafka Streams application: Learn how to test WordCount Kafka Streams topology with Kafka Streams v1.1.0

Apache Kafka Series – Kafka Connect Hands-on Learning

[ufwp id="1141696"]

Kafka Connect is a tool for scalable and reliable streaming of data between Apache Kafka and other data systems. It provides a common framework for connectors that produce data to and consume data from Apache Kafka.

Apache Kafka Connect offers an API, a runtime, and a REST service that let developers define connectors that move large data sets into and out of Apache Kafka in real time. As an extension of Apache Kafka, it inherits strong properties such as fault tolerance and elasticity. Kafka Connect can ingest entire databases, collect metrics, and gather logs from all your application servers into Apache Kafka topics, making the data available for stream processing with low latency.

Kafka Connect standardizes the integration of other data systems with Apache Kafka, simplifying the development, deployment and management of connectors.

In this course, we will learn how to deploy, configure, and manage Kafka connectors with hands-on exercises. We will also see how the distributed and standalone modes let Kafka Connect scale up to a large, centrally managed service supporting an entire organization, or scale down to development, testing, and small production deployments. Finally, you will use the REST interface to submit and manage connectors in your Kafka Connect cluster through an easy-to-use REST API.
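Managing connectors over that REST interface looks roughly like the following. This is a hedged sketch, not the course's own material: it assumes a Connect worker on the default REST port 8083, and the connector name, file path, and topic are illustrative.

```shell
# List the connector plugins installed on the Connect cluster
curl http://localhost:8083/connector-plugins

# Submit a FileStreamSource connector that tails a file into a topic
curl -X POST http://localhost:8083/connectors \
  -H "Content-Type: application/json" \
  -d '{
        "name": "file-source-demo",
        "config": {
          "connector.class": "org.apache.kafka.connect.file.FileStreamSourceConnector",
          "file": "/tmp/input.txt",
          "topic": "file-source-topic"
        }
      }'

# Check the connector and task status, then remove the connector
curl http://localhost:8083/connectors/file-source-demo/status
curl -X DELETE http://localhost:8083/connectors/file-source-demo
```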

Overview of course content:

Section 1 – Course Introduction: In this section we will see the prerequisites required for this course and for Apache Kafka Connect. We will also talk about the objectives and structure of the course.

Section 2 – Apache Kafka Connect Concepts: In this section we will find out what Kafka Connect is and look at its architecture; we will talk about connectors, configuration, tasks, and workers. We will also learn the difference between the standalone and distributed modes of Kafka Connect.
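The standalone/distributed difference shows up directly in the worker configuration. As a rough sketch (all values are illustrative defaults, not the course's exact config): a standalone worker keeps its offsets in a local file, while distributed workers coordinate through a group id and store offsets, configs, and status in Kafka topics.

```properties
# Standalone worker (e.g. connect-standalone.properties):
bootstrap.servers=localhost:9092
key.converter=org.apache.kafka.connect.json.JsonConverter
value.converter=org.apache.kafka.connect.json.JsonConverter
offset.storage.file.filename=/tmp/connect.offsets

# Distributed worker (e.g. connect-distributed.properties):
bootstrap.servers=localhost:9092
group.id=connect-cluster
offset.storage.topic=connect-offsets
config.storage.topic=connect-configs
status.storage.topic=connect-status
```

Standalone mode is launched with the worker and connector properties files on the command line; in distributed mode you start the workers and then submit connectors through the REST API.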

Section 3 – Setting Up and Launching the Kafka Connect Cluster: In this section we will learn how to install Docker on your machine and get started with Apache Kafka Connect in the easiest way possible, using Docker Compose.

Section 4 – Apache Kafka Connect Data Source – Practice: In this section, we will gain hands-on experience with Kafka Connect data sources. We will learn the architecture of a Kafka Connect data source and the list of available connectors. We will get hands-on practice with the FileStream Source Connector, in both standalone and distributed mode, and with the Twitter Source Connector!

Section 5 – Apache Kafka Connect Data Sink – Practice: In this section, we will gain hands-on experience with Kafka Connect data sinks. We will learn the Kafka Connect data sink architecture and the Apache Kafka Connect REST APIs, and we will get hands-on practice with the Elasticsearch Sink Connector and the JDBC Sink Connector!

Section 6 – Next Steps: In this section we will wrap up the course and see what next steps you can take.

As an Amazon Associate I earn from qualifying purchases.