Course overview:
In this workshop we take a deeper look at Kafka's architecture to understand how to get the best performance out of it. Along the way we will build several applications that highlight how different design choices lead to different performance outcomes.
Objectives
- Understand how Kafka works and what its main components are
- Understand how records are produced and consumed
- Understand how to analyze and transform records
- Learn the different models that can be used to improve performance for different scenarios
Target audience
Developers, Technical Product Managers, QA engineers, and other technical roles that are already using Kafka or are planning to use it in a new project. Attendees should be familiar with Kafka terminology but need not be experts; the aim of this workshop is to take their Kafka knowledge to the next level.
Technical requirements
- Working JDK 11 installation
- Working Docker installation
- Working Maven installation
Duration: 1 day
Agenda:
Deep dive into Kafka
- Get to know the different components of a Kafka system
- Become familiar with the terminology
Basic production and consumption
- Create a basic producer
- Create a basic consumer
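For both exercises, most of the work is in the client configuration. The sketch below shows only that configuration side, using plain `java.util.Properties` so it stands alone; the broker address `localhost:9092`, the topic name, and the group id are illustrative assumptions, and the actual `KafkaProducer`/`KafkaConsumer` calls from the kafka-clients library are indicated in comments.

```java
import java.util.Properties;

// Configuration sketch for the basic producer/consumer exercises.
// Broker address, topic, and group id below are placeholders, not
// values prescribed by the course.
public class BasicClientConfigs {

    static Properties producerConfig() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // assumed local broker
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        // With kafka-clients on the classpath:
        //   try (var producer = new KafkaProducer<String, String>(props)) {
        //       producer.send(new ProducerRecord<>("demo-topic", "key", "value"));
        //   }
        return props;
    }

    static Properties consumerConfig() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("group.id", "workshop-consumers");
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        // With kafka-clients: consumer.subscribe(List.of("demo-topic")),
        // then call poll(Duration) in a loop to receive records.
        return props;
    }

    public static void main(String[] args) {
        System.out.println(producerConfig().getProperty("key.serializer"));
    }
}
```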
A deeper look at record production
- Multithreaded production
- Producing custom datatypes
- Understanding the different production delivery guarantees and their performance implications
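The delivery-guarantee trade-off in the last bullet is largely a matter of producer configuration. The following sketch contrasts three common profiles; the property names (`acks`, `retries`, `enable.idempotence`) are standard kafka-clients configuration keys, while the broker address and the specific retry count are illustrative assumptions.

```java
import java.util.Properties;

// Sketch of three producer delivery profiles and their performance
// trade-offs. Only the configuration is shown; localhost:9092 is an
// assumed local broker.
public class DeliveryProfiles {

    // Fire-and-forget: lowest latency, but records can be lost
    // if the broker fails before persisting them.
    static Properties atMostOnce() {
        Properties p = base();
        p.put("acks", "0");
        return p;
    }

    // Wait for full acknowledgement and retry on failure: no loss,
    // but duplicates are possible when a retry follows a timed-out send.
    static Properties atLeastOnce() {
        Properties p = base();
        p.put("acks", "all");
        p.put("retries", "3"); // illustrative retry count
        return p;
    }

    // Idempotent producer: retries no longer create duplicates,
    // at some throughput cost. Requires acks=all.
    static Properties idempotent() {
        Properties p = base();
        p.put("acks", "all");
        p.put("enable.idempotence", "true");
        return p;
    }

    private static Properties base() {
        Properties p = new Properties();
        p.put("bootstrap.servers", "localhost:9092");
        return p;
    }

    public static void main(String[] args) {
        System.out.println(idempotent().getProperty("acks"));
    }
}
```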
Keeping things in balance
- Understanding record consumption
- Offset management
- Collaborative consumption: understanding consumer groups
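Offset management and consumer groups meet in one configuration decision: whether the client commits offsets automatically or the application commits them after processing. A sketch of the manual-commit setup follows; the group id, broker address, and topic name are placeholders, and the commit calls from the kafka-clients API appear in comments.

```java
import java.util.Properties;

// Consumer-group configuration sketch for the offset-management
// exercises. Group id and broker address are illustrative placeholders.
public class GroupConsumerConfig {

    static Properties manualCommitConfig() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("group.id", "balance-workshop");  // consumers sharing this id split the partitions
        props.put("enable.auto.commit", "false");   // the application commits offsets itself
        props.put("auto.offset.reset", "earliest"); // where to start when no offset is committed yet
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        // With kafka-clients:
        //   consumer.subscribe(List.of("demo-topic"));
        //   var records = consumer.poll(Duration.ofMillis(500));
        //   ... process records ...
        //   consumer.commitSync(); // commit only after processing, for at-least-once semantics
        return props;
    }

    public static void main(String[] args) {
        System.out.println(manualCommitConfig().getProperty("enable.auto.commit"));
    }
}
```

Running several copies of a consumer with the same `group.id` is what the "collaborative consumption" exercise builds on: Kafka assigns each partition to at most one member of the group.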