Kafka Consumers with Python

Overview

This article walks you through the entire process of building a Kafka producer and consumer in Python. We explore each step of the Python Kafka consumer workflow, from setting up the environment to running Zookeeper and a Kafka server. We also create a producer that pushes messages into the Kafka log, which the Python Kafka consumer then consumes.

Introduction

Apache Kafka is a highly distributed system that offers high throughput and excellent performance. It is open-source software with a publish-subscribe messaging model and a robust queue that can efficiently move massive volumes of streaming data from source to destination. Kafka offers a great way to read, store, and analyze streaming data. Built on top of the ZooKeeper synchronization service, Kafka integrates easily with other Apache technologies like Storm, Flink, and Spark. With its partitioned commit-log architecture, Kafka can process real-time data streams.

In addition, Python is a widely adopted programming language that many developers use to build seamless applications.

To learn more about Kafka and its various components, you can visit the Kafka article linked above.

Example of How to Use Kafka Consumers With Python

The sections below explain how to implement a Python Kafka consumer. We go step by step through each stage to show how easily you can implement one yourself.

Set up

To start, we need to get the environment ready. The following packages must be installed beforehand to implement a Python Kafka consumer seamlessly: a recent JDK (Kafka runs on the JVM), Apache Kafka, Zookeeper, Python 3, and the kafka-python client library.

You first need to install Kafka and Zookeeper on your system individually. Once the above packages are installed and configured, start the Zookeeper and Kafka servers.

Now, install kafka-python. If you are on the Anaconda distribution, you can use either the pip or the conda command.

It is always necessary to start the Zookeeper server before the Kafka broker, because Zookeeper must be available to monitor the Kafka brokers throughout the steps below. For our use case, Zookeeper runs on localhost:2181 and Kafka on localhost:9092 by default.

Users can execute the below command to start the Kafka server along with the ZooKeeper:
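The original command isn't shown; assuming a default Kafka installation on Linux/macOS, the usual pair of commands is the following (Zookeeper first, then the broker in a second terminal):

```shell
# From the Kafka installation directory, start Zookeeper
# (listens on localhost:2181 by default)
bin/zookeeper-server-start.sh config/zookeeper.properties

# In a separate terminal, start the Kafka broker
# (listens on localhost:9092 by default)
bin/kafka-server-start.sh config/server.properties
```

On Windows, the equivalent `.bat` scripts live under `bin\windows\`, e.g. `bin\windows\zookeeper-server-start.bat config\zookeeper.properties`.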

Create a Topic

Let's create a Kafka topic named sample_topic. Open a new command prompt in the …/kafka/bin/windows directory and run the below command:
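The original command isn't shown; a typical topic-creation command on a recent Kafka version (2.2+) looks like this:

```shell
# Create "sample_topic" with one partition and no replication
kafka-topics.bat --create --topic sample_topic --bootstrap-server localhost:9092 --partitions 1 --replication-factor 1
```

On older Kafka versions, replace `--bootstrap-server localhost:9092` with `--zookeeper localhost:2181`; on Linux/macOS, use `kafka-topics.sh` instead of the `.bat` script.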

Create a Python Producer

Let's create a Python producer that will feed messages to the Python Kafka consumer.

The code below creates the Python producer.

Write a Python Consumer

Now that the Python producer is ready, let's write the Python consumer that consumes the messages from the Kafka topic.

The Kafka topic is the same one we created in the steps above.

Create a .env File

Create a .env file to hold the connection settings that the Python Kafka consumer will read.

The file's contents are shown below.
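The original file contents aren't shown; the variable names below are assumptions chosen to match this setup, so a sample .env might contain:

```
BOOTSTRAP_SERVERS=localhost:9092
TOPIC_NAME=sample_topic
CONSUMER_GROUP=sample_group
```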

Create File consumer.py

Once the .env file is ready, let's create the consumer.py file. Start by importing the necessary libraries stated below.

Import the below-given libraries:

You also need to install the supporting packages below by executing the given commands.

Create a Consumer Class

Now, let's create the consumer class for our Python Kafka consumer scenario.

Run the Consumer

Now, execute the below command to run the consumer file.
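The exact command isn't shown in the original; from the directory containing consumer.py, it would typically be:

```shell
# Start the consumer; it keeps running and prints records as they arrive
python consumer.py
```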

This way, you can send messages from the producer and see them arrive at the consumer.

Consume Message from Consumer

Once the producer publishes messages and pushes them into Kafka, you will see the Python Kafka consumer consume the messages or data records directly.

Conclusion

  • Apache Kafka is open-source software with a publish-subscribe messaging model and a robust queue that efficiently moves massive volumes of streaming data from source to destination.
  • It is always necessary to have the Zookeeper server started before the Kafka broker.
  • In addition, Kafka offers a great way to read, store, and analyze streaming data.
  • The Python Kafka consumer directly consumes the message produced by the Kafka producer.