Introduction to Kafka in Python

Posted By : Anoop Sharma | 02-Dec-2020


Created by the Apache Software Foundation, Apache Kafka is an open-source stream-processing software platform written in Scala and Java. The project aims to provide a unified, high-throughput, low-latency platform for handling real-time data feeds. Kafka can connect to external systems (for data import/export) through Kafka Connect and provides Kafka Streams, a Java stream-processing library. Kafka uses a binary TCP-based protocol that is optimized for efficiency and relies on a "message set" abstraction that naturally groups messages together to reduce the overhead of the network round trip. This batching "leads to larger network packets, larger sequential disk operations, contiguous memory blocks [...], which allows Kafka to turn a bursty stream of random message writes into linear writes."
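The "message set" idea can be illustrated with a toy Python sketch (a simplification for intuition only, not Kafka's actual wire protocol): individual messages are grouped into fixed-size sets, so sending N messages costs roughly N / batch_size round trips instead of N.

```python
def batch_messages(messages, batch_size=100):
    """Group individual messages into 'message sets' so that one
    network round trip carries many messages instead of one."""
    return [messages[i:i + batch_size]
            for i in range(0, len(messages), batch_size)]

# 1000 messages in sets of 100 -> 10 round trips instead of 1000
message_sets = batch_messages(["msg-{}".format(i) for i in range(1000)],
                              batch_size=100)
print(len(message_sets))  # -> 10
```

In the real producer this batching is controlled by settings such as `batch_size` and `linger_ms`; the sketch only shows why fewer, larger writes are cheaper than many small ones.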


Also Read: Useful Python Packages For Django To Simplify Web Development


Kafka Setup on your machine

Requirement:   Your local environment must have Java 8+ installed.

  • Download the latest Kafka release on your machine: Kafka Latest Release
  • Extract the Kafka release on your machine

    $ tar -xzf kafka_2.13-2.6.0.tgz
    $ cd kafka_2.13-2.6.0
  • Start the Kafka Environment by performing the following steps.

    # Start the ZooKeeper service
    $ bin/zookeeper-server-start.sh config/zookeeper.properties

    In a different terminal, but in the same directory, run the following:

    # Start the Kafka broker service
    $ bin/kafka-server-start.sh config/server.properties

    Once both services are running, your Kafka environment is up and ready to be used.
  • Create Topics to store your events

    Here we create a topic named "test":
    $ bin/kafka-topics.sh --create --topic test --bootstrap-server localhost:9092
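With the topic in place, the console clients that ship with Kafka can be used to verify the setup end to end. Run each command in its own terminal (Ctrl-C to stop); lines typed into the producer terminal should appear in the consumer terminal:

```shell
# Write some events into the "test" topic (type a line, press Enter)
$ bin/kafka-console-producer.sh --topic test --bootstrap-server localhost:9092

# Read the events back from the beginning of the topic
$ bin/kafka-console-consumer.sh --topic test --from-beginning --bootstrap-server localhost:9092
```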


Also Read: Top 5 Python App Development Frameworks In 2020


Python Implementation to send and receive data in a Kafka Environment

Producer Code:


    import json
    from kafka import KafkaProducer

    def produce_kafka_streams(feeding_data, topic_name, kafka_stream_type_json):
        print("KAFKA PRODUCER INITIATION")
        if not kafka_stream_type_json:
            # Plain producer: values are sent as raw bytes
            producer = KafkaProducer(bootstrap_servers=KAFKA_HOSTS,
                                     api_version=KAFKA_VERSION)
        else:
            # JSON producer: values are serialized to UTF-8-encoded JSON
            producer = KafkaProducer(bootstrap_servers=KAFKA_HOSTS,
                                     api_version=KAFKA_VERSION,
                                     value_serializer=lambda v: json.dumps(v).encode('utf-8'))
        # Send the data to Kafka
        data_sent = producer.send(topic_name, feeding_data)
        # Block until the broker acknowledges the send (or the timeout expires)
        result = data_sent.get(timeout=60)
        print("KAFKA PRODUCER STOPPED for topic: {}".format(topic_name))
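The value_serializer passed to KafkaProducer converts Python objects into the bytes Kafka expects on the wire. Its behavior can be checked in isolation, without a running broker:

```python
import json

# The same serializer passed to KafkaProducer above
value_serializer = lambda v: json.dumps(v).encode('utf-8')

payload = value_serializer({"event": "signup", "user_id": 42})
print(payload)  # b'{"event": "signup", "user_id": 42}'
```

Any JSON-serializable value (dict, list, string, number) can be sent this way; the broker itself only ever sees opaque bytes.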


Consumer code for consuming the data produced by the Producer:


    from kafka import KafkaConsumer

    def consume_kafka_streams(topic_name):
        print("Started to look out for data to consume")
        # Creating a consumer to read the streamed data from Kafka
        consumer = KafkaConsumer(topic_name)

        for msg in consumer:
            # Each message's value arrives as raw bytes
            print(msg.value)
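When the producer sends JSON, the consumer can mirror its serializer with a value_deserializer that turns the raw bytes back into Python objects (passed as `KafkaConsumer(topic_name, value_deserializer=...)`). Like the serializer, the deserializer itself can be verified without a broker:

```python
import json

# Inverse of the producer's value_serializer
value_deserializer = lambda m: json.loads(m.decode('utf-8'))

# This is what the consumer would apply to each raw message value
raw = b'{"event": "signup", "user_id": 42}'
event = value_deserializer(raw)
print(event)  # {'event': 'signup', 'user_id': 42}
```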


In this way, you can run these scripts to send data to Kafka and receive it back via Python.


Keep Coding!!


Are you planning to launch your Python-based financial service application? We are a Python App development company that designs and delivers banking apps using Python. Get in touch with us to know more!

About Author

Anoop Sharma

Anoop is a Python developer, who has worked on Python Framework Django and is keen to increase his skillset in the field. He has a zest for learning and is capable of handling challenges. He is a team player and has good enthusiasm.
