Indal Kumar
Building a Kafka Producer and Consumer in Go

Apache Kafka is a powerful distributed streaming platform used for building real-time data pipelines and streaming applications. In this blog post, we'll walk through setting up a Kafka producer and consumer in Go.

Prerequisites

Before we begin, make sure you have the following installed on your machine:

  • Go (1.16 or higher)

  • Docker and Docker Compose (for running Kafka locally)

Setting Up Kafka with Docker

To quickly set up Kafka, we’ll use Docker. Create a docker-compose.yml file in your project directory:

version: '3.7'
services:
  zookeeper:
    image: wurstmeister/zookeeper:3.4.6
    ports:
      - "2181:2181"
  kafka:
    image: wurstmeister/kafka:2.13-2.7.0
    ports:
      - "9092:9092"
    environment:
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://localhost:9092
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
    depends_on:
      - zookeeper

Run the following command to start Kafka and Zookeeper:

docker-compose up -d

Creating a Kafka Producer in Go

First, initialize a new Go module:

go mod init kafka-example

Install the kafka-go library:

go get github.com/segmentio/kafka-go

Now, create a file producer.go and add the following code:

package main

import (
    "context"
    "fmt"
    "log"
    "time"

    "github.com/segmentio/kafka-go"
)

func main() {
    // The writer is safe for concurrent use and manages its own connections.
    writer := &kafka.Writer{
        Addr:     kafka.TCP("localhost:9092"),
        Topic:    "example-topic",
        Balancer: &kafka.LeastBytes{}, // route each message to the least-loaded partition
    }
    defer writer.Close()

    for i := 0; i < 10; i++ {
        msg := kafka.Message{
            Key:   []byte(fmt.Sprintf("Key-%d", i)),
            Value: []byte(fmt.Sprintf("Hello Kafka %d", i)),
        }

        if err := writer.WriteMessages(context.Background(), msg); err != nil {
            log.Fatal("could not write message: " + err.Error())
        }

        fmt.Printf("Produced message: %s\n", msg.Value)
        time.Sleep(1 * time.Second)
    }
}

This code sets up a Kafka producer that sends ten messages, one per second, to the example-topic topic.

Run the producer:

go run producer.go

You should see output indicating that messages have been produced.

Creating a Kafka Consumer in Go

Create a file consumer.go and add the following code:

package main

import (
    "context"
    "fmt"
    "log"

    "github.com/segmentio/kafka-go"
)

func main() {
    // GroupID enables consumer-group semantics: offsets are committed for
    // the group, and partitions are balanced across group members.
    reader := kafka.NewReader(kafka.ReaderConfig{
        Brokers: []string{"localhost:9092"},
        Topic:   "example-topic",
        GroupID: "example-group",
    })
    defer reader.Close()

    for {
        // ReadMessage blocks until a message arrives or the context is cancelled.
        msg, err := reader.ReadMessage(context.Background())
        if err != nil {
            log.Fatal("could not read message: " + err.Error())
        }
        fmt.Printf("Consumed message: %s\n", msg.Value)
    }
}

This consumer reads messages from the example-topic topic and prints them to the console.

Run the consumer:

go run consumer.go

You should see output indicating that messages have been consumed.

Conclusion

In this blog post, we demonstrated how to set up a Kafka producer and consumer in Go. This simple example shows the basics of producing and consuming messages, but Kafka's capabilities extend far beyond this. With Kafka, you can build robust, scalable real-time data processing systems.

Feel free to explore more advanced features such as message partitioning, key-based message distribution, and integrating with other systems. Happy coding!
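As a taste of key-based distribution: a keyed partitioner hashes the message key so that the same key always lands on the same partition, preserving per-key ordering (kafka-go ships balancers such as kafka.Hash for this). Here is a stand-alone sketch of the idea using the standard library's FNV hash; note that real Kafka clients use their own hash functions (the Java client uses murmur2), so the actual partition numbers will differ:

```go
package main

import (
    "fmt"
    "hash/fnv"
)

// partitionFor maps a key to one of n partitions by hashing it.
// Illustrates the idea behind key-based distribution only; it does
// not reproduce the hashing of any particular Kafka client.
func partitionFor(key string, n int) int {
    h := fnv.New32a()
    h.Write([]byte(key))
    return int(h.Sum32() % uint32(n))
}

func main() {
    // The same key always maps to the same partition.
    for _, key := range []string{"user-1", "user-2", "user-1"} {
        fmt.Printf("%s -> partition %d\n", key, partitionFor(key, 3))
    }
}
```

Swapping the producer's Balancer from &kafka.LeastBytes{} to &kafka.Hash{} gives you this sticky, key-based behavior in the earlier example.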

