Bootstrapping Kafka and managing topics in 2 minutes

This is a short, straightforward guide that will help you start a Strimzi Kafka service locally, using containers. Once it's running, you'll also see how to create and list topics, and how to publish and consume events from them.

Prerequisite: you'll only need a container runtime such as Docker or Podman.

The examples in this guide use Docker, and everything is done from the command line.

Special thanks to Hugo Guerrero for the strimzi-all-in-one project that allows us to easily start the required services all together.

Starting Kafka

1. First, you need the docker-compose file on your machine. Clone the repository:

git clone https://github.com/hguerrero/amq-examples

This docker-compose file bootstraps everything you need to get Strimzi up and running: ZooKeeper, a Kafka server (v2.5.0), Apicurio Registry, and a Kafka Bridge.

2. Change into the strimzi-all-in-one folder:

cd amq-examples/strimzi-all-in-one/

3. Start the Kafka environment:

docker-compose up 

Done. You now have a Kafka server running on localhost, port 9092.

TIP: The first time you run it, Docker will download the images you don't have yet. On my machine, it took about 2.5 minutes to download and start all the services. Once the images are cached locally, it usually starts in less than five seconds.
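Before moving on, you can do a quick sanity check that the broker is actually up. This is not part of the original setup, just a way to confirm the container is healthy and accepting connections:

```shell
# List the containers started by docker-compose and their status.
docker-compose ps

# Ask the broker for its supported API versions; if this prints a
# version list, Kafka is accepting connections on localhost:9092.
docker-compose exec kafka bin/kafka-broker-api-versions.sh --bootstrap-server localhost:9092
```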

Creating Topics

Still in the strimzi-all-in-one folder, you can create a new topic with the following command. In this example, the topic name is "my-topic":

docker-compose exec kafka bin/kafka-topics.sh --create --bootstrap-server localhost:9092 --replication-factor 1 --partitions 1 --topic my-topic
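To confirm the topic was created with the expected settings, you can also describe it; the output shows the partition count, replication factor, and leader assignment:

```shell
# Show partition count, replication factor, and leader for my-topic.
docker-compose exec kafka bin/kafka-topics.sh --describe --bootstrap-server localhost:9092 --topic my-topic
```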

Listing topics

You can list all the topics on this server:

docker-compose exec kafka bin/kafka-topics.sh --list --bootstrap-server localhost:9092

Publishing events to topics

To publish events, you can simply use the console producer script that ships with Kafka. This example publishes a JSON message to the "my-topic" topic we created earlier.

docker-compose exec kafka bin/kafka-console-producer.sh --topic my-topic --bootstrap-server localhost:9092

The producer starts and waits for input. You could send, for example:

{"data" : { "description": "test message", "priority": "low"}}
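If you prefer to send the message non-interactively, you can pipe it into the producer instead. The -T flag tells docker-compose not to allocate a pseudo-TTY, so the pipe works (a sketch; exact behavior can vary between docker-compose versions):

```shell
# Pipe a single JSON message into the console producer and exit.
echo '{"data": {"description": "test message", "priority": "low"}}' | \
  docker-compose exec -T kafka bin/kafka-console-producer.sh --topic my-topic --bootstrap-server localhost:9092
```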

Consuming events from topics

You can consume the events that were published to a specific topic. Let's consume the events from "my-topic":

docker-compose exec kafka bin/kafka-console-consumer.sh --topic my-topic --from-beginning --bootstrap-server localhost:9092
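If you start several consumers and want them to share the work rather than each receiving every event, you can put them in the same consumer group. The group name my-group below is just an illustrative choice:

```shell
# Consumers started with the same --group value split the topic's
# partitions between them; with a single partition, only one consumer
# in the group receives events at a time.
docker-compose exec kafka bin/kafka-console-consumer.sh --topic my-topic --group my-group --bootstrap-server localhost:9092
```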

Deleting a topic

Finally, delete “my-topic” from this Kafka server:

docker-compose exec kafka bin/kafka-topics.sh --delete --bootstrap-server localhost:9092 --topic my-topic
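To confirm the deletion, you can list the topics again; "my-topic" should no longer appear:

```shell
# After deletion, my-topic should be absent from this list.
docker-compose exec kafka bin/kafka-topics.sh --list --bootstrap-server localhost:9092
```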

With this, you now have on your machine a basic infrastructure to get started playing with Kafka and event streaming.
