Spring Boot + Apache Kafka Tutorial - #8 - Create REST API to Send Message


Welcome to the Spring Boot + Apache Kafka Tutorial series. In this lecture, we will create a REST API to send messages to the Kafka producer, and the Kafka producer in turn publishes those messages to the Kafka topic.

Lecture - #8 - Create REST API to Send Message

Transcript:

Hi. Welcome back. In this lecture, we will create a REST API to send a message to the Kafka producer. Well, if you look at the Kafka architecture over here, in the previous lecture we created a Kafka producer, right? In this lecture, we will create a REST API to send a message to the Kafka producer, and then the Kafka producer will send that message to the Kafka topic. Okay, let's head over to IntelliJ IDEA and create one simple REST API to send a message from the client.

Well, let me create a new package and call it controller. Within the controller package, let's create a class and name it MessageController. Hit enter. Let's annotate this controller class with the @RestController annotation in order to make this class a Spring MVC REST controller, and let's also use the @RequestMapping annotation to define the base URL for the REST API. So here I am going to use /api/v1/kafka.

Well, within the MessageController we're going to create a REST endpoint, but before that we need to inject the Kafka producer. So let's declare private KafkaProducer kafkaProducer, and we're going to use constructor-based dependency injection to inject the KafkaProducer. So let me create a constructor over here: generate constructor, and okay. Note that here we're not going to use the @Autowired annotation, because if a Spring bean has only one constructor, then we can avoid using the @Autowired annotation. You can still use @Autowired, but from Spring 4.3 onwards we can omit this annotation if the Spring bean contains only one parameterized constructor. As you can see, in our case this MessageController Spring bean has only one parameterized constructor, so the Spring IoC container will inject this dependency by default.

Okay, great. Now let's create the REST endpoint. Let's say public, use ResponseEntity as the return type with String as the type parameter, and name the method sendMessage, or let's say publish, with String message as the method parameter. Let's annotate this method with the @GetMapping annotation and pass the URI, something like /publish. Then, in order to get the value from the URL, we're going to use a query parameter. So let's use the @RequestParam annotation to get the value from the query parameter, and let's provide the key as message. Okay, so we're basically going to retrieve a value from the query parameter; that's why we're using the @RequestParam annotation here, and the key that we define for the query parameter is message.

Okay, something like this. So let me create the REST endpoint URL for you: http://localhost:8080/api/v1/kafka/publish. Then, in order to define the query parameter, we use a question mark followed by the query parameter key, that is message, and then we provide a value, let's say Hello World. Okay, so this is the value that we have provided over here for the message key, and in Spring we use the @RequestParam annotation to retrieve the value of this query parameter, and that value will be stored in this message parameter.

All right, great. Now let's go ahead and call the Kafka producer to pass this message. So here I am going to call kafkaProducer, call its sendMessage method, and pass the message to it, and then simply return ResponseEntity.ok() and, as the response body, pass a message like "Message sent to the topic". Now we have built a simple REST API that publishes a message to the KafkaProducer, and the KafkaProducer internally uses a KafkaTemplate to send that message to the Kafka topic.
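Here is a minimal sketch of the MessageController described above. It assumes the KafkaProducer class from the previous lecture exposes a sendMessage(String) method; the package names are assumptions and may differ in your project:

```java
package net.javaguides.springbootkafka.controller; // assumed package name

import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;

import net.javaguides.springbootkafka.kafka.KafkaProducer; // producer from the previous lecture (assumed package)

@RestController
@RequestMapping("/api/v1/kafka")
public class MessageController {

    private final KafkaProducer kafkaProducer;

    // Single parameterized constructor, so @Autowired is not required (Spring 4.3+)
    public MessageController(KafkaProducer kafkaProducer) {
        this.kafkaProducer = kafkaProducer;
    }

    // Example call: GET http://localhost:8080/api/v1/kafka/publish?message=Hello%20World
    @GetMapping("/publish")
    public ResponseEntity<String> publish(@RequestParam("message") String message) {
        kafkaProducer.sendMessage(message);
        return ResponseEntity.ok("Message sent to the topic");
    }
}
```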
Now let's run the Spring Boot application and call this REST API to send a message to the topic. Well, let me run the Spring Boot application from here, and you can see the Spring Boot application is running on the embedded Tomcat server on port 8080. Now let's go to the browser. From IntelliJ IDEA let me simply copy this URL, paste it into a browser, and hit enter, and there we go: the REST API has returned the response "Message sent to the topic". It means that the REST API we have built to send a message to the Kafka topic is working as expected, right?

Now, let's go ahead and verify whether this Hello World message was successfully sent to the Kafka topic or not. In order to do that, let's go to the Kafka quickstart page over here and go through the steps. Step three: create a topic to store your events. We have already created a topic in the Kafka cluster using the Spring Boot application, so let's move to the next step. Write some events into the topic. Well, we have already created a Kafka producer and a simple REST API to send a message to the topic, so let's move to the next step. Read the events or messages from the topic. Well, we use this command to verify whether the message was written to the Kafka topic or not. Okay. This is a Kafka-provided sh file; it basically acts as a consumer. So let me simply copy this command and execute it in a terminal so that we can see the messages that we have sent to the Kafka topic.

So let's go to the terminal, and in the terminal open a new shell (if you are using Windows, then make sure you open a Windows command prompt). Go to the Downloads folder, since we have Kafka in the Downloads folder, right? So cd Downloads, cd into the Kafka folder, and from here we simply paste the command. Well, within this command we need to change the topic name. Here you can see the topic name quickstart-events, but we have given our topic the name javaguides, right? So let's go ahead and change this topic name to javaguides and hit enter, and here you can see: these are the messages that are in the Kafka topic. Okay. And we have sent the Hello World message, right? So now we can read that Hello World message from the Kafka topic.

Well, let me send one more message. Let's say "test Kafka message in a topic", something like this, and hit enter. Let's go to the terminal, and there we go: "test Kafka message in a topic". So it means that the Kafka producer has sent the message to the Kafka topic, and the console consumer has consumed that message from the Kafka topic. Well, let's go ahead and send one more message to the topic. Let's say "Kafka topic 123", something like that, and go back to the terminal, and you can see "Kafka topic 123" successfully read from the Kafka topic. So we can use this kafka-console-consumer.sh file to simply read the events or messages from the Kafka topic. Okay, now we have seen how to read events or messages from the topic by using the command line. In the next lecture, we'll create a Kafka consumer to consume the messages from the Kafka topic. All right, I will see you in the next lecture.
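For reference, the console consumer command from the Kafka quickstart looks like this once the topic name is changed to javaguides (this assumes Kafka is running on the default localhost:9092 and that you run the command from the extracted Kafka folder; on Windows, use the equivalent .bat script under bin\windows):

```shell
# Reads all messages in the javaguides topic from the beginning
bin/kafka-console-consumer.sh --topic javaguides --from-beginning --bootstrap-server localhost:9092
```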
