
Deep Dive into Kafka Consumers

Every messaging system has two types of data flow: one pushes data into the queues and the other reads data from them. In the previous chapter, our focus was on the first flow, publishing data to Kafka topics using the producer APIs. After reading it, you should have sufficient knowledge to publish data to Kafka from your application. In this chapter, our focus is on the second type of data flow: reading data from Kafka topics.

Before we start with a deep dive into Kafka consumers, you should be aware that reading data from Kafka involves several concepts that differ from reading data from traditional queuing systems.

With Kafka, every consumer has a unique identity and is in full control of how it reads data from each Kafka topic partition. Every consumer maintains its own offset per partition (stored in Kafka's internal __consumer_offsets topic in recent versions; older versions stored it in ZooKeeper) and advances it to the next position as it reads data from a Kafka topic.
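To make the offset idea concrete, here is a minimal, hypothetical sketch in plain Java (no Kafka dependency) of the bookkeeping a consumer conceptually performs: for each partition it remembers the next offset to read, and advances it after consuming a record. The class and method names are illustrative, not part of the Kafka API; a real consumer does this through the consumer API and commits offsets back to Kafka.

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative sketch only: per-partition offset tracking as a consumer
// conceptually performs it. Not the real Kafka consumer API.
public class OffsetTracker {
    // Maps a partition number to the next offset the consumer should read.
    private final Map<Integer, Long> nextOffsets = new HashMap<>();

    // The offset to read next for a partition; a fresh consumer with no
    // committed position conceptually starts at offset 0 here.
    public long nextOffset(int partition) {
        return nextOffsets.getOrDefault(partition, 0L);
    }

    // After consuming the record at `offset`, advance the position so the
    // next read resumes at offset + 1.
    public void recordConsumed(int partition, long offset) {
        nextOffsets.put(partition, offset + 1);
    }

    public static void main(String[] args) {
        OffsetTracker tracker = new OffsetTracker();
        tracker.recordConsumed(0, 0);
        tracker.recordConsumed(0, 1);
        tracker.recordConsumed(1, 41);
        System.out.println(tracker.nextOffset(0)); // 2
        System.out.println(tracker.nextOffset(1)); // 42
    }
}
```

Because each consumer owns its offsets, two independent consumers can read the same partition at entirely different positions, which is a key difference from traditional queues where a consumed message is gone for everyone.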

In this chapter, we will cover different concepts of Kafka consumers. Overall, this chapter covers how to consume messages from Kafka systems along with Kafka consumer APIs and their usage. It will walk you through some examples of using Kafka consumer APIs with Java and Scala programming languages and take a deep dive with you into consumer message flows along with some of the common patterns of consuming messages from Kafka topics.

We will cover the following topics in this chapter:

  • Kafka consumer internals
  • Kafka consumer APIs
  • Java Kafka consumer example
  • Scala Kafka consumer example
  • Common message consuming patterns
  • Best practices