
Deep Dive into Kafka Consumers

Every messaging system has two types of data flows: one that pushes data into the system and one that reads data out of it. The previous chapter focused on the first type -- pushing data to Kafka using the producer APIs -- and after reading it you should be able to publish data to Kafka topics from your applications. In this chapter, our focus is on the second type of data flow: reading data from Kafka.

Before we take a deep dive into Kafka consumers, be aware that reading data from Kafka involves several concepts that differ from reading data from traditional queuing systems.

With Kafka, every consumer has a unique identity and is in full control of how it reads data from each Kafka topic partition. Every consumer maintains its own offset per partition -- stored in ZooKeeper by the old consumer API, or in Kafka's internal __consumer_offsets topic by the new consumer API -- and advances it to the next position as it reads data from a topic.
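To make the offset idea concrete, here is a minimal sketch of a consumer using the new Java consumer API. It assumes a recent kafka-clients dependency (2.x or later, for poll(Duration)), a broker reachable at localhost:9092, and a hypothetical topic demo-topic and group demo-consumer-group; each poll returns records starting from the consumer's current offset, and commitSync() stores the next position to read from.

```java
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class OffsetAwareConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");   // assumed broker address
        props.put("group.id", "demo-consumer-group");        // hypothetical consumer group
        props.put("enable.auto.commit", "false");            // we commit offsets explicitly
        props.put("key.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("demo-topic")); // hypothetical topic
            while (true) {
                // Fetch records starting at this consumer's current offset for each partition
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(100));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.printf("partition=%d offset=%d value=%s%n",
                            record.partition(), record.offset(), record.value());
                }
                // Commit the position after the records just read, so a restart resumes here
                consumer.commitSync();
            }
        }
    }
}
```

This is only a sketch of the mechanism; the chapter's Java and Scala examples later cover the consumer APIs and configuration options in detail.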

In this chapter, we will cover the different concepts behind Kafka consumers: how to consume messages from Kafka, the consumer APIs and their usage, and examples of using those APIs from Java and Scala. We will also take a deep dive into consumer message flows and some common patterns for consuming messages from Kafka topics.

We will cover the following topics in this chapter:

  • Kafka consumer internals
  • Kafka consumer APIs
  • Java Kafka consumer example
  • Scala Kafka consumer example
  • Common message consuming patterns
  • Best practices