
NRT – high-level system view

The previous section of this chapter introduced the basic building blocks of an NRT application and gave a logical overview of them. The next step is to understand the functional and systems view of the NRT architectural framework. The following figure outlines the various architectural blocks and cross-cutting concerns:

Describing the system as a horizontal flow from left to right, the process starts with data ingestion and transformation in near real-time using low-latency components. The transformed data is passed on to the next logical unit, which performs highly optimized, parallel operations on the data; this unit is the near real-time processing engine. Once the data has been aggregated and correlated and actionable insights have been derived, it is passed on to the presentation layer, which, along with real-time dashboarding and visualization, may include a persistence component that retains the data for long-term deep analytics.
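The left-to-right flow described above can be sketched as a chain of stages. This is a minimal, illustrative sketch only; all function names and the event schema are assumptions, not part of any real NRT framework:

```python
# Sketch of the left-to-right NRT flow: ingest -> transform -> process
# -> present/persist. Generators model each stage handing events to the next.

def ingest(raw_events):
    """Low-latency collection: yield events as they arrive."""
    for event in raw_events:
        yield event

def transform(events):
    """Normalize each raw event into a clean record."""
    for event in events:
        yield {"user": event["user"], "amount": float(event["amount"])}

def process(events):
    """Processing engine: aggregate a running total per user."""
    totals = {}
    for event in events:
        totals[event["user"]] = totals.get(event["user"], 0.0) + event["amount"]
        yield event["user"], totals[event["user"]]

def present(results):
    """Presentation layer: keep the latest total per user (the 'dashboard')."""
    latest = {}
    for user, total in results:
        latest[user] = total
    return latest

raw = [{"user": "a", "amount": "5"},
       {"user": "b", "amount": "3"},
       {"user": "a", "amount": "2"}]
dashboard = present(process(transform(ingest(raw))))
print(dashboard)  # {'a': 7.0, 'b': 3.0}
```

In a real deployment each stage would be a separate distributed component (for example, a collector, a message bus, and a processing engine), but the staged hand-off shown here is the same.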

The cross-cutting concerns that span all the components of the NRT framework, as depicted in the previous figure, are:

  • Security
  • System management
  • Data integrity and management

Next, we will introduce four basic streaming patterns, so that you are familiar with the common flavors of streaming use cases and their optimal solutions (covered in later sections):

  • Stream ingestion: Here, all we are expected to do is persist the events to stable storage, such as HDFS, HBase, Solr, and so on. So all we need are low-latency stream collection, transformation, and persistence components.
  • Near real-time (NRT) processing: This application design allows for external context and addresses complex use cases such as anomaly or fraud detection. It requires filtering, alerting, de-duplication, and transformation of events based on sophisticated business logic. All of these operations must be performed at extremely low latency.
  • NRT event partitioned processing: This is very close to NRT processing, but with a variation: it derives benefits from partitioning the data, for instance by keeping the external information most relevant to each partition in memory. This pattern also operates at extremely low latencies.
  • NRT and complex models/machine learning: This pattern mostly requires us to execute very complex models or operations over a sliding window of time across the set of events in the stream. These are highly complex operations that require micro-batching of the data, and they too operate at very low latencies.
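To make the last pattern concrete, here is a minimal sketch of a sliding time window over a stream. The window size, the `(timestamp, value)` event shape, and the mean as the "model" are all assumptions chosen for illustration; a real deployment would plug in a far more complex model:

```python
# Sliding-window processing over a stream: keep only the events from the
# last WINDOW_SECONDS and emit a statistic (here, the mean) per event.
from collections import deque

WINDOW_SECONDS = 10  # assumed window size

def sliding_window_means(events):
    """events: iterable of (timestamp, value) pairs, in timestamp order.
    Yields (timestamp, mean of values inside the current window)."""
    window = deque()
    total = 0.0
    for ts, value in events:
        window.append((ts, value))
        total += value
        # Evict events that have fallen out of the window.
        while window and window[0][0] <= ts - WINDOW_SECONDS:
            _, old_value = window.popleft()
            total -= old_value
        yield ts, total / len(window)

stream = [(1, 10.0), (3, 20.0), (12, 30.0), (14, 40.0)]
print(list(sliding_window_means(stream)))
# [(1, 10.0), (3, 15.0), (12, 25.0), (14, 35.0)]
```

Note how the events at timestamps 1 and 3 are evicted once the window has slid past them; this incremental evict-and-update step is what keeps per-event latency low even as the stream grows.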