
NRT – high-level system view

The previous section of this chapter was dedicated to the basic building blocks of an NRT application and its logical overview. The next step is to understand the functional and systems view of the NRT architectural framework. The following figure outlines the various architectural blocks and cross-cutting concerns:

Describing the system as a horizontal flow from left to right, the process starts with data ingestion and transformation in near real-time using low-latency components. The transformed data is passed on to the next logical unit, the near real-time processing engine, which performs highly optimized, parallel operations on the data. Once the data has been aggregated and correlated and actionable insights have been derived, it is passed on to the presentation layer, which, along with real-time dashboarding and visualization, may include a persistence component that retains the data for long-term deep analytics.
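The left-to-right flow can be sketched as a chain of stages. This is a minimal illustrative sketch, not any specific framework's API: the stage names (`ingest`, `transform`, `process`, `present`) and the per-user running total are hypothetical placeholders for the collection, transformation, processing, and presentation/persistence components described above.

```python
# Hypothetical sketch of the left-to-right NRT flow: ingest -> transform
# -> process -> present/persist. Generators stand in for low-latency,
# event-at-a-time components.

def ingest(raw_events):
    """Low-latency collection: yield events as they arrive."""
    for event in raw_events:
        yield event

def transform(events):
    """Normalize each raw event before processing."""
    for event in events:
        yield {"user": event["user"], "amount": float(event["amount"])}

def process(events):
    """Aggregate/correlate: emit a running total per user (a toy insight)."""
    totals = {}
    for event in events:
        totals[event["user"]] = totals.get(event["user"], 0.0) + event["amount"]
        yield event["user"], totals[event["user"]]

def present(insights, store):
    """Push each insight to the presentation layer; `store` stands in
    for a persistence component (HDFS/HBase) used for deep analytics."""
    for user, total in insights:
        store.append((user, total))

raw = [{"user": "a", "amount": "10"},
       {"user": "a", "amount": "5"},
       {"user": "b", "amount": "7"}]
store = []
present(process(transform(ingest(raw))), store)
print(store)  # -> [('a', 10.0), ('a', 15.0), ('b', 7.0)]
```

Because each stage is a generator, an event moves through the whole chain before the next one is pulled, mimicking the per-event, low-latency hand-off between components.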

The cross-cutting concerns that span all the components of the NRT framework, as depicted in the previous figure, are:

  • Security
  • System management
  • Data integrity and management

Next, we are going to acquaint you with four basic streaming patterns, so you are familiar with the common flavors of streaming use cases and their optimal solutions (covered in later sections):

  • Stream ingestion: Here, all we are expected to do is persist the events to stable storage, such as HDFS, HBase, Solr, and so on. So all we need are low-latency stream collection, transformation, and persistence components.
  • Near real-time (NRT) processing: This application design allows for an external context and addresses complex use cases such as anomaly or fraud detection. It requires filtering, alerting, de-duplication, and transformation of events based on sophisticated business logic, all performed at extremely low latency.
  • NRT event partitioned processing: This is very close to NRT processing, but with a variation: it derives benefits from partitioning the data, for instance by keeping the external information most relevant to each partition in memory. This pattern also operates at extremely low latency.
  • NRT and complex models/machine learning: This pattern mostly requires us to execute very complex models or operations over a sliding window of time on the set of events in the stream. These are highly complex operations that require micro-batching of the data and operate at very low latency.
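The last pattern, evaluating a model over a sliding window of events, can be illustrated with a toy sketch. The window size and the "model" here (a simple running mean over the window) are placeholder choices for illustration only; a real deployment would apply a trained model inside a streaming engine's windowing primitives.

```python
from collections import deque

# Toy sliding-window micro-batch: score each new event against the
# last `window_size` events. The mean is a stand-in for a real model.

def sliding_window_scores(stream, window_size=3):
    """Yield one score per event, computed over the current window."""
    window = deque(maxlen=window_size)  # oldest event drops off automatically
    for value in stream:
        window.append(value)
        yield sum(window) / len(window)

scores = list(sliding_window_scores([4, 8, 6, 10, 2]))
print(scores)  # -> [4.0, 6.0, 6.0, 8.0, 6.0]
```

The `deque(maxlen=...)` gives the sliding behavior for free: appending beyond the window size evicts the oldest event, so each score reflects only the most recent micro-batch.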