Stream Analytics with Microsoft Azure
Anindita Basak, Krishna Venkataraman, Ryan Murphy, Manpreet Singh
Introduction to Azure Stream Analytics
Microsoft Azure Stream Analytics falls into the category of PaaS services, where customers don't need to manage the underlying infrastructure. However, they are still responsible for the application they build on top of the PaaS service and, more importantly, for the customer data.
Azure Stream Analytics is a fully managed, serverless PaaS service built for real-time analytics computations on streaming data. The service can consume data from a multitude of sources, while Azure takes care of hosting, scaling, and managing the underlying hardware and software ecosystem. The following are some examples of different use cases for Azure Stream Analytics.
When we design a solution that involves streaming data, in almost every case Azure Stream Analytics will be part of a larger solution that the customer is trying to deploy. This can be real-time dashboarding for monitoring purposes, real-time monitoring of IT infrastructure equipment, preventive maintenance (auto manufacturing, vending machines, and so on), or fraud detection. This means that the streaming solution needs to provide out-of-the-box integration with a whole plethora of services, so that a complete solution can be built relatively quickly.
Let's review a usage pattern for Azure Stream Analytics using a canonical model:

Azure Stream Analytics using a canonical model
In the preceding illustration, the devices and applications that generate data are shown on the left; they can connect directly, or through cloud gateways, to your stream ingest sources. Azure Stream Analytics can pick up the data from these ingest sources, augment it with reference data, run the necessary analytics, and gather insights, then push them downstream for action. You can trigger business processes, write the data to a database, or view anomalies directly on a dashboard.
In the preceding canonical pattern, a number of streaming ingest technologies are used; let's review them in the following list:
- Event Hub: A global-scale event ingestion system into which millions of sensors and applications can publish events. As soon as an event arrives, a subscriber can pick it up within a few milliseconds, and you can have one or more subscribers depending on your business requirements. Typical use cases for Event Hubs are real-time financial fraud detection and social media sentiment analytics (a minimal publishing sketch in Python follows this list).
- IoT Hub: IoT Hub is very similar to Event Hub but takes the concept further in that communication is bidirectional: it not only ingests data from sensors in real time, but can also send commands back to them. It also enables device management and addresses fundamental needs of IoT solutions, such as security.
- Azure Blob: Azure Blob storage is massively scalable object storage for unstructured data, accessible through HTTP or HTTPS. Blob storage can expose data publicly to the world or store application data privately.
- Reference Data: This is auxiliary data that is either static or that changes slowly. Reference data can be used to enrich incoming data to perform correlation and lookups.
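To make the ingest side concrete, here is a minimal sketch of publishing sensor readings to an Event Hub using the azure-eventhub Python package (v5 API). The connection string, hub name, and payload fields are placeholders assumed for illustration, not values from the text.

```python
# Minimal sketch: publish telemetry into an Event Hub with azure-eventhub (v5).
# Connection string, hub name, and payload shape are illustrative placeholders.
import json
from azure.eventhub import EventHubProducerClient, EventData

CONNECTION_STR = "<event-hubs-namespace-connection-string>"  # assumption: your namespace's connection string
EVENTHUB_NAME = "telemetry"                                  # assumption: an existing hub

producer = EventHubProducerClient.from_connection_string(
    conn_str=CONNECTION_STR, eventhub_name=EVENTHUB_NAME
)

with producer:
    # Batch events so a single send carries many sensor readings.
    batch = producer.create_batch()
    for reading in [{"deviceId": "vend-042", "temperature": 7.4},
                    {"deviceId": "vend-043", "temperature": 9.1}]:
        batch.add(EventData(json.dumps(reading)))
    producer.send_batch(batch)
```

Downstream, a Stream Analytics job configured with this hub as an input would pick up these events within seconds.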

Using reference data in a streaming solution
On the ingress side, with a few clicks you can connect to Event Hub, IoT Hub, or Blob storage. The streaming data can be enriched with reference data held in the Blob store.
Data from the ingress process is consumed by the Azure Stream Analytics service, which can call machine learning (ML) models to score events in real time. The results can be egressed to live dashboards in Power BI, or pushed back to Event Hub, from where dashboards and reports can pick them up.
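To illustrate what the reference-data enrichment step conceptually does, here is a small plain-Python sketch; in Azure Stream Analytics itself this would be expressed as a join against a reference data input, and the field names used here (deviceId, region, model) are illustrative assumptions.

```python
# Conceptual sketch only: enrich streaming events with slowly-changing
# reference data, the way a Stream Analytics reference-data join would.
# Field names (deviceId, region, model) are illustrative assumptions.
reference_data = {  # e.g. device metadata loaded from a blob
    "vend-042": {"region": "West US", "model": "V2"},
    "vend-043": {"region": "North Europe", "model": "V3"},
}

def enrich(event: dict) -> dict:
    """Attach reference attributes to an incoming event by key lookup."""
    lookup = reference_data.get(event["deviceId"], {})
    return {**event, **lookup}

incoming = {"deviceId": "vend-042", "temperature": 7.4}
print(enrich(incoming))
# {'deviceId': 'vend-042', 'temperature': 7.4, 'region': 'West US', 'model': 'V2'}
```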
The following is a summary of the ingress, egress, and archiving options:
- Ingress choices:
  - Event Hub
  - IoT Hub
  - Blob storage
- Egress choices:
  - Live Dashboards:
    - Power BI
    - Event Hub
  - Driving workflows:
    - Event Hubs
    - Service Bus
  - Archiving and post analysis:
    - Blob storage
    - Document DB
    - Data Lake
    - SQL Server
    - Table storage
    - Azure Functions
One key point to note is that a number of customers push data from Stream Analytics processing (the egress point) to Event Hub and then use Azure-hosted websites as their own custom dashboards. You can also drive workflows by pushing the events to Azure Service Bus, and visualize them with Power BI.
For example, a customer can build an IoT support solution that detects anomalies in connected appliances and pushes the results into Azure Service Bus. A worker role running as a daemon can pull the messages and create support tickets using the Dynamics CRM API, and the tickets can then be archived for post analysis with Power BI. This eliminates the need for the customer to log a ticket; the system does it automatically based on predefined anomaly thresholds. This is just one example of a real-time connected solution (a minimal worker sketch follows).
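A minimal sketch of such a worker, assuming the azure-servicebus Python package (v7 API); the queue name is hypothetical and create_support_ticket() is a placeholder where the actual Dynamics CRM API call would go.

```python
# Minimal sketch: a worker that drains anomaly messages from a Service Bus
# queue using azure-servicebus (v7). Queue name and create_support_ticket()
# are hypothetical placeholders for the CRM integration described in the text.
import json
from azure.servicebus import ServiceBusClient

CONNECTION_STR = "<service-bus-connection-string>"  # assumption
QUEUE_NAME = "appliance-anomalies"                  # assumption

def create_support_ticket(anomaly: dict) -> None:
    # Placeholder for the Dynamics CRM API call; here we only log the action.
    print(f"ticket created for device {anomaly['deviceId']}")

with ServiceBusClient.from_connection_string(CONNECTION_STR) as client:
    with client.get_queue_receiver(queue_name=QUEUE_NAME) as receiver:
        for message in receiver.receive_messages(max_message_count=10, max_wait_time=5):
            anomaly = json.loads(str(message))      # message body is a JSON anomaly record
            create_support_ticket(anomaly)
            receiver.complete_message(message)      # remove the message once handled
```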
There are a number of use cases that don't even involve real-time alerts. You can also use Stream Analytics to aggregate and filter data and store it in Blob storage, Azure Data Lake (ADL), Document DB, or SQL, and then run U-SQL with Azure Data Lake Analytics (ADLA) or HDInsight over it, or even call ML models for scenarios like predictive maintenance.
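As a conceptual illustration of that aggregation step, the following plain-Python sketch groups events into tumbling windows, similar in spirit to a Stream Analytics GROUP BY TumblingWindow query; the 60-second window, timestamps, and field names are illustrative assumptions.

```python
# Conceptual sketch of tumbling-window aggregation, the kind of summarization
# a Stream Analytics query performs before results are archived downstream.
# Window size, timestamps, and field names are illustrative assumptions.
from collections import defaultdict
from statistics import mean

WINDOW_SECONDS = 60

events = [
    {"deviceId": "vend-042", "ts": 12, "temperature": 7.4},
    {"deviceId": "vend-042", "ts": 45, "temperature": 8.0},
    {"deviceId": "vend-042", "ts": 75, "temperature": 9.6},
]

windows = defaultdict(list)
for e in events:
    window_start = (e["ts"] // WINDOW_SECONDS) * WINDOW_SECONDS  # bucket by window
    windows[(e["deviceId"], window_start)].append(e["temperature"])

for (device, start), temps in sorted(windows.items()):
    # One aggregated row per device per window, ready to land in Blob/SQL/Document DB.
    print(device, start, round(mean(temps), 2))
```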