
Commit 69006d0

committed
Acrolynx
1 parent 8fb1a28 commit 69006d0

File tree

1 file changed: +6 −6 lines changed


articles/stream-analytics/stream-analytics-solution-patterns.md

Lines changed: 6 additions & 6 deletions
@@ -27,7 +27,7 @@ This solution pattern offers the lowest latency from the event source to the Pow

The Power BI dashboard offers low latency, but you can't use it to produce full-fledged Power BI reports. A common reporting pattern is to output your data to SQL Database first. Then use Power BI's SQL connector to query SQL for the latest data.

-:::image type="content" source="media/stream-analytics-solution-patterns/sql-dashboard.png" alt-text="Diagram that shows SQL Database as an intermidate store between Stream Analytics and Power BI dashboard.":::
+:::image type="content" source="media/stream-analytics-solution-patterns/sql-dashboard.png" alt-text="Diagram that shows SQL Database as an intermediate store between Stream Analytics and Power BI dashboard.":::

When you use SQL Database, you get more flexibility at the expense of slightly higher latency. This solution is optimal for jobs with latency requirements greater than one second. With this method, you can take full advantage of Power BI capabilities to further slice and dice the data for reports, and you have many more visualization options. You also gain the flexibility of using other dashboard solutions, such as Tableau.


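The SQL-as-intermediate-store pattern above can be sketched in a few lines. This is a minimal, hypothetical illustration that uses Python's built-in sqlite3 in place of Azure SQL Database; the table, column names, and values are made up, and in the real pattern the INSERTs come from a Stream Analytics SQL output and the SELECT from Power BI's SQL connector:

```python
import sqlite3

# sqlite3 stands in for Azure SQL Database as the intermediate store.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE telemetry (device_id TEXT, temperature REAL, event_time TEXT)"
)

# Events landing from the streaming job (illustrative values).
events = [
    ("dev-1", 21.5, "2024-01-01T00:00:01"),
    ("dev-2", 22.0, "2024-01-01T00:00:02"),
    ("dev-1", 23.1, "2024-01-01T00:00:03"),
]
conn.executemany("INSERT INTO telemetry VALUES (?, ?, ?)", events)

# The reporting side simply queries for the latest rows on each refresh.
latest = conn.execute(
    "SELECT device_id, temperature FROM telemetry "
    "ORDER BY event_time DESC LIMIT 2"
).fetchall()
print(latest)  # most recent readings first
```

Because the report queries a regular table, any SQL-capable dashboard tool (Power BI, Tableau) can sit on top of the same store.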
@@ -45,7 +45,7 @@ Azure Event Hubs service, on the other hand, offers the most flexible integratio
4545

4646
## Dynamic applications and websites
4747

48-
You can create custom real-time visualizations, such as dashboard or map visualization, using Azure Stream Analytics and Azure SignalR Service. Using SignalR, web clients can be updated and show dynamic content in real-time.
48+
You can create custom real-time visualizations, such as dashboard or map visualization, using Azure Stream Analytics and Azure SignalR Service. When you use SignalR, web clients can be updated and show dynamic content in real-time.
4949

5050
:::image type="content" source="media/stream-analytics-solution-patterns/dynamic-app.png" alt-text="Diagram that shows a Web app using SignalR service as a destination.":::
5151

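The push model that SignalR provides can be sketched as a simple broadcaster: each streaming result is pushed to every connected client instead of clients polling for it. The class and method names below are illustrative only, not the SignalR API:

```python
# Minimal sketch of the push pattern: a hub fans each streaming
# result out to all connected web clients.
class Hub:
    def __init__(self):
        self.clients = []

    def connect(self, client):
        # A "client" here is just a list that collects pushed messages.
        self.clients.append(client)

    def broadcast(self, message):
        # Every connected client receives the update immediately.
        for client in self.clients:
            client.append(message)

hub = Hub()
dashboard, map_view = [], []
hub.connect(dashboard)
hub.connect(map_view)

# One Stream Analytics output event updates all visualizations at once.
hub.broadcast({"device": "dev-1", "temperature": 23.1})
print(dashboard, map_view)
```

In the real pattern, SignalR Service handles the fan-out and connection management, so the web app only forwards each job output to the hub.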
@@ -55,13 +55,13 @@ Most web services and web applications today use a request/response pattern to s

High data volume often creates performance bottlenecks in a CRUD-based system. The [event sourcing solution pattern](/azure/architecture/patterns/event-sourcing) is used to address the performance bottlenecks. Temporal patterns and insights are also difficult and inefficient to extract from a traditional data store. Modern high-volume data-driven applications often adopt a dataflow-based architecture. Azure Stream Analytics, as the compute engine for data in motion, is a linchpin in that architecture.

-:::image type="content" source="media/stream-analytics-solution-patterns/event-sourcing-app.png" alt-text="Diagram that shows a real-time application as a destination for a Stream Analytis job.":::
+:::image type="content" source="media/stream-analytics-solution-patterns/event-sourcing-app.png" alt-text="Diagram that shows a real-time application as a destination for a Stream Analytics job.":::

In this solution pattern, events are processed and aggregated into data stores by Azure Stream Analytics. The application layer interacts with data stores using the traditional request/response pattern. Because Stream Analytics can process a large number of events in real-time, the application is highly scalable without the need to bulk up the data store layer. The data store layer is essentially a materialized view in the system. [Azure Stream Analytics output to Azure Cosmos DB](stream-analytics-documentdb-output.md) describes how Azure Cosmos DB is used as a Stream Analytics output.

In real applications where processing logic is complex and there's a need to upgrade certain parts of the logic independently, multiple Stream Analytics jobs can be composed together with Event Hubs as the intermediary event broker.

-:::image type="content" source="media/stream-analytics-solution-patterns/event-sourcing-app-complex.png" alt-text="Diagram that shows Event Hubs as an intermediatory and a real-time application as a destination for a Stream Analytis job.":::
+:::image type="content" source="media/stream-analytics-solution-patterns/event-sourcing-app-complex.png" alt-text="Diagram that shows Event Hubs as an intermediary and a real-time application as a destination for a Stream Analytics job.":::

This pattern improves the resiliency and manageability of the system. However, even though Stream Analytics guarantees exactly-once processing, there's a small chance that duplicate events land in the intermediary Event Hubs. It's important for the downstream Stream Analytics job to dedupe events using logic keys in a lookback window. For more information on event delivery, see the [Event Delivery Guarantees](/stream-analytics-query/event-delivery-guarantees-azure-stream-analytics) reference.

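The dedup step the paragraph above recommends can be sketched as follows. This is a simplified, in-memory stand-in for what the downstream job would do, assuming events arrive roughly time-ordered; the field names (`key`, `ts`) and the 10-second window are made up for illustration:

```python
def dedupe(events, lookback_seconds=10):
    """Drop events whose logic key was already seen within the lookback
    window; a key reappearing outside the window counts as a new event."""
    seen = {}  # logic key -> timestamp it was last emitted
    out = []
    for event in sorted(events, key=lambda e: e["ts"]):
        key, ts = event["key"], event["ts"]
        last = seen.get(key)
        if last is None or ts - last > lookback_seconds:
            out.append(event)
            seen[key] = ts
        # else: duplicate delivery within the lookback window, drop it
    return out

events = [
    {"key": "order-1", "ts": 0},
    {"key": "order-1", "ts": 4},   # duplicate within 10 s: dropped
    {"key": "order-2", "ts": 5},
    {"key": "order-1", "ts": 20},  # outside the window: a new event
]
unique = dedupe(events)
print([(e["key"], e["ts"]) for e in unique])
```

A production job would also expire old keys to bound memory; in Stream Analytics itself this is typically expressed in the query language with a windowed aggregate partitioned by the logic key.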
@@ -85,13 +85,13 @@ For advanced users who want to incorporate online training and scoring into the

Another common pattern is real-time data warehousing, also called a streaming data warehouse. In addition to events arriving at Event Hubs and IoT Hub from your application, [Azure Stream Analytics running on IoT Edge](stream-analytics-edge.md) can be used to fulfill data cleansing, data reduction, and data store and forward needs. Stream Analytics running on IoT Edge can gracefully handle bandwidth limitation and connectivity issues in the system. Stream Analytics can support throughput rates of up to 200 MB/sec while writing to Azure Synapse Analytics.

-:::image type="content" source="media/stream-analytics-solution-patterns/data-warehousing.png" alt-text="Diagram that shows real-time data wearhouse a destination for a Stream Analytics job.":::
+:::image type="content" source="media/stream-analytics-solution-patterns/data-warehousing.png" alt-text="Diagram that shows real-time data warehouse as a destination for a Stream Analytics job.":::

## Archiving real-time data for analytics

Most data science and analytics activities still happen offline. You can archive data in Azure Stream Analytics through the Azure Data Lake Store Gen2 output and Parquet output formats. This capability removes the friction of feeding data directly into Azure Data Lake Analytics, Azure Databricks, and Azure HDInsight. Azure Stream Analytics is used as a near real-time Extract-Transform-Load (ETL) engine in this solution. You can explore archived data in Data Lake using various compute engines.

-:::image type="content" source="media/stream-analytics-solution-patterns/offline-analytics.png" alt-text="Diagram that shows archiving of real-time data from an Stream Analytics job.":::
+:::image type="content" source="media/stream-analytics-solution-patterns/offline-analytics.png" alt-text="Diagram that shows archiving of real-time data from a Stream Analytics job.":::

## Use reference data for enrichment

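The archiving step described above can be sketched as grouping events into time-partitioned batches, the way Stream Analytics writes time-partitioned files to Data Lake. This is a hypothetical illustration: JSON lines stand in for the Parquet format, and the path layout and field names are made up:

```python
import json
from collections import defaultdict

def partition_events(events):
    """Group events into date-partitioned batches, one archive file
    per day, mimicking a time-partitioned Data Lake layout."""
    batches = defaultdict(list)
    for event in events:
        date = event["ts"][:10]               # e.g. "2024-01-01"
        path = f"archive/{date}/events.json"  # hypothetical path layout
        batches[path].append(json.dumps(event))
    return {path: "\n".join(lines) for path, lines in batches.items()}

events = [
    {"ts": "2024-01-01T23:59:58", "device": "dev-1", "temp": 21.5},
    {"ts": "2024-01-02T00:00:01", "device": "dev-1", "temp": 21.6},
]
files = partition_events(events)
print(sorted(files))
```

Because the archive is partitioned by time, downstream engines (Databricks, HDInsight) can prune partitions instead of scanning the whole archive.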