
Commit ecb45cc

Merge pull request #76324 from mamccrea/userstory1455270
Stream Analytics: streaming decisions
2 parents 33e3ade + 98bf4a5

File tree

2 files changed: +59, -0 lines changed


articles/stream-analytics/TOC.yml

Lines changed: 2 additions & 0 deletions
@@ -34,6 +34,8 @@
     href: https://azure.microsoft.com/resources/samples/?service=stream-analytics
 - name: Concepts
   items:
+  - name: Choose a streaming analytics technology
+    href: streaming-technologies.md
 - name: Input types for a job
   items:
   - name: Inputs overview
articles/stream-analytics/streaming-technologies.md

Lines changed: 57 additions & 0 deletions
@@ -0,0 +1,57 @@
---
title: Choose a real-time analytics and streaming processing technology on Azure
description: Learn how to choose the right real-time analytics and streaming processing technology to build your application on Azure.
author: zhongc
ms.author: zhongc
ms.reviewer: mamccrea
ms.service: stream-analytics
ms.topic: conceptual
ms.date: 05/15/2019
---

# Choose a real-time analytics and streaming processing technology on Azure

There are several services available for real-time analytics and streaming processing on Azure. This article provides the information you need to decide which technology is the best fit for your application.

## When to use Azure Stream Analytics

Azure Stream Analytics is the recommended service for stream analytics on Azure. It's meant for a wide range of scenarios that include but aren't limited to:

* Dashboards for data visualization
* Real-time [alerts](stream-analytics-set-up-alerts.md) from temporal and spatial patterns or anomalies
* Extract, Transform, Load (ETL)
* [Event Sourcing pattern](/azure/architecture/patterns/event-sourcing)
* [IoT Edge](stream-analytics-edge.md)

Adding an Azure Stream Analytics job to your application is the fastest way to get streaming analytics up and running in Azure, using the SQL language you already know. Azure Stream Analytics is a job service, so you don't have to spend time managing clusters, and you don't have to worry about downtime, thanks to a 99.9% SLA at the job level. Billing is also done at the job level, making startup costs low (one Streaming Unit) but scalable (up to 192 Streaming Units). It's much more cost-effective to run a few Stream Analytics jobs than to run and maintain a cluster.
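
As an illustration of that SQL-based model, here's a minimal sketch of a pass-through job query. The input alias `rawevents` and output alias `archiveoutput` are hypothetical names you would configure on the job, not part of any default setup:

```sql
-- Minimal sketch (hypothetical aliases): route every event from the
-- input 'rawevents' to the output 'archiveoutput' unchanged.
SELECT
    *
INTO
    archiveoutput
FROM
    rawevents
```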

Azure Stream Analytics has a rich out-of-the-box experience. You can immediately take advantage of the following features without any additional setup:

* Built-in temporal operators, such as [windowed aggregates](stream-analytics-window-functions.md), temporal joins, and temporal analytic functions
* Native Azure [input](stream-analytics-add-inputs.md) and [output](stream-analytics-define-outputs.md) adapters
* Support for slow-changing [reference data](stream-analytics-use-reference-data.md) (also known as lookup tables), including joins with geospatial reference data for geofencing
* Integrated solutions, such as [Anomaly Detection](stream-analytics-machine-learning-anomaly-detection.md)
* Multiple time windows in the same query
* Ability to compose multiple temporal operators in arbitrary sequences
* End-to-end latency under 100 ms from input arriving at Event Hubs to output landing in Event Hubs, including the network delay to and from Event Hubs, at sustained high throughput
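
To give a flavor of the built-in temporal operators, here's a hedged sketch of a windowed aggregate; the input `rawevents` and the fields `deviceId` and `eventTime` are hypothetical names, while `TumblingWindow`, `TIMESTAMP BY`, and `System.Timestamp()` are part of the query language:

```sql
-- Sketch (hypothetical input and field names): count events per device
-- over 10-second tumbling windows, using each event's own timestamp.
SELECT
    deviceId,
    COUNT(*) AS eventCount,
    System.Timestamp() AS windowEnd
FROM
    rawevents TIMESTAMP BY eventTime
GROUP BY
    deviceId,
    TumblingWindow(second, 10)
```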

## When to use other technologies

### You need to input from or output to Kafka
Azure Stream Analytics doesn't have an Apache Kafka input or output adapter. If your events land in Kafka, or you need to send events to Kafka, and you don't have a requirement to run your own Kafka cluster, you can continue to use Stream Analytics by sending events to Event Hubs through the Event Hubs Kafka API, without changing the event sender. If you do need to run your own Kafka cluster, you can use Spark Structured Streaming, which is fully supported on [Azure Databricks](../azure-databricks/index.yml), or Storm on [Azure HDInsight](../hdinsight/storm/apache-storm-tutorial-get-started-linux.md).
44+
### You want to write UDFs, UDAs, and custom deserializers in a language other than JavaScript or C#
45+
46+
Azure Stream Analytics supports user-defined functions (UDF) or user-defined aggregates (UDA) in JavaScript for cloud jobs and C# for IoT Edge jobs. C# user-defined deserializers are also supported. If you want to implement a deserializer, a UDF, or a UDA in other languages, such as Java or Python, you can use Spark Structured Streaming. You can also run the Event Hubs **EventProcessorHost** on your own virtual machines to do arbitrary streaming processing.
47+
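
For reference, a JavaScript UDF is invoked from the query with the `udf.` prefix. In this hedged sketch, `parsePayload`, `rawevents`, and `payload` are hypothetical names:

```sql
-- Sketch (hypothetical names): call a JavaScript UDF named
-- 'parsePayload' on each event's 'payload' field.
SELECT
    udf.parsePayload(payload) AS parsed
FROM
    rawevents
```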
48+
### Your solution is in a multi-cloud or on-premises environment
49+
50+
Azure Stream Analytics is Microsoft's proprietary technology and is only available on Azure. If you need your solution to be portable across Clouds or on-premises, consider open-source technologies such as Spark Structured Streaming or Storm.
51+

## Next steps

* [Create a Stream Analytics job by using the Azure portal](stream-analytics-quick-create-portal.md)
* [Create a Stream Analytics job by using Azure PowerShell](stream-analytics-quick-create-powershell.md)
* [Create a Stream Analytics job by using Visual Studio](stream-analytics-quick-create-vs.md)
* [Create a Stream Analytics job by using Visual Studio Code](quick-create-vs-code.md)
