**articles/stream-analytics/quick-create-azure-cli.md** (+30 −85)
ms.reviewer: jasonh
ms.workload: big-data
ms.topic: quickstart
ms.custom: mvc, devx-track-azurecli, mode-api
ms.date: 02/28/2023
---
# Quickstart: Create an Azure Stream Analytics job using the Azure CLI
In this quickstart, you use the Azure CLI to define a Stream Analytics job that filters real-time sensor messages with a temperature reading greater than 27. The Stream Analytics job reads data from IoT Hub, transforms the data, and writes it back to a container in blob storage. The input data used in this quickstart is generated by a Raspberry Pi online simulator.
## Before you begin
The following Azure CLI commands prepare the input data required by the job. Review the sections to understand the code.
1. Create an IoT Hub using the [az iot hub create](/cli/azure/iot/hub#az-iot-hub-create) command. This example creates an IoT Hub called **MyASAIoTHub**. Because IoT Hub names are unique, you need to come up with your own IoT Hub name. Set the SKU to F1 to use the free tier if it's available with your subscription. If not, choose the next lowest tier.
    ```azurecli
    az iot hub create --name "MyASAIoTHub" --resource-group streamanalyticsrg --sku S1
    ```
    Once the IoT hub has been created, get the IoT Hub connection string using the [az iot hub connection-string show](/cli/azure/iot/hub) command. Copy the entire connection string and save it for when you add the IoT Hub as input to your Stream Analytics job.
    ```azurecli
    az iot hub connection-string show --hub-name "MyASAIoTHub"
    ```
2. Add a device to IoT Hub using the [az iot hub device-identity create](/cli/azure/iot/hub/device-identity#az-iot-hub-device-identity-create) command. This example creates a device called **MyASAIoTDevice**.
The following Azure CLI code blocks create a blob storage account that's used for job output. Review the sections to understand the code.
1. Define a variable to hold the name of the storage account.

    ```azurecli
    storageAccountName="asatutorialstorage$RANDOM"
    ```
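The `$RANDOM` suffix makes the storage account name very likely to be globally unique. As a quick local sanity check (a sketch, assuming a Bash shell; the rule checked is Azure's documented requirement of 3 to 24 lowercase letters and digits), you can validate the generated name before calling `az storage account create`:

```shell
# Sketch, assuming Bash: generate a candidate name with $RANDOM and check it
# against Azure storage account naming rules (3-24 chars, lowercase a-z, 0-9).
storageAccountName="asatutorialstorage$RANDOM"
if [[ "$storageAccountName" =~ ^[a-z0-9]{3,24}$ ]]; then
  echo "valid: $storageAccountName"
else
  echo "invalid: $storageAccountName"
fi
```

Because `asatutorialstorage` is 18 characters and `$RANDOM` adds at most 5 digits, the generated name always fits within the 24-character limit.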
1. Create a general-purpose storage account with the [az storage account create](/cli/azure/storage/account) command. The general-purpose storage account can be used for all four services: blobs, files, tables, and queues.
    ```azurecli
    az storage account create \
        --name $storageAccountName \
        --resource-group streamanalyticsrg \
        --location eastus \
        --sku Standard_ZRS \
        --encryption-services blob
    ```
2. Get the key for your storage account by running the [az storage account keys list](/cli/azure/storage/account/keys) command. Save the key in the `key` variable, which is used in the next step to authorize access to the storage account.

    ```azurecli
    key=$(az storage account keys list -g streamanalyticsrg -n $storageAccountName --query "[0].value" -o tsv)
    ```
3. Create a container for storing blobs with the [az storage container create](/cli/azure/storage/container) command. You use the storage account key to authorize the operation to create the container. For more information about authorizing data operations with Azure CLI, see [Authorize access to blob or queue data with Azure CLI](../storage/blobs/authorize-data-operations-cli.md).
    ```azurecli
    az storage container create \
        --account-name $storageAccountName \
        --name sample-container \
        --account-key $key \
        --auth-mode key
    ```
## Create a Stream Analytics job
Create a Stream Analytics job with the [az stream-analytics job create](/cli/azure/stream-analytics/job#az-stream-analytics-job-create) command. Review the next sections to understand the code.
```azurecli
az stream-analytics job create \
    --job-name "streamanalyticsjob" \
    --resource-group "streamanalyticsrg" \
    --location "eastus" \
    --output-error-policy "Drop" \
    --out-of-order-policy "Drop" \
    --order-max-delay 5 \
    --arrival-max-delay 16 \
    --data-locale "en-US"
```
## Configure input to the job
Add an input to your job by using the [az stream-analytics input create](/cli/azure/stream-analytics/input#az-stream-analytics-input-create) command. This command takes the job name, job input name, resource group name, and the job input definition as parameters. The job input definition is a JSON file that contains the properties required to configure the job's input. In this example, you create an IoT Hub as an input.
On your local machine, create a file named `datasource.json` that contains the job input definition.

> [!IMPORTANT]
> Replace `<IOT HUB ACCESS KEY>` with the Shared Access Key value from the IoT Hub connection string you saved. For example, if the IoT Hub connection string is `HostName=MyASAIoTHub.azure-devices.net;SharedAccessKeyName=iothubowner;SharedAccessKey=xxxxxxxxxxxxxx=`, the Shared Access Key value is `xxxxxxxxxxxxxx=`.
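Rather than copying the key out of the connection string by hand, you can extract the `SharedAccessKey` segment with shell parameter expansion. This is a hypothetical helper, not part of the quickstart; the connection string below is the placeholder example, not a real key.

```shell
# Hypothetical helper: extract the SharedAccessKey value from an IoT Hub
# connection string (placeholder values shown, not a real key).
conn="HostName=MyASAIoTHub.azure-devices.net;SharedAccessKeyName=iothubowner;SharedAccessKey=xxxxxxxxxxxxxx="
key="${conn##*SharedAccessKey=}"   # strip everything up to and including "SharedAccessKey="
echo "$key"
```

This works because `SharedAccessKey` is the last segment of the connection string, so stripping the longest matching prefix leaves only the key value.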
## Configure output to the job

Add an output to your job by using the [az stream-analytics output create](/cli/azure/stream-analytics/output#az-stream-analytics-output-create) command. This command takes the job name, job output name, resource group name, and the job output definition as parameters. The job output definition is a JSON file that contains the properties required to configure the job's output. This example uses blob storage as output.

> [!IMPORTANT]
> Replace `<STORAGE ACCOUNT NAME>` with the name of your Azure Storage account and `<ACCESS KEY>` with the access key for your storage account.
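The blob output definition uses a path pattern of `{date}/{time}` with a date format of `yyyy/MM/dd` and a time format of `HH`, so each hour of output lands in its own folder. As a sketch (assuming GNU `date` on Linux), the resolved blob path for one UTC instant can be previewed like this:

```shell
# Sketch, assuming GNU date: show how a {date}/{time} path pattern with
# dateFormat "yyyy/MM/dd" and timeFormat "HH" resolves for one UTC instant.
path=$(date -u -d "2023-02-28 14:05" +%Y/%m/%d/%H)
echo "$path"
```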
**articles/stream-analytics/stream-analytics-add-inputs.md** (+8 −5)
---
title: Understand inputs for Azure Stream Analytics
description: This article describes the concept of inputs in an Azure Stream Analytics job, comparing streaming input to reference data input.
ms.service: stream-analytics
author: enkrumah
ms.author: ebnkruma
ms.topic: conceptual
ms.date: 02/28/2023
---
# Understand inputs for Azure Stream Analytics
> We strongly recommend using [**Stream Analytics tools for Visual Studio Code**](./quick-create-visual-studio-code.md) for the best local development experience. There are known feature gaps in Stream Analytics tools for Visual Studio 2019 (version 2.6.3000.0), and it won't be improved going forward.
## Stream and reference inputs
As data is pushed to a data source, it's consumed by the Stream Analytics job and processed in real time. Inputs are divided into two types:

- Data stream inputs
- Reference data inputs
### Data stream input
A data stream is an unbounded sequence of events over time. Stream Analytics jobs must include at least one data stream input. Event Hubs, IoT Hub, Azure Data Lake Storage Gen2, and Blob storage are supported as data stream input sources. Event Hubs is used to collect event streams from multiple devices and services. These streams might include social media activity feeds, stock trade information, or data from sensors. IoT Hubs are optimized to collect data from connected devices in Internet of Things (IoT) scenarios. Blob storage can be used as an input source for ingesting bulk data as a stream, such as log files.
For more information about streaming data inputs, see [Stream data as input into Stream Analytics](stream-analytics-define-inputs.md).
### Reference data input
Stream Analytics also supports input known as *reference data*. Reference data is either completely static or changes slowly. It's typically used to perform correlation and lookups. For example, you might join data in the data stream input to data in the reference data, much as you would perform a SQL join to look up static values. Azure Blob storage, Azure Data Lake Storage Gen2, and Azure SQL Database are currently supported as input sources for reference data. Reference data source blobs have a limit of up to 300 MB in size, depending on the query complexity and allocated Streaming Units. For more information, see the [Size limitation](stream-analytics-use-reference-data.md#size-limitation) section of the reference data documentation.
For more information about reference data inputs, see [Using reference data for lookups in Stream Analytics](stream-analytics-use-reference-data.md).