# Quickstart: Create an Azure Stream Analytics job using the Azure CLI

In this quickstart, you use the Azure CLI to define a Stream Analytics job that filters real-time sensor messages with a temperature reading greater than 27. The Stream Analytics job reads data from IoT Hub, transforms the data, and writes the output data to a container in blob storage. The input data used in this quickstart is generated by a Raspberry Pi online simulator.

## Before you begin
- Create a resource group. All Azure resources must be deployed into a resource group. Resource groups allow you to organize and manage related Azure resources.

For this quickstart, create a resource group named **streamanalyticsrg** in the **eastus** location with the following [az group create](/cli/azure/group#az-group-create) command:

```azurecli
az group create --name streamanalyticsrg --location eastus
```

## Prepare the input data

Before you define the Stream Analytics job, prepare the data that's used for the job's input. The following Azure CLI commands prepare the **input** data required by the job.

1. Create an IoT Hub using the [az iot hub create](/cli/azure/iot/hub#az-iot-hub-create) command. This example creates an IoT Hub called **MyASAIoTHub**. As IoT Hub names must be globally unique, you may have to change the name if it's already taken. Set the SKU to F1 to use the free tier if it's available with your subscription. If not, choose the next lowest tier.

```azurecli
iotHubName=MyASAIoTHub
az iot hub create --name $iotHubName --resource-group streamanalyticsrg --sku S1
```

Once the IoT hub has been created, get the IoT Hub connection string using the [az iot hub connection-string show](/cli/azure/iot/hub/connection-string#az-iot-hub-connection-string-show) command. Copy the entire connection string and save it. You use it while adding the IoT Hub as an input to your Stream Analytics job.

```azurecli
az iot hub connection-string show --hub-name $iotHubName
```

2. Add a device to IoT Hub using the [az iot hub device-identity create](/cli/azure/iot/hub/device-identity#az-iot-hub-device-identity-create) command. This example creates a device called **MyASAIoTDevice**.

```azurecli
az iot hub device-identity create --hub-name $iotHubName --device-id "MyASAIoTDevice"
```
3. Get the device connection string using the [az iot hub device-identity connection-string show](/cli/azure/iot/hub/device-identity/connection-string#az-iot-hub-device-identity-connection-string-show) command. Copy the entire connection string and save it for when you create the Raspberry Pi simulator.
```azurecli
az iot hub device-identity connection-string show --hub-name $iotHubName --device-id "MyASAIoTDevice" --output table
```

**Output example:**
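
The device connection string in the output resembles the following line. The key shown here is a placeholder; the table output of the command contains your real device key:

```output
HostName=MyASAIoTHub.azure-devices.net;DeviceId=MyASAIoTDevice;SharedAccessKey=<your-device-key>
```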
## Create a blob storage account

The following Azure CLI commands create a blob **storage account** that's used for job **output**.

1. Create a general-purpose storage account with the [az storage account create](/cli/azure/storage/account) command. The general-purpose storage account can be used for all four services: blobs, files, tables, and queues.
```azurecli
storageAccountName="asatutorialstorage$RANDOM"
az storage account create \
    --name $storageAccountName \
    --resource-group streamanalyticsrg \
    --location eastus \
    --sku Standard_ZRS \
    --encryption-services blob
```

2. Get the key for your storage account by running the [az storage account keys list](/cli/azure/storage/account/keys) command, and save it in the `key` variable:

```azurecli
key=$(az storage account keys list -g streamanalyticsrg -n $storageAccountName --query "[0].value" --output tsv)
```

> Note down the access key for the Azure storage account. You will use this key later in this quickstart.

3. Create a container named `state` for storing blobs with the [az storage container create](/cli/azure/storage/container) command. You use the storage account key to authorize the operation to create the container. For more information about authorizing data operations with Azure CLI, see [Authorize access to blob or queue data with Azure CLI](../storage/blobs/authorize-data-operations-cli.md).

```azurecli
az storage container create \
    --account-name $storageAccountName \
    --name state \
    --account-key $key \
    --auth-mode key
```
## Create a Stream Analytics job

Create a Stream Analytics job with the [az stream-analytics job create](/cli/azure/stream-analytics/job#az-stream-analytics-job-create) command.

```azurecli
az stream-analytics job create \
    --job-name "streamanalyticsjob" \
    --resource-group "streamanalyticsrg" \
    --location "eastus" \
    --output-error-policy "Drop" \
    --out-of-order-policy "Drop" \
    --order-max-delay 5 \
    --arrival-max-delay 16 \
    --data-locale "en-US"
```
## Configure input to the job

Add an input to your job by using the [az stream-analytics input create](/cli/azure/stream-analytics/input#az-stream-analytics-input-create) cmdlet. This cmdlet takes the job name, job input name, resource group name, and the input properties in JSON format as parameters. In this example, you'll create an IoT Hub as an input.

> [!IMPORTANT]
> - Replace `IOT HUB ACCESS KEY` with the value of the Shared Access Key in the IoT Hub connection string you saved. For example, if the IoT Hub connection string is `HostName=MyASAIoTHub.azure-devices.net;SharedAccessKeyName=iothubowner;SharedAccessKey=xxxxxxxxxxxxxx=`, the Shared Access Key value is `xxxxxxxxxxxxxx=`. While replacing the value, make sure that you don't delete the `\` (escape) character for `"` (double quotes).
> - Update the value of `iotHubNamespace` in the following command if you used a name other than `MyASAIoTHub`. Run `echo $iotHubName` to see the name of your IoT Hub.
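
Run the `az stream-analytics input create` command. The following sketch assumes the input name `asaiotinput` (the name used by the transformation query later in this quickstart) and the standard `Microsoft.Devices/IotHubs` input properties; adjust it to match your environment:

```azurecli
az stream-analytics input create \
    --properties "{\"type\":\"Stream\",\"datasource\":{\"type\":\"Microsoft.Devices/IotHubs\",\"properties\":{\"consumerGroupName\":\"\$Default\",\"endpoint\":\"messages/events\",\"iotHubNamespace\":\"MyASAIoTHub\",\"sharedAccessPolicyKey\":\"IOT HUB ACCESS KEY\",\"sharedAccessPolicyName\":\"iothubowner\"}},\"serialization\":{\"type\":\"Json\",\"encoding\":\"UTF8\"}}" \
    --input-name asaiotinput \
    --job-name streamanalyticsjob \
    --resource-group streamanalyticsrg
```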
## Configure output to the job

Add an output to your job by using the [az stream-analytics output create](/cli/azure/stream-analytics/output#az-stream-analytics-output-create) cmdlet. This cmdlet takes the job name, job output name, resource group name, data source in JSON format, and serialization type as parameters. This example uses blob storage as output.

> [!IMPORTANT]
> Replace `STORAGEACCOUNTNAME` with the name of your Azure Storage account and `STORAGEACCESSKEY` with the access key for your storage account. If you didn't note down these values, run the `echo $storageAccountName` and `echo $key` commands to get them. While replacing the values, make sure that you don't delete the `\` (escape) character for `"` (double quotes).
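
Run the `az stream-analytics output create` command. The following sketch assumes the output name `asabloboutput` (the name used by the transformation query) and the `state` container created earlier; adjust it to match your environment:

```azurecli
az stream-analytics output create \
    --job-name streamanalyticsjob \
    --resource-group streamanalyticsrg \
    --output-name asabloboutput \
    --datasource "{\"type\":\"Microsoft.Storage/Blob\",\"properties\":{\"storageAccounts\":[{\"accountName\":\"STORAGEACCOUNTNAME\",\"accountKey\":\"STORAGEACCESSKEY\"}],\"container\":\"state\",\"pathPattern\":\"{date}/{time}\",\"dateFormat\":\"yyyy/MM/dd\",\"timeFormat\":\"HH\"}}" \
    --serialization "{\"type\":\"Json\",\"encoding\":\"UTF8\",\"format\":\"LineSeparated\"}"
```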
## Configure the transformation query

Add a transformation to your job by using the [az stream-analytics transformation create](/cli/azure/stream-analytics/transformation#az-stream-analytics-transformation-create) cmdlet. This cmdlet takes the job name, job transformation name, resource group name, and the transformation query as parameters.

```azurecli
az stream-analytics transformation create \
    --resource-group streamanalyticsrg \
    --job-name streamanalyticsjob \
    --name Transformation \
    --streaming-units "6" \
    --saql "SELECT * INTO asabloboutput FROM asaiotinput WHERE Temperature > 27"
```
## Run the IoT simulator
1. Open the [Raspberry Pi Azure IoT Online Simulator](https://azure-samples.github.io/raspberry-pi-web-simulator/).

2. Replace the placeholder in line 15 with the entire Azure IoT Hub **Device connection string** (not the IoT Hub connection string) you saved earlier in this quickstart.

3. Select **Run**. The output should show the sensor data and messages that are being sent to your IoT Hub.


## Start the Stream Analytics job and check the output
Start the job by using the [az stream-analytics job start](/cli/azure/stream-analytics/job#az-stream-analytics-job-start) cmdlet. This cmdlet takes the job name, resource group name, output start mode, and start time as parameters. `OutputStartMode` accepts values of `JobStartTime`, `CustomTime`, or `LastOutputEventTime`.

After you run the following cmdlet, it returns `True` as output if the job starts.

```azurecli
az stream-analytics job start \
    --job-name streamanalyticsjob \
    --resource-group streamanalyticsrg \
    --output-start-mode JobStartTime
```

Give it a few minutes and then verify that an output file is created in the `state` blob container.

:::image type="content" source="./media/stream-analytics-quick-create-powershell/output-file-container.png" alt-text="Screenshot showing the output file in the State blob container.":::

Download and open the file to see several entries similar to the following one:
## Clean up resources

When no longer needed, you can delete the resource group, the streaming job, and all related resources. Deleting the job avoids billing for the streaming units it consumes. If you plan to use the job in the future, you can skip deleting it and just stop the job for now.

Delete the resource group, which will delete all the resources in the resource group including Stream Analytics job, IoT Hub, and Azure Storage account.
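
Run the following [az group delete](/cli/azure/group#az-group-delete) command to delete the resource group. The command prompts for confirmation before deleting:

```azurecli
az group delete --name streamanalyticsrg
```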