Commit 094cc8d

updates

1 parent ab6bf69

File tree

1 file changed (+42 additions, −11 deletions)

articles/stream-analytics/quick-create-azure-cli.md

Lines changed: 42 additions & 11 deletions
@@ -69,7 +69,7 @@ The following Azure CLI code blocks are commands that prepare the input data req
 
 The following Azure CLI code blocks create a blob storage account that's used for job output. Review the sections to understand the code.
 
-1. Define a variable to hold the name for storage account.
+1. Define a variable to hold the name for the storage account. If this command gives an error, switch to the Bash shell in Azure Cloud Shell.
 
     ```azurecli
     storageAccountName="asatutorialstorage$RANDOM"
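The `$RANDOM` naming step above can be sanity-checked locally before any Azure resource is created. A minimal Bash sketch, assuming Azure's storage-account naming rules (3–24 characters, lowercase letters and digits only); it touches no Azure resources:

```shell
# Generate the storage account name the same way the quickstart does.
storageAccountName="asatutorialstorage$RANDOM"

# Check it against the storage-account naming rules before creating anything.
if [ "${#storageAccountName}" -ge 3 ] && [ "${#storageAccountName}" -le 24 ] \
   && [[ "$storageAccountName" =~ ^[a-z0-9]+$ ]]; then
  echo "valid name: $storageAccountName"
else
  echo "invalid name: $storageAccountName"
fi
```

Because the `asatutorialstorage` prefix is 18 characters and `$RANDOM` adds at most 5 digits, the generated name always fits the 24-character limit.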
@@ -87,16 +87,19 @@ The following Azure CLI code blocks create a blob storage account that's used fo
 
 2. Get the key for your storage account by running the [az storage account keys list](/cli/azure/storage/account/keys) command.
 
-    ```azurecli
-    key=$(az storage account keys list -g streamanalyticsrg -n $storageAccountName --query "[0].value" -o tsv)
-    ```
-
-3. Create a container for storing blobs with the [az storage container create](/cli/azure/storage/container) command. You use the storage account key to authorize the operation to create the container. For more information about authorizing data operations with Azure CLI, see [Authorize access to blob or queue data with Azure CLI](../storage/blobs/authorize-data-operations-cli.md).
+    ```azurecli
+    key=$(az storage account keys list -g streamanalyticsrg -n $storageAccountName --query "[0].value" -o tsv)
+    echo $key
+    ```
+
+    > [!NOTE]
+    > Note down the access key for the Azure storage account. You will use this key later in this quickstart.
+3. Create a container named `state` for storing blobs with the [az storage container create](/cli/azure/storage/container) command. You use the storage account key to authorize the operation to create the container. For more information about authorizing data operations with Azure CLI, see [Authorize access to blob or queue data with Azure CLI](../storage/blobs/authorize-data-operations-cli.md).
 
     ```azurecli
     az storage container create \
         --account-name $storageAccountName \
-        --name sample-container \
+        --name state \
         --account-key $key \
         --auth-mode key
    ```
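The `--query "[0].value" -o tsv` part of step 2 deserves a note: it is a JMESPath expression that picks the first key object from the response array and prints its `value` field as plain text. A sketch of the same selection against a mock of the keys-list response shape (the JSON below is illustrative, not real Azure output):

```shell
# Mock of the shape returned by `az storage account keys list`:
# an array of objects, each with keyName/value fields.
keysJson='[{"keyName":"key1","value":"abc123=="},{"keyName":"key2","value":"def456=="}]'

# Equivalent of --query "[0].value" -o tsv: first element, value field, plain text.
firstKey=$(echo "$keysJson" | python3 -c 'import json,sys; print(json.load(sys.stdin)[0]["value"])')
echo "$firstKey"
```

This is why `$key` ends up holding a bare key string with no quotes or JSON wrapping, ready to pass to `--account-key`.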
@@ -122,21 +125,27 @@ az stream-analytics job create \
 
 Add an input to your job by using the [az stream-analytics input](/cli/azure/stream-analytics/input#az-stream-analytics-input-create) cmdlet. This cmdlet takes the job name, job input name, resource group name, and the job input definition as parameters. The job input definition is a JSON file that contains the properties required to configure the job's input. In this example, you'll create an IoT Hub as an input.
 
 > [!IMPORTANT]
-> Replace `<IOT HUB ACCESS KEY>` with the value of Shared Access Key in the IOT Hub connection string you saved. For example, if the IOT Hub connection string is: `HostName=MyASAIoTHub.azure-devices.net;SharedAccessKeyName=iothubowner;SharedAccessKey=xxxxxxxxxxxxxx=`, the Shared Access Key value is `xxxxxxxxxxxxxx=`.
+> Replace `IOT HUB ACCESS KEY` with the value of the Shared Access Key in the IoT Hub connection string you saved. For example, if the IoT Hub connection string is `HostName=MyASAIoTHub.azure-devices.net;SharedAccessKeyName=iothubowner;SharedAccessKey=xxxxxxxxxxxxxx=`, the Shared Access Key value is `xxxxxxxxxxxxxx=`.
 
 ```azurecli
-az stream-analytics input create --properties "{\"type\":\"Stream\",\"datasource\":{\"type\":\"Microsoft.Devices/IotHubs\",\"properties\":{\"consumerGroupName\":\"$Default\",\"endpoint\":\"messages/events\",\"iotHubNamespace\":\"iothub\",\"sharedAccessPolicyKey\":\"<IOT HUB ACCESS KEY>\",\"sharedAccessPolicyName\":\"iothubowner\"}},\"serialization\":{\"type\":\"Json\",\"encoding\":\"UTF8\"}}" --input-name "asaiotinput" --job-name "streamanalyticsjob" --resource-group "streamanalyticsrg"
+az stream-analytics input create \
+    --properties "{\"type\":\"Stream\",\"datasource\":{\"type\":\"Microsoft.Devices/IotHubs\",\"properties\":{\"consumerGroupName\":\"sdkconsumergroup\",\"endpoint\":\"messages/events\",\"iotHubNamespace\":\"MyASAIoTHub\",\"sharedAccessPolicyKey\":\"IOT HUB ACCESS KEY\",\"sharedAccessPolicyName\":\"iothubowner\"}},\"serialization\":{\"type\":\"Json\",\"encoding\":\"UTF8\"}}" --input-name "asaiotinput" --job-name "streamanalyticsjob" --resource-group "streamanalyticsrg"
 ```
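Rather than copying the Shared Access Key out of the connection string by hand, it can be extracted with shell parameter expansion. A small sketch using the placeholder connection string from the note above (not a real key):

```shell
# Placeholder connection string from the quickstart's example.
iotHubConnectionString='HostName=MyASAIoTHub.azure-devices.net;SharedAccessKeyName=iothubowner;SharedAccessKey=xxxxxxxxxxxxxx='

# ${var##pattern} strips the longest prefix matching the pattern, i.e.
# everything up to and including the last "SharedAccessKey=", leaving the key.
sharedAccessKey="${iotHubConnectionString##*SharedAccessKey=}"
echo "$sharedAccessKey"
```

The `##` (longest-match) form matters here: it skips past `SharedAccessKeyName=iothubowner` and anchors on the final `SharedAccessKey=` segment.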
 
 ## Configure output to the job
 
 Add an output to your job by using the [az stream-analytics output create](/cli/azure/stream-analytics/output#az-stream-analytics-output-create) cmdlet. This cmdlet takes the job name, job output name, resource group name, and the job output definition as parameters.
 
 > [!IMPORTANT]
-> Replace `<STORAGE ACCOUNT NAME>` with the name of your Azure Storage account and `<ACCESS KEY>` with the access key for your storage account.
+> Replace `STORAGEACCOUNTNAME` with the name of your Azure Storage account and `STORAGEACCESSKEY` with the access key for your storage account. If you didn't note down these values, run the following commands to get them: `echo $storageAccountName` and `echo $key`.
 
 ```azurecli
-az stream-analytics output create --job-name streamanalyticsjob --datasource "{\"type\":\"Microsoft.Storage/Blob\",\"properties\":{\"container\":\"state\",\"dateFormat\":\"yyyy/MM/dd\",\"pathPattern\":\"{date}/{time}\",\"storageAccounts\":[{\"accountKey\":\"<ACCESS KEY>\",\"accountName\":\"<STORAGE ACCOUNT NAME>\"}],\"timeFormat\":\"HH\"}}" --serialization "{\"type\":\"Json\",\"properties\":{\"format\":\"Array\",\"encoding\":\"UTF8\"}}" --output-name asabloboutput --resource-group streamanalyticsrg
+az stream-analytics output create \
+    --job-name streamanalyticsjob \
+    --datasource "{\"type\":\"Microsoft.Storage/Blob\",\"properties\":{\"container\":\"state\",\"dateFormat\":\"yyyy/MM/dd\",\"pathPattern\":\"{date}/{time}\",\"storageAccounts\":[{\"accountKey\":\"STORAGEACCESSKEY\",\"accountName\":\"STORAGEACCOUNTNAME\"}],\"timeFormat\":\"HH\"}}" \
+    --serialization "{\"type\":\"Json\",\"properties\":{\"format\":\"Array\",\"encoding\":\"UTF8\"}}" \
+    --output-name asabloboutput \
+    --resource-group streamanalyticsrg
 ```
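The backslash-escaped JSON inside `--datasource` is easy to get wrong by hand. One alternative sketch: build the JSON in a shell variable with single quotes (so only the two value substitutions need care) and validate it before handing it to the CLI. `STORAGEACCOUNTNAME` and `STORAGEACCESSKEY` are the quickstart's placeholders, not real values:

```shell
# Placeholders from the quickstart; substitute your real values.
storageAccountName="STORAGEACCOUNTNAME"
key="STORAGEACCESSKEY"

# Single-quoted JSON with the two values spliced in; no backslash escaping needed.
datasource='{"type":"Microsoft.Storage/Blob","properties":{"container":"state","dateFormat":"yyyy/MM/dd","pathPattern":"{date}/{time}","storageAccounts":[{"accountKey":"'"$key"'","accountName":"'"$storageAccountName"'"}],"timeFormat":"HH"}}'

# Confirm the result parses as JSON before using it.
echo "$datasource" | python3 -m json.tool > /dev/null && echo "datasource JSON ok"
```

The validated variable can then be passed as `--datasource "$datasource"`, equivalent to the inline escaped string in the command above.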
 
 ## Define the transformation query
@@ -176,6 +185,28 @@ az stream-analytics job start \
     --output-start-mode JobStartTime
 ```
 
+You see entries similar to the following one in the blob container.
+
+```json
+{
+    "messageId": 229,
+    "deviceId": "Raspberry Pi Web Client",
+    "temperature": 31.85214010589595,
+    "humidity": 60.278830289656284,
+    "EventProcessedUtcTime": "2023-02-28T22:06:33.5567789Z",
+    "PartitionId": 3,
+    "EventEnqueuedUtcTime": "2023-02-28T22:05:49.6520000Z",
+    "IoTHub": {
+        "MessageId": null,
+        "CorrelationId": null,
+        "ConnectionDeviceId": "MyASAIoTDevice",
+        "ConnectionDeviceGenerationId": "638132150746523845",
+        "EnqueuedTime": "2023-02-28T22:05:49.6520000Z",
+        "StreamId": null
+    }
+}
+```
+
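The `EventEnqueuedUtcTime` and `EventProcessedUtcTime` fields in the sample entry show when the event entered IoT Hub and when Stream Analytics processed it. A quick sketch of computing that gap, to the nearest second, from the sample timestamps above (assumes GNU `date`):

```shell
# Sample timestamps from the entry above, truncated to whole seconds.
enqueued="2023-02-28T22:05:49Z"    # EventEnqueuedUtcTime
processed="2023-02-28T22:06:33Z"   # EventProcessedUtcTime

# Convert both to epoch seconds and subtract (GNU date, UTC).
latency=$(( $(date -ud "$processed" +%s) - $(date -ud "$enqueued" +%s) ))
echo "processing latency: ${latency}s"
```

For this sample the gap is 44 seconds, which includes the job's startup time since the job had just been started.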
 ## Clean up resources
 
 When no longer needed, delete the resource group, the streaming job, and all related resources. Deleting the job avoids being billed for the streaming units it consumes. If you plan to use the job in the future, you can skip deleting it and just stop the job for now. If you aren't going to continue to use this job, delete all resources created by this quickstart by running the following cmdlet:
