Commit a39560f

Merge pull request #289218 from dominicbetts/release-aio-m3-tutorial-updates
AIO [M3]: Tutorial updates
2 parents dd363cf + eb5518c commit a39560f

6 files changed

+44
-18
lines changed

articles/iot-operations/end-to-end-tutorials/tutorial-upload-telemetry-to-cloud.md

Lines changed: 44 additions & 18 deletions
@@ -16,7 +16,7 @@ ms.service: azure-iot-operations
 [!INCLUDE [public-preview-note](../includes/public-preview-note.md)]

-In this tutorial, you use a dataflow to forward messages from the MQTT broker to an event hub in the Azure Event Hubs service. The event hub can deliver the data to other cloud services for storage and analysis. In the next tutorial, you use a Real-Time Dashboard to visualize the data.
+In this tutorial, you use a dataflow to forward messages from the MQTT broker to an event hub in the Azure Event Hubs service. The event hub can deliver the data to other cloud services for storage and analysis. In the next tutorial, you use a real-time dashboard to visualize the data.

 ## Prerequisites

@@ -100,39 +100,65 @@ az role assignment create --role "Azure Event Hubs Data Sender" --assignee $PRIN
 ## Create a dataflow to send telemetry to an event hub

-To create and configure a dataflow in your cluster, run the following commands in your shell. This dataflow:
+Use the operations experience UI to create and configure a dataflow in your cluster that:

 - Renames the `Tag 10` field in the incoming message to `Humidity`.
 - Renames the `temperature` field in the incoming message to `Temperature`.
-- Adds a field called `AssetId` that contains the value of the `externalAssetId` message property.
+- Adds a field called `AssetId` that contains the name of the asset.
 - Forwards the transformed messages from the MQTT topic to the event hub you created.
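[Editorial illustration, not part of the tutorial diff: the transforms listed above can be sketched in code. This is a minimal Python sketch of the same rename and new-property operations the dataflow applies; the sample values are invented.]

```python
import json

def transform(message: dict, asset_name: str) -> dict:
    """Mimic the dataflow's transforms: rename two fields and add an AssetId."""
    out = dict(message)                              # don't mutate the input
    if "Tag 10" in out:
        out["Humidity"] = out.pop("Tag 10")          # Tag 10 -> Humidity
    if "temperature" in out:
        out["Temperature"] = out.pop("temperature")  # temperature -> Temperature
    out["AssetId"] = asset_name                      # add the asset's name
    return out

sample = {"Tag 10": 52, "temperature": 21.5}
print(json.dumps(transform(sample, "thermostat-01")))
# {"Humidity": 52, "Temperature": 21.5, "AssetId": "thermostat-01"}
```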

-<!-- TODO: Change branch to main before merging the release branch -->
-
-# [Bash](#tab/bash)
-
-```bash
-wget https://raw.githubusercontent.com/Azure-Samples/explore-iot-operations/main/samples/tutorials/dataflow.yaml
-sed -i 's/<NAMESPACE>/'"${CLUSTER_NAME:0:24}"'/' dataflow.yaml
-
-kubectl apply -f dataflow.yaml
-```
-
-# [PowerShell](#tab/powershell)
-
-```powershell
-Invoke-WebRequest -Uri https://raw.githubusercontent.com/Azure-Samples/explore-iot-operations/main/samples/tutorials/dataflow.yaml -OutFile dataflow.yaml
-
-(Get-Content dataflow.yaml) | ForEach-Object { $_ -replace '<NAMESPACE>', $CLUSTER_NAME.Substring(0, [MATH]::Min($CLUSTER_NAME.Length, 24)) } | Set-Content dataflow.yaml
-
-kubectl apply -f dataflow.yaml
-```
-
----
+To create the dataflow:
+
+1. Browse to the operations experience UI and locate your instance. Then select **Dataflow endpoints** and select **+ New** in the **Azure Event Hubs** tile:
+
+    :::image type="content" source="media/tutorial-upload-telemetry-to-cloud/new-event-hubs-endpoint.png" alt-text="Screenshot of the Dataflow endpoints page.":::
+
+1. On the **Create new dataflow endpoint: Azure Event Hubs** page, enter *event-hubs-target* as the name and update the **Host** field with the address of the Event Hubs namespace you created. Select **Apply**:
+
+    :::image type="content" source="media/tutorial-upload-telemetry-to-cloud/new-event-hubs-destination.png" alt-text="Screenshot of the Create new dataflow endpoint: Azure Event Hubs page.":::
+
+    Your new dataflow endpoint is created and appears in the list on the **Dataflow endpoints** page.
+
+1. Select **Dataflows** and then select **+ Create dataflow**. The **\<new-dataflow\>** page displays:
+
+    :::image type="content" source="media/tutorial-upload-telemetry-to-cloud/new-dataflow.png" alt-text="Screenshot of the Dataflows page.":::
+
+1. In the dataflow editor, select **Select source**. Then select the thermostat asset you created previously and select **Apply**.
+
+1. In the dataflow editor, select **Select dataflow endpoint**. Then select the **event-hubs-target** endpoint you created previously and select **Apply**.
+
+1. On the next page, enter *destinationeh* as the topic. The topic refers to the hub you created in the Event Hubs namespace. Select **Apply**. Your dataflow now has the thermostat asset as its source and a hub in your Event Hubs namespace as its destination.
+
+1. To add a transformation, select **Add transform (optional)**.
+
+1. To rename the `Tag 10` and `temperature` fields in the incoming message, select **+ Add** in the **Rename** tile.
+
+1. Add the following two transforms:
+
+    | Datapoint   | New datapoint name |
+    |-------------|--------------------|
+    | "Tag 10"    | Humidity           |
+    | temperature | Temperature        |
+
+    The rename transformation looks like the following screenshot:
+
+    :::image type="content" source="media/tutorial-upload-telemetry-to-cloud/rename-transform.png" alt-text="Screenshot of the rename transformation.":::
+
+    Select **Apply**.
+
+1. To add an asset ID field to the message, select the **Transforms** box in the editor and then select **+ Add** in the **New property** tile.
+
+1. In the **New property** editor, enter *AssetId* as the property key and *thermostat-01* as the property value, and then select **Apply**. The dataflow editor now looks like the following screenshot:
+
+    :::image type="content" source="media/tutorial-upload-telemetry-to-cloud/dataflow-complete.png" alt-text="Screenshot of the dataflow.":::
+
+1. To start the dataflow, enter *tutorial-dataflow* as its name and then select **Save**. After a few minutes, the **Provisioning State** changes to **Succeeded**. The dataflow is now running in your cluster.
+
+Your dataflow subscribes to an MQTT topic to receive messages from the thermostat asset. It renames some of the fields in the message and forwards the transformed messages to the event hub you created.
 ## Verify data is flowing

-To verify that data is flowing to the cloud, you can view your Event Hubs instance in the Azure portal. You may need to wait for several minutes for the dataflow to start and for messages to flow to the event hub.
+To verify that data is flowing to the cloud, you can view your Event Hubs instance in the Azure portal. You might need to wait for several minutes for the dataflow to start and for messages to flow to the event hub.

 If messages are flowing to the instance, you can see the count of incoming messages on the instance **Overview** page:
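[Editorial illustration, not part of the tutorial diff: beyond the portal's incoming message count, you can sanity-check an event body you retrieve (for example, with an Event Hubs SDK or the portal's data tools) by confirming it has the transformed shape. A minimal sketch, assuming the messages are JSON with the fields configured earlier; the sample body is invented.]

```python
import json

def has_transformed_shape(body: bytes) -> bool:
    """Return True if an event body parses as JSON and carries the transformed fields."""
    try:
        payload = json.loads(body)
    except ValueError:  # includes json.JSONDecodeError
        return False
    return isinstance(payload, dict) and {"Humidity", "Temperature", "AssetId"} <= payload.keys()

# Hypothetical event body matching the transforms configured in the dataflow.
event_body = b'{"Humidity": 52, "Temperature": 21.5, "AssetId": "thermostat-01"}'
print(has_transformed_shape(event_body))  # True
```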
