In this tutorial, you use a dataflow to forward messages from the MQTT broker to an event hub in the Azure Event Hubs service. The event hub can deliver the data to other cloud services for storage and analysis. In the next tutorial, you use a real-time dashboard to visualize the data.
## Prerequisites
## Create a dataflow to send telemetry to an event hub
Use the operations experience UI to create and configure a dataflow in your cluster that:
- Renames the `Tag 10` field in the incoming message to `Humidity`.
- Renames the `temperature` field in the incoming message to `Temperature`.
- Adds a field called `AssetId` that contains the name of the asset.
- Forwards the transformed messages from the MQTT topic to the event hub you created.
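The transforms in this list can be sketched in plain Python. This is a minimal illustration of the mapping, not the dataflow runtime itself; the incoming payload shape and values are assumptions for the example:

```python
def transform(message: dict, asset_name: str) -> dict:
    """Apply the dataflow's transforms to one incoming message:
    rename `Tag 10` and `temperature`, then add an `AssetId` field."""
    renames = {"Tag 10": "Humidity", "temperature": "Temperature"}
    transformed = {renames.get(key, key): value for key, value in message.items()}
    transformed["AssetId"] = asset_name  # name of the asset, added as a new property
    return transformed

# Hypothetical payload from the thermostat asset (field names as described above).
incoming = {"Tag 10": 55, "temperature": 21.5}
print(transform(incoming, "thermostat-01"))
# {'Humidity': 55, 'Temperature': 21.5, 'AssetId': 'thermostat-01'}
```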
To create the dataflow:
1. Browse to the operations experience UI and locate your instance. Then select **Dataflow endpoints** and select **+ New** in the **Azure Event Hubs** tile:
    :::image type="content" source="media/tutorial-upload-telemetry-to-cloud/new-event-hubs-endpoint.png" alt-text="Screenshot of the Dataflow endpoints page.":::
1. On the **Create new dataflow endpoint: Azure Event Hubs** page, enter *event-hubs-target* as the name, and update the **Host** field with the address of the Event Hubs namespace you created. Select **Apply**:
    :::image type="content" source="media/tutorial-upload-telemetry-to-cloud/new-event-hubs-destination.png" alt-text="Screenshot of the Create new dataflow endpoint: Azure Event Hubs page.":::
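If you need the namespace address for the **Host** field, you can look it up with the Azure CLI. A minimal sketch; `$EVENTHUB_NAMESPACE` and `$RESOURCE_GROUP` are placeholders for your own values, and the dataflow endpoint host typically takes the form `<namespace>.servicebus.windows.net:9093` rather than the HTTPS endpoint this command returns:

```shell
# Look up the Event Hubs namespace service endpoint, for example
# https://<namespace>.servicebus.windows.net:443/.
az eventhubs namespace show \
  --name "$EVENTHUB_NAMESPACE" \
  --resource-group "$RESOURCE_GROUP" \
  --query serviceBusEndpoint \
  --output tsv
```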
1. Select **Dataflows** and then select **+ Create dataflow**. The **\<new-dataflow\>** page displays:
    :::image type="content" source="media/tutorial-upload-telemetry-to-cloud/new-dataflow.png" alt-text="Screenshot of the Dataflows page.":::
1. In the dataflow editor, select **Select source**. Then select the thermostat asset you created previously and select **Apply**.
1. In the dataflow editor, select **Select dataflow endpoint**. Then select the **event-hubs-target** endpoint you created previously and select **Apply**.
1. On the next page, enter *destinationeh* as the topic. The topic refers to the hub you created in the Event Hubs namespace. Select **Apply**. Your dataflow now has the thermostat asset as its source and a hub in your Event Hubs namespace as its destination.
1. To add a transformation, select **Add transform (optional)**.
1. To rename the `Tag 10` and `temperature` fields in the incoming message, select **+ Add** in the **Rename** tile.
1. Add the following two transforms:
    | Datapoint     | New datapoint name |
    |---------------|--------------------|
    | `Tag 10`      | Humidity           |
    | `temperature` | Temperature        |
    The rename transformation looks like the following screenshot:
    :::image type="content" source="media/tutorial-upload-telemetry-to-cloud/rename-transform.png" alt-text="Screenshot of the rename transformation.":::
    Select **Apply**.
1. To add an asset ID field to the message, select the **Transforms** box in the editor and then select **+ Add** in the **New property** tile.
1. In the **New property** editor, enter *AssetId* as the property key, *thermostat-01* as the property value, and select **Apply**. The dataflow editor now looks like the following screenshot:
    :::image type="content" source="media/tutorial-upload-telemetry-to-cloud/dataflow-complete.png" alt-text="Screenshot of the dataflow.":::
1. To start the dataflow, enter *tutorial-dataflow* as its name and then select **Save**. After a few minutes, the **Provisioning State** changes to **Succeeded**. The dataflow is now running in your cluster.
Your dataflow subscribes to an MQTT topic to receive messages from the thermostat asset. It renames some of the fields in the message, and forwards the transformed messages to the event hub you created.
## Verify data is flowing
To verify that data is flowing to the cloud, you can view your Event Hubs instance in the Azure portal. You might need to wait for several minutes for the dataflow to start and for messages to flow to the event hub.
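Alternatively, you can query the namespace's `IncomingMessages` metric from the Azure CLI instead of opening the portal. A minimal sketch; `$EVENTHUB_NAMESPACE_ID` is a placeholder for the full resource ID of your Event Hubs namespace, which earlier steps don't set:

```shell
# Query the IncomingMessages metric for the namespace in one-minute buckets.
az monitor metrics list \
  --resource "$EVENTHUB_NAMESPACE_ID" \
  --metric IncomingMessages \
  --interval PT1M \
  --output table
```

Nonzero values in the output table indicate that the dataflow is delivering messages to the hub.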
If messages are flowing to the instance, you can see the count of incoming messages on the instance **Overview** page: