`articles/iot-operations/connect-to-cloud/howto-create-dataflow.md` (22 additions, 7 deletions)
````diff
@@ -58,6 +58,8 @@ To create a dataflow in the operations experience portal, select **Dataflow** >
 
 :::image type="content" source="media/howto-create-dataflow/create-dataflow.png" alt-text="Screenshot using operations experience portal to create a dataflow.":::
 
+# [Bicep](#tab/bicep)
+
 # [Kubernetes](#tab/kubernetes)
 
 The overall structure of a dataflow configuration is as follows:
````
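For context, the Dataflow resource these hunks modify has roughly the following overall shape. This is a sketch based on the Azure IoT Operations Dataflow custom resource: the `apiVersion` shown and names such as `my-dataflow` and `mq` are assumptions, not values from this diff.

```yaml
# Sketch of a Dataflow custom resource (placeholder names throughout)
apiVersion: connectivity.iotoperations.azure.com/v1beta1
kind: Dataflow
metadata:
  name: my-dataflow
spec:
  profileRef: my-dataflow-profile
  mode: Enabled
  operations:
    # Where the data comes from
    - operationType: Source
      sourceSettings:
        endpointRef: mq
        dataSources:
          - thermostats/+/telemetry/#
    # Optional transformations (enrich, filter, map)
    - operationType: BuiltInTransformation
      builtInTransformationSettings:
        map:
          - inputs:
              - '*'
            output: '*'
    # Where the data goes
    - operationType: Destination
      destinationSettings:
        endpointRef: mq
        dataDestination: factory
```

The source, transformation, and destination operations correspond to the three configuration sections the hunks below touch.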
````diff
@@ -109,6 +111,8 @@ You can use an [asset](../discover-manage-assets/overview-manage-assets.md) as t
 
 1. Select **Apply** to use the asset as the source endpoint.
 
+# [Bicep](#tab/bicep)
+
 # [Kubernetes](#tab/kubernetes)
 
 Configuring an asset as a source is only available in the operations experience portal.
````
````diff
@@ -127,6 +131,8 @@ Configuring an asset as a source is only available in the operations experience
 
 1. Select **Apply**.
 
+# [Bicep](#tab/bicep)
+
 # [Kubernetes](#tab/kubernetes)
 
 For example, to configure a source using an MQTT endpoint and two MQTT topic filters, use the following configuration:
````
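The MQTT-source example this hunk ends on follows the pattern below. This is a sketch: the endpoint name `mq` and the two topic filters are assumptions for illustration.

```yaml
sourceSettings:
  endpointRef: mq
  dataSources:
    # Two MQTT topic filters; messages matching either filter are ingested
    - thermostats/+/telemetry/temperature/#
    - humidifiers/+/telemetry/humidity/#
```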
````diff
@@ -207,6 +213,8 @@ In the operations experience portal, select **Dataflow** > **Add transform (opti
 
 :::image type="content" source="media/howto-create-dataflow/dataflow-transform.png" alt-text="Screenshot using operations experience portal to add a transform to a dataflow.":::
 
+# [Bicep](#tab/bicep)
+
 # [Kubernetes](#tab/kubernetes)
 
 ```yaml
````
````diff
@@ -234,6 +242,8 @@ Key names in the distributed state store correspond to a dataset in the dataflow
 
 Currently, the enrich operation isn't available in the operations experience portal.
 
+# [Bicep](#tab/bicep)
+
 # [Kubernetes](#tab/kubernetes)
 
 For example, you could use the `deviceId` field in the source data to match the `asset` field in the dataset:
````
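The enrich example referenced here lives in the `datasets` section of the built-in transformation. A sketch of matching `deviceId` against the dataset's `asset` field (the dataset key `assetDataset` and the exact expression syntax are assumptions):

```yaml
builtInTransformationSettings:
  datasets:
    # Join incoming messages with a dataset held in the distributed state store
    - key: assetDataset
      inputs:
        - $source.deviceId               # $1: field from the incoming message
        - $context(assetDataset).asset   # $2: field from the stored dataset
      expression: $1 == $2               # enrich when the values match
```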
````diff
@@ -282,6 +292,8 @@ To filter the data on a condition, you can use the `filter` stage. The condition
 
 1. Select **Apply**.
 
+# [Bicep](#tab/bicep)
+
 # [Kubernetes](#tab/kubernetes)
 
 For example, you could use the `temperature` field in the source data to filter the data:
````
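A `filter` stage along the lines this hunk describes might look like the following sketch. The threshold value and the `? $last` input syntax are assumptions:

```yaml
builtInTransformationSettings:
  filter:
    # Only pass messages where the temperature exceeds 20
    - inputs:
        - temperature ? $last
      expression: '$1 > 20'
```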
````diff
@@ -315,6 +327,8 @@ In the operations experience portal, mapping is currently supported using **Comp
 
 1. Select **Apply**.
 
+# [Bicep](#tab/bicep)
+
 # [Kubernetes](#tab/kubernetes)
 
 For example, you could use the `temperature` field in the source data to convert the temperature to Celsius and store it in the `temperatureCelsius` field. You could also enrich the source data with the `location` field from the contextualization dataset:
````
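A `map` stage matching this description could be sketched as follows. The field names come from the surrounding text; the conversion expression assumes Fahrenheit input, and the dataset key `assetDataset` is an assumption:

```yaml
builtInTransformationSettings:
  map:
    # Convert temperature to Celsius and store it in a new field
    - inputs:
        - temperature
      expression: ($1 - 32) * 5/9
      output: temperatureCelsius
    # Enrich with the location field from the contextualization dataset
    - inputs:
        - $context(assetDataset).location
      output: location
```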
````diff
@@ -345,6 +359,8 @@ If you want to serialize the data before sending it to the destination, you need
 
 Specify the **Output** schema when you add the destination dataflow endpoint.
 
+# [Bicep](#tab/bicep)
+
 # [Kubernetes](#tab/kubernetes)
 
````
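For the Bicep and Kubernetes tabs, serialization is typically configured on the transformation with a serialization format and a schema reference, roughly as sketched below. The `aio-sr://` URI format and the namespace/schema names are assumptions:

```yaml
builtInTransformationSettings:
  # Serialize output before sending it to the destination
  serializationFormat: Parquet
  # Reference to a schema in the schema registry (placeholder values)
  schemaRef: aio-sr://exampleNamespace/exampleSchema:1.0.0
```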
````diff
@@ -388,6 +404,8 @@ To configure a destination for the dataflow, specify the endpoint reference and
 1. Select **Proceed** to configure the destination.
 1. Add the mapping details based on the type of destination.
 
+# [Bicep](#tab/bicep)
+
 # [Kubernetes](#tab/kubernetes)
 
 For example, to configure a destination using the MQTT endpoint created earlier and a static MQTT topic, use the following configuration:
````
````diff
@@ -398,13 +416,8 @@ destinationSettings:
   dataDestination: factory
 ```
 
-If you've created storage endpoints like Microsoft Fabric, use the data destination field to specify the table or container name:
-
-```yaml
-destinationSettings:
-  endpointRef: adls
-  dataDestination: telemetryTable
-```
+> [!IMPORTANT]
+> Storage endpoints like Microsoft Fabric require a schema reference. Use Bicep to specify the schema reference.
 
 ## Example
 
````
````diff
@@ -468,6 +481,8 @@ Select the dataflow you want to export and select **Export** from the toolbar.
 
 :::image type="content" source="media/howto-create-dataflow/dataflow-export.png" alt-text="Screenshot using operations experience portal to export a dataflow.":::
````