articles/iot-operations/connect-to-cloud/howto-configure-dataflow-endpoint.md (1 addition & 4 deletions)
@@ -64,7 +64,7 @@ spec:
---
-Similarly, you can create multiple dataflows that use the same MQTT endpoint for different topics.
+Similarly, you can create multiple dataflows that use the same MQTT endpoint for other endpoints and topics. For example, you can use the same MQTT endpoint for a dataflow that sends data to Kafka.
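The endpoint reuse described in the changed line above can be sketched as two dataflow specs pointing at one MQTT endpoint. This is a hypothetical sketch only; the `endpointRef`, `dataSources`, and `destinationSettings` field names are assumed from the YAML fragments elsewhere in this article, not a verified schema.

```yaml
# Sketch: two dataflows reusing one MQTT endpoint (field names assumed).
# Dataflow A: shared MQTT endpoint as source, its own topic filter.
spec:
  operations:
    - operationType: Source
      sourceSettings:
        endpointRef: default-mqtt-endpoint   # shared endpoint
        dataSources:
          - thermostats/+/telemetry
---
# Dataflow B: same MQTT endpoint as source, Kafka endpoint as destination.
spec:
  operations:
    - operationType: Source
      sourceSettings:
        endpointRef: default-mqtt-endpoint   # same endpoint reused
        dataSources:
          - sensors/pressure
    - operationType: Destination
      destinationSettings:
        endpointRef: kafka-endpoint
        dataDestination: factory-telemetry
```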
# [Portal](#tab/portal)
@@ -95,6 +95,3 @@ spec:
---

Similar to the MQTT example, you can create multiple dataflows that use the same Kafka endpoint for different topics, or the same Data Lake endpoint for different tables.
-The overall structure of a dataflow configuration for Bicep is as follows:
-
-```bicep
-bicep here
-```
-
---

<!-- TODO: link to API reference -->
@@ -103,8 +95,6 @@ To configure a source for the dataflow, specify the endpoint reference and data

# [Portal](#tab/portal)

-:::image type="content" source="media/howto-create-dataflow/dataflow-source-mqtt.png" alt-text="Screenshot using operations experience portal to select MQTT source endpoint.":::
-
### Use Asset as a source

You can use an [asset](../discover-manage-assets/overview-manage-assets.md) as the source for the dataflow. This is only available in the operations experience portal.
@@ -123,13 +113,6 @@ You can use an [asset](../discover-manage-assets/overview-manage-assets.md) as t

Configuring an asset as a source is only available in the operations experience portal.

-# [Bicep](#tab/bicep)
-
-TODO for bicep
-```bicep
-bicep here
-```
-
---

### Use MQTT as a source
@@ -189,7 +172,7 @@ To specify the schema, create the file and store it in the schema registry.
> [!NOTE]
> The only supported serialization format is JSON. The schema is optional.

-<!-- TODO: link to schema registry docs -->
+For more information about schema registry, see [Understand message schemas](concept-schema-registry.md).

#### Shared subscriptions
@@ -208,12 +191,6 @@ sourceSettings:

<!-- TODO: Details -->

-# [Bicep](#tab/bicep)
-
-```bicep
-bicep here
-```
-
---
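The shared-subscriptions section touched above could be sketched as a source setting like the following. This is an assumption-laden sketch: the `sourceSettings` field names follow the fragments in this article, and the `$shared/<group>/<topic>` form mirrors the MQTT shared-subscription convention (`$share/<group>/<topic>` in standard MQTT v5); check the broker's documented syntax.

```yaml
# Sketch: a shared subscription so multiple dataflow instances split the load
# across one topic. Group and topic names here are placeholders.
sourceSettings:
  endpointRef: default-mqtt-endpoint
  dataSources:
    - $shared/dataflow-group/thermostats/+/telemetry
```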
## Configure transformation to process data
@@ -228,7 +205,7 @@ The transformation operation is where you can transform the data from the source

In the operations experience portal, select **Dataflow** > **Add transform (optional)**.

-:::image type="content" source="media/howto-configure-dataflow-endpoint/dataflow-transform.png" alt-text="Screenshot using operations experience portal to add a transform to a dataflow.":::
+:::image type="content" source="media/howto-create-dataflow/dataflow-transform.png" alt-text="Screenshot using operations experience portal to add a transform to a dataflow.":::
@@ -260,14 +232,7 @@ Key names in the distributed state store correspond to a dataset in the dataflow

# [Portal](#tab/portal)

-1. Under **Transform (optional)**, select **Enrich** > **Add**.
-1. Choose the datapoints to include in the dataset.
-1. Select or upload a reference dataset schema.
-1. Add an enrich condition and description.
-
-   :::image type="content" source="media/howto-create-dataflow/dataflow-enrich.png" alt-text="Screenshot using operations experience portal to add an enrich transform.":::
-
-1. Select **Apply**.
+Currently, the enrich operation is not available in the operations experience portal.

# [Kubernetes](#tab/kubernetes)
@@ -297,12 +262,6 @@ The data from the source with the `deviceId` field matching `thermostat1` has th

<!-- TODO: link to API reference -->

-# [Bicep](#tab/bicep)
-
-```bicep
-bicep here
-```
-
---

You can load sample data into the DSS by using the [DSS set tool sample](https://github.com/Azure-Samples/explore-iot-operations/tree/main/samples/dss_set).
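The `deviceId`/`thermostat1` enrich behavior described in the hunk above could be expressed roughly as follows. This is a sketch under assumptions: the `datasets`, `inputs`, and `expression` field names and the `$source`/`$context` notation are illustrative, not a verified schema.

```yaml
# Sketch: enrich stage joining incoming data with a DSS dataset (names assumed).
builtInTransformationSettings:
  datasets:
    - key: assetDataset              # key of the dataset in the distributed state store
      inputs:
        - $source.deviceId           # field from the incoming message, e.g. thermostat1
        - $context(assetDataset).asset   # field from the stored reference dataset
      expression: $1 == $2           # enrich when the two fields match
```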
@@ -339,12 +298,6 @@ If the `temperature` field is greater than 20, the data is passed to the next st

<!-- TODO: link to API reference -->

-# [Bicep](#tab/bicep)
-
-```bicep
-bicep here
-```
-
---
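The `temperature`-greater-than-20 filter described above could be sketched like this; the `filter`, `inputs`, and `expression` field names are assumed from this article's other fragments, not a verified schema.

```yaml
# Sketch: filter stage passing only messages with temperature above 20 (names assumed).
builtInTransformationSettings:
  filter:
    - inputs:
        - temperature    # bound to $1 in the expression
      expression: "$1 > 20"
```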
### Map: Move data from one field to another
@@ -353,12 +306,12 @@ To map the data to another field with optional conversion, you can use the `map`

# [Portal](#tab/portal)

-In the operations experience portal, mapping is separated into **Compute**, **New property** and **Rename** transforms.
+In the operations experience portal, mapping is currently supported using **Compute** transforms.

-1. Under **Transform (optional)**, select one of the **Compute**, **New property** or **Rename** transforms and then select **Add**.
+1. Under **Transform (optional)**, select **Compute** > **Add**.
1. Enter the required fields and expressions.

-   :::image type="content" source="media/howto-create-dataflow/dataflow-map.png" alt-text="Screenshot using operations experience portal to add a map transform.":::
+   :::image type="content" source="media/howto-create-dataflow/dataflow-compute.png" alt-text="Screenshot using operations experience portal to add a compute transform.":::
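A `map` transform with an optional conversion, as described above, could be sketched as follows; the `map`, `inputs`, `expression`, and `output` field names are assumptions modeled on the other YAML fragments in this article.

```yaml
# Sketch: map stage moving a field to another field with a conversion (names assumed).
builtInTransformationSettings:
  map:
    - inputs:
        - temperature                 # bound to $1
      expression: ($1 - 32) * 5 / 9   # optional conversion, e.g. Fahrenheit to Celsius
      output: temperatureCelsius      # destination field name
```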
-To specify the schema, you can create a Schema CR with the schema definition.
+To specify the schema, you can create a Schema custom resource with the schema definition.

-<!-- TODO: link to schema registry docs -->
+For more information about schema registry, see [Understand message schemas](concept-schema-registry.md).

```json
@@ -424,12 +371,6 @@ To specify the schema, you can create a Schema CR with the schema definition.
}
```

-# [Bicep](#tab/bicep)
-
-```bicep
-bicep here
-```
-
---

Supported serialization formats are JSON, Parquet, and Delta.
@@ -442,12 +383,10 @@ To configure a destination for the dataflow, specify the endpoint reference and

1. Select the dataflow endpoint to use as the destination.

-   :::image type="content" source="media/howto-create-dataflow/dataflow-destination.png" alt-text="Screenshot using operations experience portal to select MQTT destination endpoint.":::
+   :::image type="content" source="media/howto-create-dataflow/dataflow-destination.png" alt-text="Screenshot using operations experience portal to select Event Hubs destination endpoint.":::

1. Select **Proceed** to configure the destination.
-1. Add the mapping details based on they type of destination.
-
+1. Add the mapping details based on the type of destination.

# [Kubernetes](#tab/kubernetes)
@@ -513,76 +452,12 @@ spec:

<!-- TODO: add links to examples in the reference docs -->

-# [Bicep](#tab/bicep)
-
-```bicep
-bicep here
-```
-
---

## Verify a dataflow is working

Follow [Tutorial: Bi-directional MQTT bridge to Azure Event Grid](tutorial-mqtt-bridge.md) to verify the dataflow is working.
-
-## Manage dataflows
-
-After you've created a dataflow, you can manage it using the operations experience portal or by updating the Dataflow custom resource.
-
-### Enable/disable dataflow
-
-To enable or disable a dataflow, you can use the operations experience portal or by updating the Dataflow custom resource.
-
-# [Portal](#tab/portal)
-
-:::image type="content" source="media/howto-create-dataflow/dataflow-enable.png" alt-text="Screenshot using operations experience portal to disable a dataflow.":::
-
-# [Kubernetes](#tab/kubernetes)
-
-```yaml
-spec:
-  mode: Disabled
-```
-
-# [Bicep](#tab/bicep)
-
-```bicep
-bicep here
-```
-
----
-
-### View dataflow health status and metrics
-
-You can view the health status and metrics of the dataflow in the operations experience portal.
-
-<!-- TODO: link to relevant observability docs -->
-
-### Delete dataflow
-
-To delete a dataflow, you can use the operations experience portal or by deleting the Dataflow custom resource.
-
-# [Portal](#tab/portal)
-
-Select the dataflow you want to delete and select **Delete** from the toolbar.
-
-:::image type="content" source="media/howto-create-dataflow/delete-dataflow.png" alt-text="Screenshot using operations experience portal to delete a dataflow.":::
-
-# [Kubernetes](#tab/kubernetes)
-
-```bash
-kubectl delete dataflow my-dataflow
-```
-
-# [Bicep](#tab/bicep)
-
-```bicep
-bicep here
-```
-
----

### Export dataflow configuration

To export the dataflow configuration, you can use the operations experience portal or by exporting the Dataflow custom resource.
@@ -599,11 +474,4 @@ Select the dataflow you want to export and select **Export** from the toolbar.

kubectl get dataflow my-dataflow -o yaml > my-dataflow.yaml