Commit 3dd1add

Update screenshots

1 parent b07241f commit 3dd1add

20 files changed (+11 lines, -146 lines)

articles/iot-operations/connect-to-cloud/howto-configure-dataflow-endpoint.md

Lines changed: 1 addition & 4 deletions
@@ -64,7 +64,7 @@ spec:
 
 ---
 
-Similarly, you can create multiple dataflows that use the same MQTT endpoint for different topics.
+Similarly, you can create multiple dataflows that use the same MQTT endpoint for other endpoints and topics. For example, you can use the same MQTT endpoint for a dataflow that sends data to Kafka.
 
 # [Portal](#tab/portal)
 
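For illustration, a sketch of the reuse pattern the new sentence describes: two dataflows pointing at one MQTT endpoint, one of them landing in Kafka. The `endpointRef`, `dataSources`, and `destinationSettings` field names and every endpoint name below are assumptions for illustration, not values from this commit.

```yaml
# Sketch only: a dataflow referencing a shared MQTT endpoint (all names assumed).
sourceSettings:
  endpointRef: mq-endpoint          # shared MQTT dataflow endpoint
  dataSources:
    - factory/line1/telemetry      # example topic filter
destinationSettings:
  endpointRef: kafka-endpoint       # hypothetical Kafka endpoint
```

A second dataflow would repeat the same `sourceSettings.endpointRef` with a different `dataSources` topic filter and its own destination.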
@@ -95,6 +95,3 @@ spec:
 ---
 
 Similar to the MQTT example, you can create multiple dataflows that use the same Kafka endpoint for different topics, or the same Data Lake endpoint for different tables.
-
-
----
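The Data Lake half of that sentence follows the same shape; using `dataDestination` as the table selector is likewise an assumption for illustration.

```yaml
# Sketch only: two dataflows could share one Data Lake endpoint, different tables.
destinationSettings:
  endpointRef: datalake-endpoint    # shared Data Lake dataflow endpoint (name assumed)
  dataDestination: telemetryTable   # a second dataflow could write to alarmsTable
```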

articles/iot-operations/connect-to-cloud/howto-create-dataflow.md

Lines changed: 10 additions & 142 deletions
Original file line numberDiff line numberDiff line change
@@ -83,14 +83,6 @@ spec:
   # See destination configuration section
 ```
 
-# [Bicep](#tab/bicep)
-
-The overall structure of a dataflow configuration for Bicep is as follows:
-
-```bicep
-bicep here
-```
-
 ---
 
 <!-- TODO: link to API reference -->
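The context lines above are the tail of the Kubernetes skeleton that survives this hunk. For orientation, that skeleton plausibly looks like the following; the `apiVersion`, the `operationType` values, and the settings block names are assumptions reconstructed from the section comments visible in the hunk, not confirmed by the diff.

```yaml
apiVersion: connectivity.iotoperations.azure.com/v1beta1  # assumed version
kind: Dataflow
metadata:
  name: my-dataflow
spec:
  operations:
    - operationType: Source
      sourceSettings: {}                  # See source configuration section
    - operationType: BuiltInTransformation
      builtInTransformationSettings: {}   # See transformation configuration section
    - operationType: Destination
      destinationSettings: {}             # See destination configuration section
```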
@@ -103,8 +95,6 @@ To configure a source for the dataflow, specify the endpoint reference and data
 
 # [Portal](#tab/portal)
 
-:::image type="content" source="media/howto-create-dataflow/dataflow-source-mqtt.png" alt-text="Screenshot using operations experience portal to select MQTT source endpoint.":::
-
 ### Use Asset as a source
 
 You can use an [asset](../discover-manage-assets/overview-manage-assets.md) as the source for the dataflow. This is only available in the operations experience portal.
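Per the hunk header's instruction to "specify the endpoint reference and data source," a minimal Kubernetes-tab source fragment might read as follows; the field names and topic filter are illustrative assumptions.

```yaml
sourceSettings:
  endpointRef: mq-source            # reference to an MQTT dataflow endpoint (name assumed)
  dataSources:
    - thermostats/+/telemetry/#     # example MQTT topic filter
```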
@@ -123,13 +113,6 @@ You can use an [asset](../discover-manage-assets/overview-manage-assets.md) as t
 
 Configuring an asset as a source is only available in the operations experience portal.
 
-# [Bicep](#tab/bicep)
-
-TODO for bicep
-```bicep
-bicep here
-```
-
 ---
 
 ### Use MQTT as a source
@@ -189,7 +172,7 @@ To specify the schema, create the file and store it in the schema registry.
 > [!NOTE]
 > The only supported serialization format is JSON. The schema is optional.
 
-<!-- TODO: link to schema registry docs -->
+For more information about schema registry, see [Understand message schemas](concept-schema-registry.md).
 
 #### Shared subscriptions
 
@@ -208,12 +191,6 @@ sourceSettings:
 
 <!-- TODO: Details -->
 
-# [Bicep](#tab/bicep)
-
-```bicep
-bicep here
-```
-
 ---
 
 ## Configure transformation to process data
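The shared-subscriptions detail above is still marked TODO in the diff. MQTT v5 shared subscriptions use the `$shared/<GROUP>/<TOPIC>` form, so a plausible sketch (field names assumed as before) is:

```yaml
sourceSettings:
  endpointRef: mq-source
  dataSources:
    - $shared/dataflow-group/thermostats/+/telemetry/#   # shared subscription group (example)
```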
@@ -228,7 +205,7 @@ The transformation operation is where you can transform the data from the source
 
 In the operations experience portal, select **Dataflow** > **Add transform (optional)**.
 
-:::image type="content" source="media/howto-configure-dataflow-endpoint/dataflow-transform.png" alt-text="Screenshot using operations experience portal to add a transform to a dataflow.":::
+:::image type="content" source="media/howto-create-dataflow/dataflow-transform.png" alt-text="Screenshot using operations experience portal to add a transform to a dataflow.":::
 
 # [Kubernetes](#tab/kubernetes)
 
@@ -244,11 +221,6 @@ builtInTransformationSettings:
 
 <!-- TODO: link to API reference -->
 
-# [Bicep](#tab/bicep)
-
-```bicep
-bicep here
-```
 
 ---
 

@@ -260,14 +232,7 @@ Key names in the distributed state store correspond to a dataset in the dataflow
 
 # [Portal](#tab/portal)
 
-1. Under **Transform (optional)**, select **Enrich** > **Add**.
-1. Choose the datapoints to include in the dataset.
-1. Select or upload a reference dataset schema.
-1. Add an enrich condition and description.
-
-:::image type="content" source="media/howto-create-dataflow/dataflow-enrich.png" alt-text="Screenshot using operations experience portal to add an enrich transform.":::
-
-1. Select **Apply**.
+Currently, the enrich operation is not available in the operations experience portal.
 
 # [Kubernetes](#tab/kubernetes)
 
@@ -297,12 +262,6 @@ The data from the source with the `deviceId` field matching `thermostat1` has th
 
 <!-- TODO: link to API reference -->
 
-# [Bicep](#tab/bicep)
-
-```bicep
-bicep here
-```
-
 ---
 
 You can load sample data into the DSS by using the [DSS set tool sample](https://github.com/Azure-Samples/explore-iot-operations/tree/main/samples/dss_set).
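The hunk header above refers to matching the source's `deviceId` field against `thermostat1` from a distributed state store (DSS) dataset. A sketch of such a condition under `builtInTransformationSettings` follows; the `datasets`, `inputs`, `$source`/`$context`, and `expression` syntax is an assumption for illustration.

```yaml
builtInTransformationSettings:
  datasets:
    - key: thermostatDataset                     # DSS key for the reference dataset (assumed)
      inputs:
        - $source.deviceId                       # field on the incoming message
        - $context(thermostatDataset).deviceId   # field on the stored dataset
      expression: $1 == $2                       # enrich when the IDs match, e.g. thermostat1
```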
@@ -339,12 +298,6 @@ If the `temperature` field is greater than 20, the data is passed to the next st
 
 <!-- TODO: link to API reference -->
 
-# [Bicep](#tab/bicep)
-
-```bicep
-bicep here
-```
-
 ---
 
 ### Map: Move data from one field to another
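The filter hunk's header says data passes to the next stage when `temperature` exceeds 20. A sketch, assuming a `filter` list with `inputs` and an `expression`:

```yaml
builtInTransformationSettings:
  filter:
    - inputs:
        - temperature        # bound to $1 in the expression (assumed binding)
      expression: "$1 > 20"  # the message continues only when this is true
```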
@@ -353,12 +306,12 @@ To map the data to another field with optional conversion, you can use the `map`
 
 # [Portal](#tab/portal)
 
-In the operations experience portal, mapping is separated into **Compute**, **New property** and **Rename** transforms.
+In the operations experience portal, mapping is currently supported using the **Compute** transform.
 
-1. Under **Transform (optional)**, select one of the **Compute**, **New property** or **Rename** transforms and then select **Add**.
+1. Under **Transform (optional)**, select **Compute** > **Add**.
 1. Enter the required fields and expressions.
 
-:::image type="content" source="media/howto-create-dataflow/dataflow-map.png" alt-text="Screenshot using operations experience portal to add a map transform.":::
+:::image type="content" source="media/howto-create-dataflow/dataflow-compute.png" alt-text="Screenshot using operations experience portal to add a compute transform.":::
 
 1. Select **Apply**.
 
@@ -380,12 +333,6 @@ builtInTransformationSettings:
 
 <!-- TODO: link to API reference -->
 
-# [Bicep](#tab/bicep)
-
-```bicep
-bicep here
-```
-
 ---
 
 To learn more, see [Map data by using dataflows](concept-dataflow-mapping.md) and [Convert data by using dataflows](concept-dataflow-conversions.md).
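A sketch of the `map` stage this section describes, assuming each entry takes `inputs`, an optional conversion `expression`, and an `output` field:

```yaml
builtInTransformationSettings:
  map:
    - inputs:
        - temperature                 # $1, the source field
      expression: ($1 - 32) * 5 / 9   # optional conversion (example: degrees F to C)
      output: temperatureCelsius      # destination field
```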
@@ -407,9 +354,9 @@ builtInTransformationSettings:
   schemaRef: aio-sr://<NAMESPACE>/<SCHEMA>:<VERSION>
 ```
 
-To specify the schema, you can create a Schema CR with the schema definition.
+To specify the schema, you can create a Schema custom resource with the schema definition.
 
-<!-- TODO: link to schema registry docs -->
+For more information about schema registry, see [Understand message schemas](concept-schema-registry.md).
 
 
 ```json
@@ -424,12 +371,6 @@ To specify the schema, you can create a Schema CR with the schema definition.
 }
 ```
 
-# [Bicep](#tab/bicep)
-
-```bicep
-bicep here
-```
-
 ---
 
 Supported serialization formats are JSON, Parquet, and Delta.
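Connecting the surviving `schemaRef` context line to the serialization note just above: a sketch of an output-schema fragment, assuming a `serializationFormat` field sits alongside `schemaRef`:

```yaml
builtInTransformationSettings:
  serializationFormat: Parquet                          # assumed field name; JSON, Parquet, and Delta are supported
  schemaRef: aio-sr://exampleNamespace/exampleSchema:1  # example values in the documented aio-sr:// form
```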
@@ -442,12 +383,10 @@ To configure a destination for the dataflow, specify the endpoint reference and
 
 1. Select the dataflow endpoint to use as the destination.
 
-:::image type="content" source="media/howto-create-dataflow/dataflow-destination.png" alt-text="Screenshot using operations experience portal to select MQTT destination endpoint.":::
+:::image type="content" source="media/howto-create-dataflow/dataflow-destination.png" alt-text="Screenshot using operations experience portal to select Event Hubs destination endpoint.":::
 
 1. Select **Proceed** to configure the destination.
-1. Add the mapping details based on they type of destination.
-
-
+1. Add the mapping details based on the type of destination.
 
 # [Kubernetes](#tab/kubernetes)
 
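Mirroring the source fragment, a minimal Kubernetes-tab destination for the Event Hubs endpoint now shown in the screenshot; both field values are invented for illustration.

```yaml
destinationSettings:
  endpointRef: event-hubs-target     # hypothetical Event Hubs dataflow endpoint
  dataDestination: factory-telemetry # assumed to select the target event hub, topic, or table
```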
@@ -513,76 +452,12 @@ spec:
 
 <!-- TODO: add links to examples in the reference docs -->
 
-# [Bicep](#tab/bicep)
-
-```bicep
-bicep here
-```
-
 ---
 
 ## Verify a dataflow is working
 
 Follow [Tutorial: Bi-directional MQTT bridge to Azure Event Grid](tutorial-mqtt-bridge.md) to verify the dataflow is working.
 
-
-## Manage dataflows
-
-After you've created a dataflow, you can manage it using the operations experience portal or by updating the Dataflow CR.
-
-### Enable/disable dataflow
-
-To enable or disable a dataflow, you can use the operations experience portal or by updating the Dataflow custom resource.
-
-# [Portal](#tab/portal)
-
-:::image type="content" source="media/howto-create-dataflow/dataflow-enable.png" alt-text="Screenshot using operations experience portal to disable a dataflow.":::
-
-# [Kubernetes](#tab/kubernetes)
-
-```yaml
-spec:
-  mode: Disabled
-```
-
-# [Bicep](#tab/bicep)
-
-```bicep
-bicep here
-```
-
----
-
-### View dataflow health status and metrics
-
-You can view the health status and metrics of the dataflow in the operations experience portal.
-
-<!-- TODO: link to relevant observability docs -->
-
-### Delete dataflow
-
-To delete a dataflow, you can use the operations experience portal or by deleting the Dataflow custom resource.
-
-# [Portal](#tab/portal)
-
-Select the dataflow you want to delete and select **Delete** from the toolbar.
-
-:::image type="content" source="media/howto-create-dataflow/delete-dataflow.png" alt-text="Screenshot using operations experience portal to delete a dataflow.":::
-
-# [Kubernetes](#tab/kubernetes)
-
-```bash
-kubectl delete dataflow my-dataflow
-```
-
-# [Bicep](#tab/bicep)
-
-```bicep
-bicep here
-```
-
----
-
 ### Export dataflow configuration
 
 To export the dataflow configuration, you can use the operations experience portal or by exporting the Dataflow custom resource.
@@ -599,11 +474,4 @@ Select the dataflow you want to export and select **Export** from the toolbar.
 kubectl get dataflow my-dataflow -o yaml > my-dataflow.yaml
 ```
 
-# [Bicep](#tab/bicep)
-
-TODO for bicep
-```bicep
-bicep here
-```
-
 ---
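The export hunk keeps the `kubectl get dataflow ... -o yaml` path; the exported manifest can later be re-applied with standard kubectl, for example:

```bash
# Recreate or update the dataflow from the exported manifest
kubectl apply -f my-dataflow.yaml
```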
Binary image files changed (previews not shown): -116 KB, -119 KB, 8.68 KB.
