
Commit ac00c88

Add bicep tabs
1 parent af2de73 commit ac00c88

File tree

1 file changed: 22 additions, 7 deletions


articles/iot-operations/connect-to-cloud/howto-create-dataflow.md

Lines changed: 22 additions & 7 deletions
````diff
@@ -58,6 +58,8 @@ To create a dataflow in the operations experience portal, select **Dataflow** >
 
 :::image type="content" source="media/howto-create-dataflow/create-dataflow.png" alt-text="Screenshot using operations experience portal to create a dataflow.":::
 
+# [Bicep](#tab/bicep)
+
 # [Kubernetes](#tab/kubernetes)
 
 The overall structure of a dataflow configuration is as follows:
````
````diff
@@ -109,6 +111,8 @@ You can use an [asset](../discover-manage-assets/overview-manage-assets.md) as t
 
 1. Select **Apply** to use the asset as the source endpoint.
 
+# [Bicep](#tab/bicep)
+
 # [Kubernetes](#tab/kubernetes)
 
 Configuring an asset as a source is only available in the operations experience portal.
````
````diff
@@ -127,6 +131,8 @@ Configuring an asset as a source is only available in the operations experience
 
 1. Select **Apply**.
 
+# [Bicep](#tab/bicep)
+
 # [Kubernetes](#tab/kubernetes)
 
 For example, to configure a source using an MQTT endpoint and two MQTT topic filters, use the following configuration:
````
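The two-topic source configuration referenced by that context line sits outside this diff. A minimal sketch of the shape such a configuration takes in the article's Kubernetes YAML (the endpoint name `mq` and both topic filters here are illustrative, not from the diff):

```yaml
sourceSettings:
  endpointRef: mq        # dataflow endpoint created earlier (assumed name)
  dataSources:           # each entry is an MQTT topic filter
    - thermostats/+/telemetry/temperature/#
    - humidifiers/+/telemetry/humidity/#
```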
````diff
@@ -207,6 +213,8 @@ In the operations experience portal, select **Dataflow** > **Add transform (opti
 
 :::image type="content" source="media/howto-create-dataflow/dataflow-transform.png" alt-text="Screenshot using operations experience portal to add a transform to a dataflow.":::
 
+# [Bicep](#tab/bicep)
+
 # [Kubernetes](#tab/kubernetes)
 
 ```yaml
````
````diff
@@ -234,6 +242,8 @@ Key names in the distributed state store correspond to a dataset in the dataflow
 
 Currently, the enrich operation isn't available in the operations experience portal.
 
+# [Bicep](#tab/bicep)
+
 # [Kubernetes](#tab/kubernetes)
 
 For example, you could use the `deviceId` field in the source data to match the `asset` field in the dataset:
````
````diff
@@ -282,6 +292,8 @@ To filter the data on a condition, you can use the `filter` stage. The condition
 
 1. Select **Apply**.
 
+# [Bicep](#tab/bicep)
+
 # [Kubernetes](#tab/kubernetes)
 
 For example, you could use the `temperature` field in the source data to filter the data:
````
````diff
@@ -315,6 +327,8 @@ In the operations experience portal, mapping is currently supported using **Comp
 
 1. Select **Apply**.
 
+# [Bicep](#tab/bicep)
+
 # [Kubernetes](#tab/kubernetes)
 
 For example, you could use the `temperature` field in the source data to convert the temperature to Celsius and store it in the `temperatureCelsius` field. You could also enrich the source data with the `location` field from the contextualization dataset:
````
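The Celsius conversion and `location` enrichment mentioned in that context line are likewise not shown in the diff. A hedged sketch of what such a map stage can look like (the field names come from the surrounding text; the `map`/`inputs`/`expression`/`output` structure, the `$context` dataset key, and the expression syntax are assumptions):

```yaml
map:
  - inputs:
      - temperature
    expression: ($1 - 32) * 5/9      # Fahrenheit to Celsius
    output: temperatureCelsius
  - inputs:
      - $context(dataset).location   # hypothetical contextualization dataset key
    output: location
```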
````diff
@@ -345,6 +359,8 @@ If you want to serialize the data before sending it to the destination, you need
 
 Specify the **Output** schema when you add the destination dataflow endpoint.
 
+# [Bicep](#tab/bicep)
+
 # [Kubernetes](#tab/kubernetes)
 
 
````
````diff
@@ -388,6 +404,8 @@ To configure a destination for the dataflow, specify the endpoint reference and
 1. Select **Proceed** to configure the destination.
 1. Add the mapping details based on the type of destination.
 
+# [Bicep](#tab/bicep)
+
 # [Kubernetes](#tab/kubernetes)
 
 For example, to configure a destination using the MQTT endpoint created earlier and a static MQTT topic, use the following configuration:
````
````diff
@@ -398,13 +416,8 @@ destinationSettings:
   dataDestination: factory
 ```
 
-If you've created storage endpoints like Microsoft Fabric, use the data destination field to specify the table or container name:
-
-```yaml
-destinationSettings:
-  endpointRef: adls
-  dataDestination: telemetryTable
-```
+> [!IMPORTANT]
+> Storage endpoints like Microsoft Fabric require a schema reference. Use Bicep to specify the schema reference.
 
 ## Example
 
````
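The new IMPORTANT note points readers at Bicep for the schema reference, but the commit doesn't include a Bicep sample. A hypothetical fragment of what the destination operation might look like in the Bicep tab (only `endpointRef` and `dataDestination` come from the removed YAML; the `schemaRef` property name and its value format are assumptions to be checked against the dataflow Bicep reference):

```bicep
destinationSettings: {
  endpointRef: 'adls'
  dataDestination: 'telemetryTable'
  schemaRef: 'aio-sr://telemetrySchema:1' // assumed property and schema-registry reference format
}
```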
````diff
@@ -468,6 +481,8 @@ Select the dataflow you want to export and select **Export** from the toolbar.
 
 :::image type="content" source="media/howto-create-dataflow/dataflow-export.png" alt-text="Screenshot using operations experience portal to export a dataflow.":::
 
+# [Bicep](#tab/bicep)
+
 # [Kubernetes](#tab/kubernetes)
 
 ```bash
````
