
Commit 93e1924

Pat Altimore committed
Add CLI tab
1 parent 7c09643 commit 93e1924

1 file changed: +74 −1 lines changed


articles/iot-operations/connect-to-cloud/howto-configure-dataflow-endpoint.md

Lines changed: 74 additions & 1 deletion
@@ -6,7 +6,7 @@ ms.author: patricka
 ms.service: azure-iot-operations
 ms.subservice: azure-data-flows
 ms.topic: how-to
-ms.date: 04/03/2025
+ms.date: 06/18/2025
 
 #CustomerIntent: As an operator, I want to understand how to configure source and destination endpoints so that I can create a data flow.
 ---
@@ -63,6 +63,42 @@ For example, you can use the default MQTT broker data flow endpoint. You can use
 
 :::image type="content" source="media/howto-configure-dataflow-endpoint/create-dataflow-mq-mq.png" alt-text="Screenshot using operations experience to create a data flow from MQTT to MQTT.":::
 
+# [Azure CLI](#tab/cli)
+
+Use the `az iot ops dataflow apply` command to create or change a data flow.
+
+```azurecli
+az iot ops dataflow apply --resource-group <ResourceGroupName> --instance <AioInstanceName> --profile <DataflowProfileName> --name <DataflowName> --config-file <ConfigFilePathAndName>
+```
+
+The `--config-file` parameter is the path and file name of a JSON configuration file containing the resource properties.
+
+In this example, assume a configuration file named `data-flow.json` with the following content stored in the user's home directory:
+
+```json
+{
+  "mode": "Enabled",
+  "operations": [
+    {
+      "operationType": "Source",
+      "sourceSettings": {
+        "endpointRef": "default",
+        "dataSources": [
+          "example/topic/1"
+        ]
+      }
+    },
+    {
+      "operationType": "Destination",
+      "destinationSettings": {
+        "endpointRef": "default",
+        "dataDestination": "example/topic/2"
+      }
+    }
+  ]
+}
+```
+
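With this configuration saved as `~/data-flow.json`, a concrete invocation of the command might look like the following sketch; the resource group, instance, and data flow names are illustrative placeholders, and the built-in `default` data flow profile is assumed:

```azurecli
# Sketch only: the resource group, instance, and data flow names below are placeholders.
az iot ops dataflow apply \
  --resource-group my-resource-group \
  --instance my-aio-instance \
  --profile default \
  --name mqtt-to-mqtt \
  --config-file ~/data-flow.json
```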
 # [Bicep](#tab/bicep)
 
 ```bicep
@@ -127,6 +163,43 @@ Similarly, you can create multiple data flows that use the same MQTT endpoint fo
 
 :::image type="content" source="media/howto-configure-dataflow-endpoint/create-dataflow-mq-kafka.png" alt-text="Screenshot using operations experience to create a data flow from MQTT to Kafka.":::
 
+# [Azure CLI](#tab/cli)
+
+Use the `az iot ops dataflow apply` command to create or change a data flow.
+
+```azurecli
+az iot ops dataflow apply --resource-group <ResourceGroupName> --instance <AioInstanceName> --profile <DataflowProfileName> --name <DataflowName> --config-file <ConfigFilePathAndName>
+```
+
+The `--config-file` parameter is the path and file name of a JSON configuration file containing the resource properties.
+
+In this example, assume a configuration file named `data-flow.json` with the following content stored in the user's home directory:
+
+```json
+{
+  "mode": "Enabled",
+  "operations": [
+    {
+      "operationType": "Source",
+      "sourceSettings": {
+        "endpointRef": "default",
+        "dataSources": [
+          "example/topic/3"
+        ]
+      }
+    },
+    {
+      "operationType": "Destination",
+      "destinationSettings": {
+        // The endpoint needs to be created before you can reference it here
+        "endpointRef": "example-event-hub-endpoint",
+        "dataDestination": "example/topic/4"
+      }
+    }
+  ]
+}
+```
+
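Assuming the same file name and location, a concrete invocation might look like the following sketch; the resource group, instance, and data flow names are again placeholders, and the `example-event-hub-endpoint` referenced in the configuration must already exist in the instance:

```azurecli
# Sketch only: resource names below are placeholders; the referenced data flow
# endpoint (example-event-hub-endpoint) must be created before running this.
az iot ops dataflow apply \
  --resource-group my-resource-group \
  --instance my-aio-instance \
  --profile default \
  --name mqtt-to-event-hub \
  --config-file ~/data-flow.json
```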
 # [Bicep](#tab/bicep)
 
 ```bicep
