**articles/iot-operations/connect-to-cloud/howto-configure-adlsv2-endpoint.md** (7 additions, 7 deletions)

```diff
@@ -41,7 +41,7 @@ Then, go to the Azure Storage account > **Access control (IAM)** > **Add role as
 
 ## Create data flow endpoint for Azure Data Lake Storage Gen2
 
-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)
 
 1. In the IoT Operations portal, select the **Data flow endpoints** tab.
 1. Under **Create new data flow endpoint**, select **Azure Data Lake Storage (2nd generation)** > **New**.
@@ -137,7 +137,7 @@ Follow the steps in the [access token](#access-token) section to get a SAS token
 
 Then, create the *DataflowEndpoint* resource and specify the access token authentication method. Here, replace `<SAS_SECRET_NAME>` with name of the secret containing the SAS token and other placeholder values.
 
-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)
 
 See the [access token](#access-token) section for steps to create a secret in the operations experience web UI.
 
@@ -229,7 +229,7 @@ Before you configure the data flow endpoint, assign a role to the Azure IoT Oper
 
 Then, configure the data flow endpoint with system-assigned managed identity settings.
 
-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)
 
 In the operations experience data flow endpoint settings page, select the **Basic** tab then choose **Authentication method** > **System assigned managed identity**.
 
@@ -259,7 +259,7 @@ dataLakeStorageSettings:
 
 If you need to override the system-assigned managed identity audience, you can specify the `audience` setting.
 
-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)
 
 In most cases, you don't need to specify a service audience. Not specifying an audience creates a managed identity with the default audience scoped to your storage account.
 
@@ -300,7 +300,7 @@ Before you configure the data flow endpoint, assign a role to the user-assigned
 
 Then, configure the data flow endpoint with user-assigned managed identity settings.
 
-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)
 
 In the operations experience data flow endpoint settings page, select the **Basic** tab then choose **Authentication method** > **User assigned managed identity**.
 
@@ -353,7 +353,7 @@ Get a [SAS token](../../storage/common/storage-sas-overview.md) for an Azure Dat
 
 To enhance security and follow the principle of least privilege, you can generate a SAS token for a specific container. To prevent authentication errors, ensure that the container specified in the SAS token matches the data flow destination setting in the configuration.
 
-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)
 
 > [!IMPORTANT]
 > To use the operations experience web UI to manage secrets, Azure IoT Operations must first be enabled with secure settings by configuring an Azure Key Vault and enabling workload identities. To learn more, see [Enable secure settings in Azure IoT Operations deployment](../deploy-iot-ops/howto-enable-secure-settings.md).
@@ -420,7 +420,7 @@ Use the `batching` settings to configure the maximum number of messages and the
 
 For example, to configure the maximum number of messages to 1000 and the maximum latency to 100 seconds, use the following settings:
 
-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)
 
 In the operations experience, select the **Advanced** tab for the data flow endpoint.
```
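The last hunk references `batching` settings for 1000 messages and 100 seconds of latency, but the diff doesn't show the YAML itself. As a sketch of what that fragment looks like in the *DataflowEndpoint* resource (property names assumed from the Azure IoT Operations data flow schema, not confirmed by this diff):

```yaml
dataLakeStorageSettings:
  # A batch is flushed when either limit is reached, whichever comes first.
  batching:
    latencySeconds: 100   # maximum seconds to wait before sending a batch
    maxMessages: 1000     # maximum number of messages per batch
```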
**articles/iot-operations/connect-to-cloud/howto-configure-adx-endpoint.md** (6 additions, 6 deletions)

```diff
@@ -6,7 +6,7 @@ ms.author: patricka
 ms.service: azure-iot-operations
 ms.subservice: azure-data-flows
 ms.topic: how-to
-ms.date: 11/04/2024
+ms.date: 04/03/2025
 ai-usage: ai-assisted
 
 #CustomerIntent: As an operator, I want to understand how to configure data flow endpoints for Azure Data Explorer in Azure IoT Operations so that I can send data to Azure Data Explorer.
@@ -65,7 +65,7 @@ If using system-assigned managed identity, in Azure portal, go to your Azure IoT
 
 <!-- TODO: use the data ingest URI for host? -->
 
-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)
 
 1. In the operations experience, select the **Data flow endpoints** tab.
 1. Under **Create new data flow endpoint**, select **Azure Data Explorer** > **New**.
@@ -172,7 +172,7 @@ Before you configure the data flow endpoint, assign a role to the Azure IoT Oper
 
 Then, configure the data flow endpoint with system-assigned managed identity settings.
 
-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)
 
 In the operations experience data flow endpoint settings page, select the **Basic** tab then choose **Authentication method** > **System assigned managed identity**.
 
@@ -201,7 +201,7 @@ dataExplorerSettings:
 If you need to override the system-assigned managed identity audience, you can specify the `audience` setting.
 
 
-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)
 
 In most cases, you don't need to specify other settings. This configuration creates a managed identity with the default audience `https://api.kusto.windows.net`.
 
@@ -242,7 +242,7 @@ Before you configure the data flow endpoint, assign a role to the user-assigned
 
 Then, configure the data flow endpoint with user-assigned managed identity settings.
 
-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)
 
 In the operations experience data flow endpoint settings page, select the **Basic** tab then choose **Authentication method** > **User assigned managed identity**.
 
@@ -294,7 +294,7 @@ Use the `batching` settings to configure the maximum number of messages and the
 
 For example, to configure the maximum number of messages to 1000 and the maximum latency to 100 seconds, use the following settings:
 
-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)
 
 In the operations experience, select the **Advanced** tab for the data flow endpoint.
```
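One of the hunks above mentions overriding the system-assigned managed identity `audience` for the Azure Data Explorer endpoint (default `https://api.kusto.windows.net`). A rough sketch of that fragment in the *DataflowEndpoint* resource, with property names assumed from the Azure IoT Operations schema rather than shown in this diff:

```yaml
dataExplorerSettings:
  authentication:
    method: SystemAssignedManagedIdentity
    systemAssignedManagedIdentitySettings:
      # Only needed to override the default https://api.kusto.windows.net audience.
      audience: https://<AUDIENCE_URL>
```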
**articles/iot-operations/connect-to-cloud/howto-configure-dataflow-endpoint.md** (3 additions, 3 deletions)

```diff
@@ -6,7 +6,7 @@ ms.author: patricka
 ms.service: azure-iot-operations
 ms.subservice: azure-data-flows
 ms.topic: how-to
-ms.date: 11/01/2024
+ms.date: 04/03/2025
 
 #CustomerIntent: As an operator, I want to understand how to configure source and destination endpoints so that I can create a data flow.
 ---
@@ -59,7 +59,7 @@ To make it easier to reuse endpoints, the MQTT or Kafka topic filter isn't part
 
 For example, you can use the default MQTT broker data flow endpoint. You can use it for both the source and destination with different topic filters:
 
-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)
 
 :::image type="content" source="media/howto-configure-dataflow-endpoint/create-dataflow-mq-mq.png" alt-text="Screenshot using operations experience to create a data flow from MQTT to MQTT.":::
 
@@ -123,7 +123,7 @@ spec:
 
 Similarly, you can create multiple data flows that use the same MQTT endpoint for other endpoints and topics. For example, you can use the same MQTT endpoint for a data flow that sends data to an Event Hubs endpoint.
 
-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)
 
 :::image type="content" source="media/howto-configure-dataflow-endpoint/create-dataflow-mq-kafka.png" alt-text="Screenshot using operations experience to create a data flow from MQTT to Kafka.":::
```
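The hunks above describe reusing the default MQTT broker endpoint for both source and destination with different topic filters. A sketch of such a *Dataflow* resource (the `apiVersion`, resource names, and topic filters here are illustrative assumptions, not taken from this diff):

```yaml
apiVersion: connectivity.iotoperations.azure.com/v1
kind: Dataflow
metadata:
  name: broker-to-broker
  namespace: azure-iot-operations
spec:
  profileRef: default
  operations:
    # Same MQTT endpoint on both sides; only the topic filters differ.
    - operationType: Source
      sourceSettings:
        endpointRef: default
        dataSources:
          - thermostats/+/telemetry/#
    - operationType: Destination
      destinationSettings:
        endpointRef: default
        dataDestination: factory/telemetry
```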
**articles/iot-operations/connect-to-cloud/howto-configure-fabric-endpoint.md** (7 additions, 7 deletions)

```diff
@@ -6,7 +6,7 @@ ms.author: patricka
 ms.service: azure-iot-operations
 ms.subservice: azure-data-flows
 ms.topic: how-to
-ms.date: 11/11/2024
+ms.date: 04/03/2025
 ai-usage: ai-assisted
 
 #CustomerIntent: As an operator, I want to understand how to configure data flow endpoints for Microsoft Fabric OneLake in Azure IoT Operations so that I can send data to Microsoft Fabric OneLake.
@@ -43,7 +43,7 @@ Go to Microsoft Fabric workspace you created, select **Manage access** > **+ Add
 
 ## Create data flow endpoint for Microsoft Fabric OneLake
 
-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)
 
 1. In the operations experience, select the **Data flow endpoints** tab.
 1. Under **Create new data flow endpoint**, select **Microsoft Fabric OneLake** > **New**.
@@ -149,7 +149,7 @@ kubectl apply -f <FILE>.yaml
 
 The `oneLakePathType` setting determines the type of path to use in the OneLake path. The default value is `Tables`, which is the recommended path type for the most common use cases. The `Tables` path type is a table in the OneLake lakehouse that is used to store the data. It can also be set as `Files`, which is a file in the OneLake lakehouse that is used to store the data. The `Files` path type is useful when you want to store the data in a file format that isn't supported by the `Tables` path type.
 
-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)
 
 The OneLake path type is set in the **Basic** tab for the data flow endpoint.
 
@@ -186,7 +186,7 @@ Before you configure the data flow endpoint, assign a role to the Azure IoT Oper
 
 Then, configure the data flow endpoint with system-assigned managed identity settings.
 
-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)
 
 In the operations experience data flow endpoint settings page, select the **Basic** tab then choose **Authentication method** > **System assigned managed identity**.
 
@@ -215,7 +215,7 @@ fabricOneLakeSettings:
 
 If you need to override the system-assigned managed identity audience, you can specify the `audience` setting.
 
-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)
 
 In most cases, you don't need to specify a service audience. Not specifying an audience creates a managed identity with the default audience scoped to your storage account.
 
@@ -256,7 +256,7 @@ Before you configure the data flow endpoint, assign a role to the user-assigned
 
 Then, configure the data flow endpoint with user-assigned managed identity settings.
 
-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)
 
 In the operations experience data flow endpoint settings page, select the **Basic** tab then choose **Authentication method** > **User assigned managed identity**.
 
@@ -312,7 +312,7 @@ Use the `batching` settings to configure the maximum number of messages and the
 
 For example, to configure the maximum number of messages to 1000 and the maximum latency to 100 seconds, use the following settings:
 
-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)
 
 In the operations experience, select the **Advanced** tab for the data flow endpoint.
```
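The `oneLakePathType` discussion in the hunks above maps to a small fragment of the *DataflowEndpoint* resource. A sketch, with property names assumed from the Azure IoT Operations OneLake endpoint schema (the workspace and lakehouse placeholders are illustrative):

```yaml
fabricOneLakeSettings:
  # Tables (the default) writes to a lakehouse table; Files writes raw files
  # for formats the Tables path type doesn't support.
  oneLakePathType: Tables
  names:
    workspaceName: <WORKSPACE_NAME>
    lakehouseName: <LAKEHOUSE_NAME>
```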
**articles/iot-operations/connect-to-cloud/howto-configure-fabric-real-time-intelligence.md** (2 additions, 2 deletions)

```diff
@@ -6,7 +6,7 @@ ms.author: patricka
 ms.service: azure-iot-operations
 ms.subservice: azure-data-flows
 ms.topic: how-to
-ms.date: 10/30/2024
+ms.date: 04/03/2025
 ai-usage: ai-assisted
 
 #CustomerIntent: As an operator, I want to understand how to configure data flow endpoints for Microsoft Fabric Real-Time Intelligence in Azure IoT Operations so that I can send real-time data to Microsoft Fabric.
@@ -47,7 +47,7 @@ To configure a data flow endpoint for Microsoft Fabric Real-Time Intelligence, y
 
 Azure Key Vault is the recommended way to sync the connection string to the Kubernetes cluster so that it can be referenced in the data flow. [Secure settings](../deploy-iot-ops/howto-enable-secure-settings.md) must be enabled to configure this endpoint using the operations experience web UI.
 
-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)
 
 1. In the IoT Operations portal, select the **Data flow endpoints** tab.
 1. Under **Create new data flow endpoint**, select **Microsoft Fabric Real-Time Intelligence** > **New**.
```
**articles/iot-operations/connect-to-cloud/howto-configure-kafka-endpoint.md** (19 additions, 19 deletions)

```diff
@@ -6,7 +6,7 @@ ms.author: patricka
 ms.service: azure-iot-operations
 ms.subservice: azure-data-flows
 ms.topic: how-to
-ms.date: 11/07/2024
+ms.date: 04/03/2025
 ai-usage: ai-assisted
 
 #CustomerIntent: As an operator, I want to understand how to configure data flow endpoints for Kafka in Azure IoT Operations so that I can send data to and from Kafka endpoints.
@@ -51,7 +51,7 @@ Then, go to the Event Hubs namespace > **Access control (IAM)** > **Add role ass
 
 Once the Azure Event Hubs namespace and event hub is configured, you can create a data flow endpoint for the Kafka-enabled Azure Event Hubs namespace.
 
-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)
 
 1. In the [operations experience](https://iotoperations.azure.com/), select the **Data flow endpoints** tab.
 1. Under **Create new data flow endpoint**, select **Azure Event Hubs** > **New**.
@@ -148,7 +148,7 @@ kubectl apply -f <FILE>.yaml
 
 #### Use connection string for authentication to Event Hubs
 
-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)
 
 > [!IMPORTANT]
 > To use the operations experience web UI to manage secrets, Azure IoT Operations must first be enabled with secure settings by configuring an Azure Key Vault and enabling workload identities. To learn more, see [Enable secure settings in Azure IoT Operations deployment](../deploy-iot-ops/howto-enable-secure-settings.md).
@@ -227,7 +227,7 @@ Azure Event Hubs [doesn't support all the compression types that Kafka supports]
 
 To configure a data flow endpoint for non-Event-Hub Kafka brokers, set the host, TLS, authentication, and other settings as needed.
 
-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)
 
 1. In the [operations experience](https://iotoperations.azure.com/), select the **Data flow endpoints** tab.
 1. Under **Create new data flow endpoint**, select **Custom Kafka Broker** > **New**.
@@ -322,7 +322,7 @@ Before you configure the data flow endpoint, assign a role to the Azure IoT Oper
 
 Then, configure the data flow endpoint with system-assigned managed identity settings.
 
-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)
 
 In the operations experience data flow endpoint settings page, select the **Basic** tab then choose **Authentication method** > **System assigned managed identity**.
 
@@ -351,7 +351,7 @@ kafkaSettings:
 
 This configuration creates a managed identity with the default audience, which is the same as the Event Hubs namespace host value in the form of `https://<NAMESPACE>.servicebus.windows.net`. However, if you need to override the default audience, you can set the `audience` field to the desired value.
 
-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)
 
 Not supported in the operations experience.
 
@@ -392,7 +392,7 @@ Before you configure the data flow endpoint, assign a role to the user-assigned
 
 Then, configure the data flow endpoint with user-assigned managed identity settings.
 
-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)
 
 In the operations experience data flow endpoint settings page, select the **Basic** tab then choose **Authentication method** > **User assigned managed identity**.
 
@@ -436,7 +436,7 @@ Here, the scope is the audience of the managed identity. The default value is th
 
 To use SASL for authentication, specify the SASL authentication method and configure SASL type and a secret reference with the name of the secret that contains the SASL token.
 
-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)
 
 In the operations experience data flow endpoint settings page, select the **Basic** tab then choose **Authentication method** > **SASL**.
 
@@ -494,7 +494,7 @@ The secret must be in the same namespace as the Kafka data flow endpoint. The se
 
 To use anonymous authentication, update the authentication section of the Kafka settings to use the Anonymous method.
 
-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)
 
 In the operations experience data flow endpoint settings page, select the **Basic** tab then choose **Authentication method** > **None**.
 
@@ -524,7 +524,7 @@ kafkaSettings:
 
 You can set advanced settings for the Kafka data flow endpoint such as TLS, trusted CA certificate, Kafka messaging settings, batching, and CloudEvents. You can set these settings in the data flow endpoint **Advanced** portal tab or within the data flow endpoint resource.
 
-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)
 
 In the operations experience, select the **Advanced** tab for the data flow endpoint.
 
@@ -585,7 +585,7 @@ kafkaSettings:
 
 To enable or disable TLS for the Kafka endpoint, update the `mode` setting in the TLS settings.
 
-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)
 
 In the operations experience data flow endpoint settings page, select the **Advanced** tab then use the checkbox next to **TLS mode enabled**.
 
@@ -615,7 +615,7 @@ The TLS mode can be set to `Enabled` or `Disabled`. If the mode is set to `Enabl
 
 Configure the trusted CA certificate for the Kafka endpoint to establish a secure connection to the Kafka broker. This setting is important if the Kafka broker uses a self-signed certificate or a certificate signed by a custom CA that isn't trusted by default.
 
-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)
 
 In the operations experience data flow endpoint settings page, select the **Advanced** tab then use the **Trusted CA certificate config map** field to specify the ConfigMap containing the trusted CA certificate.
 
@@ -655,7 +655,7 @@ The consumer group ID is used to identify the consumer group that the data flow 
 > [!IMPORTANT]
 > When the Kafka endpoint is used as [source](howto-create-dataflow.md#source), the consumer group ID is required. Otherwise, the data flow can't read messages from the Kafka topic, and you get an error "Kafka type source endpoints must have a consumerGroupId defined".
 
-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)
 
 In the operations experience data flow endpoint settings page, select the **Advanced** tab then use the **Consumer group ID** field to specify the consumer group ID.
 
@@ -692,7 +692,7 @@ The compression field enables compression for the messages sent to Kafka topics.
 
 To configure compression:
 
-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)
 
 In the operations experience data flow endpoint settings page, select the **Advanced** tab then use the **Compression** field to specify the compression type.
 
@@ -730,7 +730,7 @@ For example, if you set latencyMs to 1000, maxMessages to 100, and maxBytes to 1
 
 To configure batching:
 
-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)
 
 In the operations experience data flow endpoint settings page, select the **Advanced** tab then use the **Batching enabled** field to enable batching. Use the **Batching latency**, **Maximum bytes**, and **Message count** fields to specify the batching settings.
 
@@ -781,7 +781,7 @@ For example, if you set the partition handling strategy to `Property` and the pa
 
 To configure the partition handling strategy:
 
-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)
 
 In the operations experience data flow endpoint settings page, select the **Advanced** tab then use the **Partition handling strategy** field to specify the partition handling strategy. Use the **Partition key property** field to specify the property used for partitioning if the strategy is set to `Property`.
 
@@ -821,7 +821,7 @@ For example, if you set the Kafka acknowledgment to `All`, the data flow waits f
 
 To configure the Kafka acknowledgments:
 
-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)
 
 In the operations experience data flow endpoint settings page, select the **Advanced** tab then use the **Kafka acknowledgment** field to specify the Kafka acknowledgment level.
 
@@ -848,7 +848,7 @@ This setting only takes effect if the endpoint is used as a destination where th
 
 By default, the copy MQTT properties setting is enabled. These user properties include values such as `subject` that stores the name of the asset sending the message.
 
-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)
 
 In the operations experience data flow endpoint settings page, select the **Advanced** tab then use checkbox next to **Copy MQTT properties** field to enable or disable copying MQTT properties.
 
@@ -994,7 +994,7 @@ Not all event data properties including propertyEventData.correlationId are forw
 
 The `CloudEventAttributes` options are `Propagate` or`CreateOrRemap`.
 
-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)
 
 In the operations experience data flow endpoint settings page, select the **Advanced** tab then use the **Cloud event attributes** field to specify the CloudEvents setting.
```
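Several of the Kafka hunks above reference the SASL authentication method (SASL type plus a secret reference holding the SASL token). A sketch of that fragment in the *DataflowEndpoint* resource, with property names assumed from the Azure IoT Operations Kafka endpoint schema rather than shown in this diff:

```yaml
kafkaSettings:
  authentication:
    method: Sasl
    saslSettings:
      saslType: Plain            # assumed options include ScramSha256 / ScramSha512
      secretRef: <SECRET_NAME>   # Kubernetes secret containing the SASL token,
                                 # in the same namespace as the endpoint
```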