Commit f18c7d9

Update screenshots
1 parent 1b6e399 commit f18c7d9

21 files changed: +92 -129 lines changed

articles/iot-operations/connect-to-cloud/howto-configure-adlsv2-endpoint.md

Lines changed: 7 additions & 7 deletions
@@ -41,7 +41,7 @@ Then, go to the Azure Storage account > **Access control (IAM)** > **Add role as

## Create data flow endpoint for Azure Data Lake Storage Gen2

-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)

1. In the IoT Operations portal, select the **Data flow endpoints** tab.
1. Under **Create new data flow endpoint**, select **Azure Data Lake Storage (2nd generation)** > **New**.
@@ -137,7 +137,7 @@ Follow the steps in the [access token](#access-token) section to get a SAS token

Then, create the *DataflowEndpoint* resource and specify the access token authentication method. Here, replace `<SAS_SECRET_NAME>` with name of the secret containing the SAS token and other placeholder values.

-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)

See the [access token](#access-token) section for steps to create a secret in the operations experience web UI.

@@ -229,7 +229,7 @@ Before you configure the data flow endpoint, assign a role to the Azure IoT Oper

Then, configure the data flow endpoint with system-assigned managed identity settings.

-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)

In the operations experience data flow endpoint settings page, select the **Basic** tab then choose **Authentication method** > **System assigned managed identity**.

@@ -259,7 +259,7 @@ dataLakeStorageSettings:

If you need to override the system-assigned managed identity audience, you can specify the `audience` setting.

-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)

In most cases, you don't need to specify a service audience. Not specifying an audience creates a managed identity with the default audience scoped to your storage account.

@@ -300,7 +300,7 @@ Before you configure the data flow endpoint, assign a role to the user-assigned

Then, configure the data flow endpoint with user-assigned managed identity settings.

-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)

In the operations experience data flow endpoint settings page, select the **Basic** tab then choose **Authentication method** > **User assigned managed identity**.

@@ -353,7 +353,7 @@ Get a [SAS token](../../storage/common/storage-sas-overview.md) for an Azure Dat

To enhance security and follow the principle of least privilege, you can generate a SAS token for a specific container. To prevent authentication errors, ensure that the container specified in the SAS token matches the data flow destination setting in the configuration.

-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)

> [!IMPORTANT]
> To use the operations experience web UI to manage secrets, Azure IoT Operations must first be enabled with secure settings by configuring an Azure Key Vault and enabling workload identities. To learn more, see [Enable secure settings in Azure IoT Operations deployment](../deploy-iot-ops/howto-enable-secure-settings.md).
@@ -420,7 +420,7 @@ Use the `batching` settings to configure the maximum number of messages and the

For example, to configure the maximum number of messages to 1000 and the maximum latency to 100 seconds, use the following settings:

-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)

In the operations experience, select the **Advanced** tab for the data flow endpoint.
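
For context on the hunks above, the Bicep/Kubernetes tabs of this article configure the same settings in a *DataflowEndpoint* resource. A minimal sketch follows, combining the system-assigned managed identity method with the batching example (1000 messages, 100 seconds); the resource name, `apiVersion`, and exact property casing are assumptions, not values taken from this diff.

```yaml
# Hypothetical sketch of an Azure Data Lake Storage Gen2 data flow endpoint.
# The apiVersion, resource name, and field casing are assumptions.
apiVersion: connectivity.iotoperations.azure.com/v1
kind: DataflowEndpoint
metadata:
  name: adls-gen2-endpoint
  namespace: azure-iot-operations
spec:
  endpointType: DataLakeStorage
  dataLakeStorageSettings:
    host: https://<ACCOUNT>.blob.core.windows.net
    authentication:
      method: SystemAssignedManagedIdentity
      systemAssignedManagedIdentitySettings: {}   # set `audience` here only to override the default
    batching:
      latencySeconds: 100   # maximum latency of 100 seconds, matching the example above
      maxMessages: 1000     # maximum of 1000 messages per batch
```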

articles/iot-operations/connect-to-cloud/howto-configure-adx-endpoint.md

Lines changed: 6 additions & 6 deletions
@@ -6,7 +6,7 @@ ms.author: patricka
ms.service: azure-iot-operations
ms.subservice: azure-data-flows
ms.topic: how-to
-ms.date: 11/04/2024
+ms.date: 04/03/2025
ai-usage: ai-assisted

#CustomerIntent: As an operator, I want to understand how to configure data flow endpoints for Azure Data Explorer in Azure IoT Operations so that I can send data to Azure Data Explorer.
@@ -65,7 +65,7 @@ If using system-assigned managed identity, in Azure portal, go to your Azure IoT

<!-- TODO: use the data ingest URI for host? -->

-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)

1. In the operations experience, select the **Data flow endpoints** tab.
1. Under **Create new data flow endpoint**, select **Azure Data Explorer** > **New**.
@@ -172,7 +172,7 @@ Before you configure the data flow endpoint, assign a role to the Azure IoT Oper

Then, configure the data flow endpoint with system-assigned managed identity settings.

-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)

In the operations experience data flow endpoint settings page, select the **Basic** tab then choose **Authentication method** > **System assigned managed identity**.

@@ -201,7 +201,7 @@ dataExplorerSettings:
If you need to override the system-assigned managed identity audience, you can specify the `audience` setting.


-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)

In most cases, you don't need to specify other settings. This configuration creates a managed identity with the default audience `https://api.kusto.windows.net`.

@@ -242,7 +242,7 @@ Before you configure the data flow endpoint, assign a role to the user-assigned

Then, configure the data flow endpoint with user-assigned managed identity settings.

-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)

In the operations experience data flow endpoint settings page, select the **Basic** tab then choose **Authentication method** > **User assigned managed identity**.

@@ -294,7 +294,7 @@ Use the `batching` settings to configure the maximum number of messages and the

For example, to configure the maximum number of messages to 1000 and the maximum latency to 100 seconds, use the following settings:

-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)

In the operations experience, select the **Advanced** tab for the data flow endpoint.
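
The same pattern applies to the Azure Data Explorer endpoint: the `dataExplorerSettings` section referenced in the hunk headers carries the host, database, authentication, and batching settings. A minimal sketch, where the cluster URI, database name, and field casing are placeholders or assumptions:

```yaml
# Hypothetical sketch of an Azure Data Explorer data flow endpoint.
apiVersion: connectivity.iotoperations.azure.com/v1   # assumed API group/version
kind: DataflowEndpoint
metadata:
  name: adx-endpoint
  namespace: azure-iot-operations
spec:
  endpointType: DataExplorer        # assumed value
  dataExplorerSettings:
    host: https://<CLUSTER>.<REGION>.kusto.windows.net   # placeholder cluster URI
    database: <DATABASE_NAME>
    authentication:
      method: SystemAssignedManagedIdentity
      systemAssignedManagedIdentitySettings:
        audience: https://api.kusto.windows.net   # default audience noted in the hunk above
    batching:
      latencySeconds: 100
      maxMessages: 1000
```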

articles/iot-operations/connect-to-cloud/howto-configure-dataflow-endpoint.md

Lines changed: 3 additions & 3 deletions
@@ -6,7 +6,7 @@ ms.author: patricka
ms.service: azure-iot-operations
ms.subservice: azure-data-flows
ms.topic: how-to
-ms.date: 11/01/2024
+ms.date: 04/03/2025

#CustomerIntent: As an operator, I want to understand how to configure source and destination endpoints so that I can create a data flow.
---
@@ -59,7 +59,7 @@ To make it easier to reuse endpoints, the MQTT or Kafka topic filter isn't part

For example, you can use the default MQTT broker data flow endpoint. You can use it for both the source and destination with different topic filters:

-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)

:::image type="content" source="media/howto-configure-dataflow-endpoint/create-dataflow-mq-mq.png" alt-text="Screenshot using operations experience to create a data flow from MQTT to MQTT.":::

@@ -123,7 +123,7 @@ spec:

Similarly, you can create multiple data flows that use the same MQTT endpoint for other endpoints and topics. For example, you can use the same MQTT endpoint for a data flow that sends data to an Event Hubs endpoint.

-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)

:::image type="content" source="media/howto-configure-dataflow-endpoint/create-dataflow-mq-kafka.png" alt-text="Screenshot using operations experience to create a data flow from MQTT to Kafka.":::
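
To illustrate the reuse described above (one MQTT endpoint, different topic filters on source and destination), a data flow might reference the default endpoint twice. The sketch below assumes the `Dataflow` resource shape used in this article's Kubernetes tab; the topic names are hypothetical.

```yaml
# Hypothetical sketch: the default MQTT broker endpoint reused as source and destination.
apiVersion: connectivity.iotoperations.azure.com/v1   # assumed API group/version
kind: Dataflow
metadata:
  name: mqtt-to-mqtt
  namespace: azure-iot-operations
spec:
  profileRef: default
  operations:
    - operationType: Source
      sourceSettings:
        endpointRef: default              # same endpoint...
        dataSources:
          - thermostats/+/telemetry       # ...with a source topic filter
    - operationType: Destination
      destinationSettings:
        endpointRef: default              # ...reused as the destination
        dataDestination: factory/telemetry
```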

articles/iot-operations/connect-to-cloud/howto-configure-fabric-endpoint.md

Lines changed: 7 additions & 7 deletions
@@ -6,7 +6,7 @@ ms.author: patricka
ms.service: azure-iot-operations
ms.subservice: azure-data-flows
ms.topic: how-to
-ms.date: 11/11/2024
+ms.date: 04/03/2025
ai-usage: ai-assisted

#CustomerIntent: As an operator, I want to understand how to configure data flow endpoints for Microsoft Fabric OneLake in Azure IoT Operations so that I can send data to Microsoft Fabric OneLake.
@@ -43,7 +43,7 @@ Go to Microsoft Fabric workspace you created, select **Manage access** > **+ Add

## Create data flow endpoint for Microsoft Fabric OneLake

-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)

1. In the operations experience, select the **Data flow endpoints** tab.
1. Under **Create new data flow endpoint**, select **Microsoft Fabric OneLake** > **New**.
@@ -149,7 +149,7 @@ kubectl apply -f <FILE>.yaml

The `oneLakePathType` setting determines the type of path to use in the OneLake path. The default value is `Tables`, which is the recommended path type for the most common use cases. The `Tables` path type is a table in the OneLake lakehouse that is used to store the data. It can also be set as `Files`, which is a file in the OneLake lakehouse that is used to store the data. The `Files` path type is useful when you want to store the data in a file format that isn't supported by the `Tables` path type.

-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)

The OneLake path type is set in the **Basic** tab for the data flow endpoint.

@@ -186,7 +186,7 @@ Before you configure the data flow endpoint, assign a role to the Azure IoT Oper

Then, configure the data flow endpoint with system-assigned managed identity settings.

-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)

In the operations experience data flow endpoint settings page, select the **Basic** tab then choose **Authentication method** > **System assigned managed identity**.

@@ -215,7 +215,7 @@ fabricOneLakeSettings:

If you need to override the system-assigned managed identity audience, you can specify the `audience` setting.

-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)

In most cases, you don't need to specify a service audience. Not specifying an audience creates a managed identity with the default audience scoped to your storage account.

@@ -256,7 +256,7 @@ Before you configure the data flow endpoint, assign a role to the user-assigned

Then, configure the data flow endpoint with user-assigned managed identity settings.

-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)

In the operations experience data flow endpoint settings page, select the **Basic** tab then choose **Authentication method** > **User assigned managed identity**.

@@ -312,7 +312,7 @@ Use the `batching` settings to configure the maximum number of messages and the

For example, to configure the maximum number of messages to 1000 and the maximum latency to 100 seconds, use the following settings:

-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)

In the operations experience, select the **Advanced** tab for the data flow endpoint.
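
Pulling the OneLake-specific settings together, the `fabricOneLakeSettings` section referenced in the hunk headers holds the path type, workspace and lakehouse names, authentication, and batching. A minimal sketch with assumed field names:

```yaml
# Hypothetical sketch of a Microsoft Fabric OneLake data flow endpoint.
apiVersion: connectivity.iotoperations.azure.com/v1   # assumed API group/version
kind: DataflowEndpoint
metadata:
  name: onelake-endpoint
  namespace: azure-iot-operations
spec:
  endpointType: FabricOneLake       # assumed value
  fabricOneLakeSettings:
    host: https://onelake.dfs.fabric.microsoft.com
    oneLakePathType: Tables         # or Files, as described in the hunk above
    names:                          # workspace/lakehouse field names are assumptions
      workspaceName: <WORKSPACE_NAME>
      lakehouseName: <LAKEHOUSE_NAME>
    authentication:
      method: SystemAssignedManagedIdentity
      systemAssignedManagedIdentitySettings: {}
    batching:
      latencySeconds: 100
      maxMessages: 1000
```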

articles/iot-operations/connect-to-cloud/howto-configure-fabric-real-time-intelligence.md

Lines changed: 2 additions & 2 deletions
@@ -6,7 +6,7 @@ ms.author: patricka
ms.service: azure-iot-operations
ms.subservice: azure-data-flows
ms.topic: how-to
-ms.date: 10/30/2024
+ms.date: 04/03/2025
ai-usage: ai-assisted

#CustomerIntent: As an operator, I want to understand how to configure data flow endpoints for Microsoft Fabric Real-Time Intelligence in Azure IoT Operations so that I can send real-time data to Microsoft Fabric.
@@ -47,7 +47,7 @@ To configure a data flow endpoint for Microsoft Fabric Real-Time Intelligence, y

Azure Key Vault is the recommended way to sync the connection string to the Kubernetes cluster so that it can be referenced in the data flow. [Secure settings](../deploy-iot-ops/howto-enable-secure-settings.md) must be enabled to configure this endpoint using the operations experience web UI.

-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)

1. In the IoT Operations portal, select the **Data flow endpoints** tab.
1. Under **Create new data flow endpoint**, select **Microsoft Fabric Real-Time Intelligence** > **New**.
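
Where Key Vault-backed secure settings aren't available, the connection string could instead be stored as a plain Kubernetes secret in the same namespace and referenced from the endpoint's authentication settings. A sketch with hypothetical secret and key names:

```yaml
# Hypothetical sketch only; the secret name and key are not taken from this diff.
apiVersion: v1
kind: Secret
metadata:
  name: fabric-rti-connection-string
  namespace: azure-iot-operations     # same namespace as the data flow endpoint
type: Opaque
stringData:
  connectionString: <EVENTSTREAM_CONNECTION_STRING>   # placeholder; paste the eventstream connection string
```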

articles/iot-operations/connect-to-cloud/howto-configure-kafka-endpoint.md

Lines changed: 19 additions & 19 deletions
@@ -6,7 +6,7 @@ ms.author: patricka
ms.service: azure-iot-operations
ms.subservice: azure-data-flows
ms.topic: how-to
-ms.date: 11/07/2024
+ms.date: 04/03/2025
ai-usage: ai-assisted

#CustomerIntent: As an operator, I want to understand how to configure data flow endpoints for Kafka in Azure IoT Operations so that I can send data to and from Kafka endpoints.
@@ -51,7 +51,7 @@ Then, go to the Event Hubs namespace > **Access control (IAM)** > **Add role ass

Once the Azure Event Hubs namespace and event hub is configured, you can create a data flow endpoint for the Kafka-enabled Azure Event Hubs namespace.

-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)

1. In the [operations experience](https://iotoperations.azure.com/), select the **Data flow endpoints** tab.
1. Under **Create new data flow endpoint**, select **Azure Event Hubs** > **New**.
@@ -148,7 +148,7 @@ kubectl apply -f <FILE>.yaml

#### Use connection string for authentication to Event Hubs

-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)

> [!IMPORTANT]
> To use the operations experience web UI to manage secrets, Azure IoT Operations must first be enabled with secure settings by configuring an Azure Key Vault and enabling workload identities. To learn more, see [Enable secure settings in Azure IoT Operations deployment](../deploy-iot-ops/howto-enable-secure-settings.md).
@@ -227,7 +227,7 @@ Azure Event Hubs [doesn't support all the compression types that Kafka supports]

To configure a data flow endpoint for non-Event-Hub Kafka brokers, set the host, TLS, authentication, and other settings as needed.

-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)

1. In the [operations experience](https://iotoperations.azure.com/), select the **Data flow endpoints** tab.
1. Under **Create new data flow endpoint**, select **Custom Kafka Broker** > **New**.
@@ -322,7 +322,7 @@ Before you configure the data flow endpoint, assign a role to the Azure IoT Oper

Then, configure the data flow endpoint with system-assigned managed identity settings.

-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)

In the operations experience data flow endpoint settings page, select the **Basic** tab then choose **Authentication method** > **System assigned managed identity**.

@@ -351,7 +351,7 @@ kafkaSettings:

This configuration creates a managed identity with the default audience, which is the same as the Event Hubs namespace host value in the form of `https://<NAMESPACE>.servicebus.windows.net`. However, if you need to override the default audience, you can set the `audience` field to the desired value.

-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)

Not supported in the operations experience.

@@ -392,7 +392,7 @@ Before you configure the data flow endpoint, assign a role to the user-assigned

Then, configure the data flow endpoint with user-assigned managed identity settings.

-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)

In the operations experience data flow endpoint settings page, select the **Basic** tab then choose **Authentication method** > **User assigned managed identity**.

@@ -436,7 +436,7 @@ Here, the scope is the audience of the managed identity. The default value is th

To use SASL for authentication, specify the SASL authentication method and configure SASL type and a secret reference with the name of the secret that contains the SASL token.

-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)

In the operations experience data flow endpoint settings page, select the **Basic** tab then choose **Authentication method** > **SASL**.
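
In the Bicep/Kubernetes tabs, the SASL settings described above live under the `kafkaSettings` section of the endpoint resource. A fragment sketch; `saslType`, `secretRef`, and the method casing are assumptions based on the prose, not values from this diff:

```yaml
kafkaSettings:
  authentication:
    method: Sasl                      # assumed casing
    saslSettings:
      saslType: Plain                 # assumed options: Plain, ScramSha256, ScramSha512
      secretRef: <SASL_SECRET_NAME>   # Kubernetes secret containing the SASL token
```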

@@ -494,7 +494,7 @@ The secret must be in the same namespace as the Kafka data flow endpoint. The se

To use anonymous authentication, update the authentication section of the Kafka settings to use the Anonymous method.

-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)

In the operations experience data flow endpoint settings page, select the **Basic** tab then choose **Authentication method** > **None**.

@@ -524,7 +524,7 @@ kafkaSettings:

You can set advanced settings for the Kafka data flow endpoint such as TLS, trusted CA certificate, Kafka messaging settings, batching, and CloudEvents. You can set these settings in the data flow endpoint **Advanced** portal tab or within the data flow endpoint resource.

-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)

In the operations experience, select the **Advanced** tab for the data flow endpoint.

@@ -585,7 +585,7 @@ kafkaSettings:

To enable or disable TLS for the Kafka endpoint, update the `mode` setting in the TLS settings.

-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)

In the operations experience data flow endpoint settings page, select the **Advanced** tab then use the checkbox next to **TLS mode enabled**.

@@ -615,7 +615,7 @@ The TLS mode can be set to `Enabled` or `Disabled`. If the mode is set to `Enabl

Configure the trusted CA certificate for the Kafka endpoint to establish a secure connection to the Kafka broker. This setting is important if the Kafka broker uses a self-signed certificate or a certificate signed by a custom CA that isn't trusted by default.

-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)

In the operations experience data flow endpoint settings page, select the **Advanced** tab then use the **Trusted CA certificate config map** field to specify the ConfigMap containing the trusted CA certificate.

@@ -655,7 +655,7 @@ The consumer group ID is used to identify the consumer group that the data flow
> [!IMPORTANT]
> When the Kafka endpoint is used as [source](howto-create-dataflow.md#source), the consumer group ID is required. Otherwise, the data flow can't read messages from the Kafka topic, and you get an error "Kafka type source endpoints must have a consumerGroupId defined".

-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)

In the operations experience data flow endpoint settings page, select the **Advanced** tab then use the **Consumer group ID** field to specify the consumer group ID.
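
The TLS mode, trusted CA ConfigMap, and consumer group ID from the last three hunks map to fields like the following under `kafkaSettings`; `consumerGroupId` is named by the error message above, while the TLS field names are assumptions:

```yaml
kafkaSettings:
  consumerGroupId: <ID>        # required when the endpoint is used as a data flow source
  tls:
    mode: Enabled              # Enabled or Disabled
    trustedCaCertificateConfigMapRef: <CA_CONFIGMAP>   # ConfigMap holding the trusted CA certificate (field name assumed)
```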

@@ -692,7 +692,7 @@ The compression field enables compression for the messages sent to Kafka topics.

To configure compression:

-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)

In the operations experience data flow endpoint settings page, select the **Advanced** tab then use the **Compression** field to specify the compression type.

@@ -730,7 +730,7 @@ For example, if you set latencyMs to 1000, maxMessages to 100, and maxBytes to 1

To configure batching:

-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)

In the operations experience data flow endpoint settings page, select the **Advanced** tab then use the **Batching enabled** field to enable batching. Use the **Batching latency**, **Maximum bytes**, and **Message count** fields to specify the batching settings.
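
The batching fields named in the hunk header above (`latencyMs`, `maxMessages`, `maxBytes`) would appear under `kafkaSettings` roughly as follows; the enable field name is an assumption:

```yaml
kafkaSettings:
  batching:
    mode: Enabled       # "Batching enabled" in the operations experience (field name assumed)
    latencyMs: 1000     # maximum batching latency in milliseconds
    maxMessages: 100    # maximum messages per batch
    maxBytes: 1         # maximum batch size in bytes (values mirror the example in the hunk header)
```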

@@ -781,7 +781,7 @@ For example, if you set the partition handling strategy to `Property` and the pa

To configure the partition handling strategy:

-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)

In the operations experience data flow endpoint settings page, select the **Advanced** tab then use the **Partition handling strategy** field to specify the partition handling strategy. Use the **Partition key property** field to specify the property used for partitioning if the strategy is set to `Property`.

@@ -821,7 +821,7 @@ For example, if you set the Kafka acknowledgment to `All`, the data flow waits f

To configure the Kafka acknowledgments:

-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)

In the operations experience data flow endpoint settings page, select the **Advanced** tab then use the **Kafka acknowledgment** field to specify the Kafka acknowledgment level.

@@ -848,7 +848,7 @@ This setting only takes effect if the endpoint is used as a destination where th

By default, the copy MQTT properties setting is enabled. These user properties include values such as `subject` that stores the name of the asset sending the message.

-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)

In the operations experience data flow endpoint settings page, select the **Advanced** tab then use checkbox next to **Copy MQTT properties** field to enable or disable copying MQTT properties.

@@ -994,7 +994,7 @@ Not all event data properties including propertyEventData.correlationId are forw

The `CloudEventAttributes` options are `Propagate` or`CreateOrRemap`.

-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)

In the operations experience data flow endpoint settings page, select the **Advanced** tab then use the **Cloud event attributes** field to specify the CloudEvents setting.
