Commit f3d7ad8

Merge pull request #297547 from PatAltimore/patricka-release-2504-aio

Update data flow screenshots

2 parents 37d9f28 + 5c66860, commit f3d7ad8

41 files changed: +116 -143 lines

articles/iot-operations/connect-to-cloud/howto-configure-adlsv2-endpoint.md

Lines changed: 18 additions & 17 deletions

@@ -6,7 +6,7 @@ ms.author: patricka
 ms.service: azure-iot-operations
 ms.subservice: azure-data-flows
 ms.topic: how-to
-ms.date: 11/07/2024
+ms.date: 04/01/2025
 ai-usage: ai-assisted
 
 #CustomerIntent: As an operator, I want to understand how to configure data flow endpoints for Azure Data Lake Storage Gen2 in Azure IoT Operations so that I can send data to Azure Data Lake Storage Gen2.

@@ -16,32 +16,32 @@ ai-usage: ai-assisted
 
 [!INCLUDE [kubernetes-management-preview-note](../includes/kubernetes-management-preview-note.md)]
 
-To send data to Azure Data Lake Storage Gen2 in Azure IoT Operations, you can configure a data flow endpoint. This configuration allows you to specify the destination endpoint, authentication method, table, and other settings.
+Send data to Azure Data Lake Storage Gen2 in Azure IoT Operations by configuring a data flow endpoint. This configuration allows you to specify the destination endpoint, authentication method, table, and other settings.
 
 ## Prerequisites
 
-- An instance of [Azure IoT Operations](../deploy-iot-ops/howto-deploy-iot-operations.md)
-- An [Azure Data Lake Storage Gen2 account](../../storage/blobs/create-data-lake-storage-account.md)
-- A pre-created storage container in the storage account
+- An instance of [Azure IoT Operations](../deploy-iot-ops/howto-deploy-iot-operations.md).
+- An [Azure Data Lake Storage Gen2 account](../../storage/blobs/create-data-lake-storage-account.md).
+- A storage container that is already created in the storage account.
 
 ## Assign permission to managed identity
 
-To configure a data flow endpoint for Azure Data Lake Storage Gen2, we recommend using either a user-assigned or system-assigned managed identity. This approach is secure and eliminates the need for managing credentials manually.
+To configure a data flow endpoint for Azure Data Lake Storage Gen2, use either a user-assigned or system-assigned managed identity. This approach is secure and removes the need to manage credentials manually.
 
 After the Azure Data Lake Storage Gen2 is created, you need to assign a role to the Azure IoT Operations managed identity that grants permission to write to the storage account.
 
-If using system-assigned managed identity, in Azure portal, go to your Azure IoT Operations instance and select **Overview**. Copy the name of the extension listed after **Azure IoT Operations Arc extension**. For example, *azure-iot-operations-xxxx7*. Your system-assigned managed identity can be found using the same name of the Azure IoT Operations Arc extension.
+If you're using a system-assigned managed identity, in the Azure portal, go to your Azure IoT Operations instance and select **Overview**. Copy the name of the extension listed after **Azure IoT Operations Arc extension**. For example, *azure-iot-operations-xxxx7*. Your system-assigned managed identity can be found using the same name of the Azure IoT Operations Arc extension.
 
 Then, go to the Azure Storage account > **Access control (IAM)** > **Add role assignment**.
 
-1. On the **Role** tab select an appropriate role like `Storage Blob Data Contributor`. This gives the managed identity the necessary permissions to write to the Azure Storage blob containers. To learn more, see [Authorize access to blobs using Microsoft Entra ID](../../storage/blobs/authorize-access-azure-active-directory.md).
+1. On the **Role** tab, select an appropriate role, such as `Storage Blob Data Contributor`. This gives the managed identity the necessary permissions to write to the Azure Storage blob containers. To learn more, see [Authorize access to blobs using Microsoft Entra ID](../../storage/blobs/authorize-access-azure-active-directory.md).
 1. On the **Members** tab:
-    1. If using system-assigned managed identity, for **Assign access to**, select **User, group, or service principal** option, then select **+ Select members** and search for the name of the Azure IoT Operations Arc extension.
-    1. If using user-assigned managed identity, for **Assign access to**, select **Managed identity** option, then select **+ Select members** and search for your [user-assigned managed identity set up for cloud connections](../deploy-iot-ops/howto-enable-secure-settings.md#set-up-a-user-assigned-managed-identity-for-cloud-connections).
+    1. If you're using a system-assigned managed identity, for **Assign access to**, select **User, group, or service principal**, then select **+ Select members** and search for the name of the Azure IoT Operations Arc extension.
+    1. If you're using a user-assigned managed identity, for **Assign access to**, select **Managed identity**, then select **+ Select members** and search for your [user-assigned managed identity set up for cloud connections](../deploy-iot-ops/howto-enable-secure-settings.md#set-up-a-user-assigned-managed-identity-for-cloud-connections).
 
 ## Create data flow endpoint for Azure Data Lake Storage Gen2
 
-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)
 
 1. In the IoT Operations portal, select the **Data flow endpoints** tab.
 1. Under **Create new data flow endpoint**, select **Azure Data Lake Storage (2nd generation)** > **New**.

@@ -57,6 +57,7 @@ Then, go to the Azure Storage account > **Access control (IAM)** > **Add role as
 | Authentication method | The method used for authentication. We recommend that you choose [*System assigned managed identity*](#system-assigned-managed-identity) or [*User assigned managed identity*](#user-assigned-managed-identity). |
 | Client ID | The client ID of the user-assigned managed identity. Required if using *User assigned managed identity*. |
 | Tenant ID | The tenant ID of the user-assigned managed identity. Required if using *User assigned managed identity*. |
+| Synced secret name | The reference name for the secret in the data flow endpoint settings and Kubernetes cluster. Required if using *Access token*. |
 | Access token secret name | The name of the Kubernetes secret containing the SAS token. Required if using *Access token*. |
 
 1. Select **Apply** to provision the endpoint.

@@ -136,7 +137,7 @@ Follow the steps in the [access token](#access-token) section to get a SAS token
 
 Then, create the *DataflowEndpoint* resource and specify the access token authentication method. Here, replace `<SAS_SECRET_NAME>` with name of the secret containing the SAS token and other placeholder values.
 
-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)
 
 See the [access token](#access-token) section for steps to create a secret in the operations experience web UI.

@@ -228,7 +229,7 @@ Before you configure the data flow endpoint, assign a role to the Azure IoT Oper
 
 Then, configure the data flow endpoint with system-assigned managed identity settings.
 
-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)
 
 In the operations experience data flow endpoint settings page, select the **Basic** tab then choose **Authentication method** > **System assigned managed identity**.

@@ -258,7 +259,7 @@ dataLakeStorageSettings:
 
 If you need to override the system-assigned managed identity audience, you can specify the `audience` setting.
 
-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)
 
 In most cases, you don't need to specify a service audience. Not specifying an audience creates a managed identity with the default audience scoped to your storage account.

@@ -299,7 +300,7 @@ Before you configure the data flow endpoint, assign a role to the user-assigned
 
 Then, configure the data flow endpoint with user-assigned managed identity settings.
 
-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)
 
 In the operations experience data flow endpoint settings page, select the **Basic** tab then choose **Authentication method** > **User assigned managed identity**.

@@ -352,7 +353,7 @@ Get a [SAS token](../../storage/common/storage-sas-overview.md) for an Azure Dat
 
 To enhance security and follow the principle of least privilege, you can generate a SAS token for a specific container. To prevent authentication errors, ensure that the container specified in the SAS token matches the data flow destination setting in the configuration.
 
-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)
 
 > [!IMPORTANT]
 > To use the operations experience web UI to manage secrets, Azure IoT Operations must first be enabled with secure settings by configuring an Azure Key Vault and enabling workload identities. To learn more, see [Enable secure settings in Azure IoT Operations deployment](../deploy-iot-ops/howto-enable-secure-settings.md).

@@ -419,7 +420,7 @@ Use the `batching` settings to configure the maximum number of messages and the
 
 For example, to configure the maximum number of messages to 1000 and the maximum latency to 100 seconds, use the following settings:
 
-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)
 
 In the operations experience, select the **Advanced** tab for the data flow endpoint.
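The batching example the diff describes (maximum of 1000 messages, maximum latency of 100 seconds) corresponds to a *DataflowEndpoint* fragment along these lines. This is a sketch only: the field names are assumed from the Azure IoT Operations DataflowEndpoint schema that the article's other snippets (such as `dataLakeStorageSettings:`) use, and the values are illustrative.

```yaml
# Sketch: batching settings for an ADLSv2 data flow endpoint
# (field names assumed from the DataflowEndpoint schema; values illustrative).
dataLakeStorageSettings:
  batching:
    latencySeconds: 100   # flush a batch after at most 100 seconds
    maxMessages: 1000     # flush a batch after at most 1000 messages
```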
articles/iot-operations/connect-to-cloud/howto-configure-adx-endpoint.md

Lines changed: 6 additions & 6 deletions

@@ -6,7 +6,7 @@ ms.author: patricka
 ms.service: azure-iot-operations
 ms.subservice: azure-data-flows
 ms.topic: how-to
-ms.date: 11/04/2024
+ms.date: 04/03/2025
 ai-usage: ai-assisted
 
 #CustomerIntent: As an operator, I want to understand how to configure data flow endpoints for Azure Data Explorer in Azure IoT Operations so that I can send data to Azure Data Explorer.

@@ -65,7 +65,7 @@ If using system-assigned managed identity, in Azure portal, go to your Azure IoT
 
 <!-- TODO: use the data ingest URI for host? -->
 
-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)
 
 1. In the operations experience, select the **Data flow endpoints** tab.
 1. Under **Create new data flow endpoint**, select **Azure Data Explorer** > **New**.

@@ -172,7 +172,7 @@ Before you configure the data flow endpoint, assign a role to the Azure IoT Oper
 
 Then, configure the data flow endpoint with system-assigned managed identity settings.
 
-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)
 
 In the operations experience data flow endpoint settings page, select the **Basic** tab then choose **Authentication method** > **System assigned managed identity**.

@@ -201,7 +201,7 @@ dataExplorerSettings:
 
 If you need to override the system-assigned managed identity audience, you can specify the `audience` setting.
 
-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)
 
 In most cases, you don't need to specify other settings. This configuration creates a managed identity with the default audience `https://api.kusto.windows.net`.

@@ -242,7 +242,7 @@ Before you configure the data flow endpoint, assign a role to the user-assigned
 
 Then, configure the data flow endpoint with user-assigned managed identity settings.
 
-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)
 
 In the operations experience data flow endpoint settings page, select the **Basic** tab then choose **Authentication method** > **User assigned managed identity**.

@@ -294,7 +294,7 @@ Use the `batching` settings to configure the maximum number of messages and the
 
 For example, to configure the maximum number of messages to 1000 and the maximum latency to 100 seconds, use the following settings:
 
-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)
 
 In the operations experience, select the **Advanced** tab for the data flow endpoint.
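The audience override mentioned in the diff can be sketched as a `dataExplorerSettings` fragment. The field names are assumed from the DataflowEndpoint schema the article's snippets reference; the audience value shown is the default the article itself names, so an explicit override would normally use a different URI.

```yaml
# Sketch: system-assigned managed identity with an explicit audience for an
# Azure Data Explorer endpoint (field names assumed; the default audience
# when none is specified is https://api.kusto.windows.net).
dataExplorerSettings:
  authentication:
    method: SystemAssignedManagedIdentity
    systemAssignedManagedIdentitySettings:
      audience: https://api.kusto.windows.net
```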
articles/iot-operations/connect-to-cloud/howto-configure-dataflow-endpoint.md

Lines changed: 3 additions & 3 deletions

@@ -6,7 +6,7 @@ ms.author: patricka
 ms.service: azure-iot-operations
 ms.subservice: azure-data-flows
 ms.topic: how-to
-ms.date: 11/01/2024
+ms.date: 04/03/2025
 
 #CustomerIntent: As an operator, I want to understand how to configure source and destination endpoints so that I can create a data flow.
 ---

@@ -59,7 +59,7 @@ To make it easier to reuse endpoints, the MQTT or Kafka topic filter isn't part
 
 For example, you can use the default MQTT broker data flow endpoint. You can use it for both the source and destination with different topic filters:
 
-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)
 
 :::image type="content" source="media/howto-configure-dataflow-endpoint/create-dataflow-mq-mq.png" alt-text="Screenshot using operations experience to create a data flow from MQTT to MQTT.":::

@@ -123,7 +123,7 @@ spec:
 
 Similarly, you can create multiple data flows that use the same MQTT endpoint for other endpoints and topics. For example, you can use the same MQTT endpoint for a data flow that sends data to an Event Hubs endpoint.
 
-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)
 
 :::image type="content" source="media/howto-configure-dataflow-endpoint/create-dataflow-mq-kafka.png" alt-text="Screenshot using operations experience to create a data flow from MQTT to Kafka.":::
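The endpoint-reuse pattern this article describes (one MQTT endpoint, different topic filters on the data flow) can be sketched as a Dataflow spec fragment. This is a sketch under assumptions: the endpoint name `default`, the topic names, and the exact field names are illustrative, following the Dataflow resource schema the article's own `spec:` snippets use.

```yaml
# Sketch: one data flow reusing the default MQTT broker endpoint as both
# source and destination, with different topics (names illustrative).
spec:
  operations:
    - operationType: Source
      sourceSettings:
        endpointRef: default              # same endpoint referenced twice
        dataSources:
          - thermostats/+/telemetry       # source topic filter
    - operationType: Destination
      destinationSettings:
        endpointRef: default
        dataDestination: factory/telemetry  # destination topic
```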
articles/iot-operations/connect-to-cloud/howto-configure-fabric-endpoint.md

Lines changed: 7 additions & 7 deletions

@@ -6,7 +6,7 @@ ms.author: patricka
 ms.service: azure-iot-operations
 ms.subservice: azure-data-flows
 ms.topic: how-to
-ms.date: 11/11/2024
+ms.date: 04/03/2025
 ai-usage: ai-assisted
 
 #CustomerIntent: As an operator, I want to understand how to configure data flow endpoints for Microsoft Fabric OneLake in Azure IoT Operations so that I can send data to Microsoft Fabric OneLake.

@@ -43,7 +43,7 @@ Go to Microsoft Fabric workspace you created, select **Manage access** > **+ Add
 
 ## Create data flow endpoint for Microsoft Fabric OneLake
 
-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)
 
 1. In the operations experience, select the **Data flow endpoints** tab.
 1. Under **Create new data flow endpoint**, select **Microsoft Fabric OneLake** > **New**.

@@ -149,7 +149,7 @@ kubectl apply -f <FILE>.yaml
 
 The `oneLakePathType` setting determines the type of path to use in the OneLake path. The default value is `Tables`, which is the recommended path type for the most common use cases. The `Tables` path type is a table in the OneLake lakehouse that is used to store the data. It can also be set as `Files`, which is a file in the OneLake lakehouse that is used to store the data. The `Files` path type is useful when you want to store the data in a file format that isn't supported by the `Tables` path type.
 
-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)
 
 The OneLake path type is set in the **Basic** tab for the data flow endpoint.

@@ -186,7 +186,7 @@ Before you configure the data flow endpoint, assign a role to the Azure IoT Oper
 
 Then, configure the data flow endpoint with system-assigned managed identity settings.
 
-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)
 
 In the operations experience data flow endpoint settings page, select the **Basic** tab then choose **Authentication method** > **System assigned managed identity**.

@@ -215,7 +215,7 @@ fabricOneLakeSettings:
 
 If you need to override the system-assigned managed identity audience, you can specify the `audience` setting.
 
-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)
 
 In most cases, you don't need to specify a service audience. Not specifying an audience creates a managed identity with the default audience scoped to your storage account.

@@ -256,7 +256,7 @@ Before you configure the data flow endpoint, assign a role to the user-assigned
 
 Then, configure the data flow endpoint with user-assigned managed identity settings.
 
-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)
 
 In the operations experience data flow endpoint settings page, select the **Basic** tab then choose **Authentication method** > **User assigned managed identity**.

@@ -312,7 +312,7 @@ Use the `batching` settings to configure the maximum number of messages and the
 
 For example, to configure the maximum number of messages to 1000 and the maximum latency to 100 seconds, use the following settings:
 
-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)
 
 In the operations experience, select the **Advanced** tab for the data flow endpoint.
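The `oneLakePathType` choice described in the diff (default `Tables`, with `Files` as the alternative) can be sketched as a `fabricOneLakeSettings` fragment. The nesting is assumed from the DataflowEndpoint schema the article references (`fabricOneLakeSettings:` appears in one of its hunk headers); treat it as illustrative, not as the exact resource definition.

```yaml
# Sketch: OneLake endpoint settings selecting the Tables path type.
# Set oneLakePathType to Files to store data as files in the lakehouse
# instead of as a table.
fabricOneLakeSettings:
  oneLakePathType: Tables
```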
articles/iot-operations/connect-to-cloud/howto-configure-fabric-real-time-intelligence.md

Lines changed: 2 additions & 2 deletions

@@ -6,7 +6,7 @@ ms.author: patricka
 ms.service: azure-iot-operations
 ms.subservice: azure-data-flows
 ms.topic: how-to
-ms.date: 03/20/2025
+ms.date: 04/03/2025
 ai-usage: ai-assisted
 
 #CustomerIntent: As an operator, I want to understand how to configure data flow endpoints for Microsoft Fabric Real-Time Intelligence in Azure IoT Operations so that I can send real-time data to Microsoft Fabric.

@@ -50,7 +50,7 @@ To configure a data flow endpoint for Microsoft Fabric Real-Time Intelligence, y
 
 Azure Key Vault is the recommended way to sync the connection string to the Kubernetes cluster so that it can be referenced in the data flow. [Secure settings](../deploy-iot-ops/howto-enable-secure-settings.md) must be enabled to configure this endpoint using the operations experience web UI.
 
-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)
 
 1. In the IoT Operations experience portal, select the **Data flow endpoints** tab.
 1. Under **Create new data flow endpoint**, select **Microsoft Fabric Real-Time Intelligence** > **New**.
