
Commit b3a7715

Merge pull request #298034 from MicrosoftDocs/release-2504-aio

Preview AIO 2504 release - 04/29/2025 - 10 AM PST

2 parents df3eeb3 + 8a8ee5e

File tree

89 files changed: +6273 −25654 lines


articles/iot-operations/.openpublishing.redirection.iot-operations.json

Lines changed: 35 additions & 35 deletions
@@ -135,41 +135,6 @@
       "redirect_url": "https://github.com/Azure-Samples/iot-edge-opc-plc/blob/main/README.md",
       "redirect_document_id": false
     },
-    {
-      "source_path_from_root": "/articles/iot-operations/manage-devices-assets/howto-autodetect-opcua-assets-using-akri.md",
-      "redirect_url": "/azure/iot-operations/discover-manage-assets/overview-manage-assets",
-      "redirect_document_id": false
-    },
-    {
-      "source_path_from_root": "/articles/iot-operations/manage-devices-assets/concept-akri-architecture.md",
-      "redirect_url": "/azure/iot-operations/discover-manage-assets/overview-manage-assets",
-      "redirect_document_id": false
-    },
-    {
-      "source_path_from_root": "/articles/iot-operations/manage-devices-assets/overview-akri.md",
-      "redirect_url": "/azure/iot-operations/discover-manage-assets/overview-manage-assets",
-      "redirect_document_id": false
-    },
-    {
-      "source_path_from_root": "/articles/iot-operations/discover-manage-assets/concept-akri-architecture.md",
-      "redirect_url": "/azure/iot-operations/discover-manage-assets/overview-manage-assets",
-      "redirect_document_id": false
-    },
-    {
-      "source_path_from_root": "/articles/iot-operations/discover-manage-assets/howto-autodetect-opcua-assets-using-akri.md",
-      "redirect_url": "/azure/iot-operations/discover-manage-assets/overview-manage-assets",
-      "redirect_document_id": false
-    },
-    {
-      "source_path_from_root": "/articles/iot-operations/discover-manage-assets/overview-akri.md",
-      "redirect_url": "/azure/iot-operations/discover-manage-assets/overview-manage-assets",
-      "redirect_document_id": false
-    },
-    {
-      "source_path_from_root": "/articles/iot-operations/reference/observability-metrics-akri.md",
-      "redirect_url": "/azure/iot-operations/reference/observability-metrics-opcua-broker",
-      "redirect_document_id": false
-    },
     {
       "source_path_from_root": "/articles/iot-operations/configure-observability-monitoring/howto-add-cluster.md",
       "redirect_url": "/azure/iot-operations/configure-observability-monitoring/howto-configure-observability",
@@ -544,6 +509,41 @@
       "source_path_from_root": "/articles/iot-operations/view-analyze-telemetry/tutorial-real-time-dashboard-fabric.md",
       "redirect_url": "/azure/iot-operations/end-to-end-tutorials/tutorial-add-assets",
       "redirect_document_id": false
+    },
+    {
+      "source_path_from_root": "/articles/iot-operations/discover-manage-assets/howto-secure-assets.md",
+      "redirect_url": "/azure/iot-operations/reference/custom-rbac",
+      "redirect_document_id": true
+    },
+    {
+      "source_path_from_root": "/articles/iot-operations/manage-devices-assets/howto-autodetect-opcua-assets-using-akri.md",
+      "redirect_url": "/azure/iot-operations/discover-manage-assets/howto-autodetect-opc-ua-assets-use-akri",
+      "redirect_document_id": false
+    },
+    {
+      "source_path_from_root": "/articles/iot-operations/discover-manage-assets/howto-autodetect-opcua-assets-using-akri.md",
+      "redirect_url": "/azure/iot-operations/discover-manage-assets/howto-autodetect-opc-ua-assets-use-akri",
+      "redirect_document_id": false
+    },
+    {
+      "source_path_from_root": "/articles/iot-operations/discover-manage-assets/concept-akri-architecture.md",
+      "redirect_url": "/azure/iot-operations/discover-manage-assets/overview-akri",
+      "redirect_document_id": false
+    },
+    {
+      "source_path_from_root": "/articles/iot-operations/manage-devices-assets/concept-akri-architecture.md",
+      "redirect_url": "/azure/iot-operations/discover-manage-assets/overview-akri",
+      "redirect_document_id": false
+    },
+    {
+      "source_path_from_root": "/articles/iot-operations/manage-devices-assets/overview-akri.md",
+      "redirect_url": "/azure/iot-operations/discover-manage-assets/overview-akri",
+      "redirect_document_id": false
+    },
+    {
+      "source_path_from_root": "/articles/iot-operations/reference/observability-metrics-akri.md",
+      "redirect_url": "/azure/iot-operations/reference/observability-metrics-opcua-broker",
+      "redirect_document_id": false
     }
   ]
 }

articles/iot-operations/connect-to-cloud/howto-configure-adlsv2-endpoint.md

Lines changed: 18 additions & 17 deletions
@@ -6,7 +6,7 @@ ms.author: patricka
 ms.service: azure-iot-operations
 ms.subservice: azure-data-flows
 ms.topic: how-to
-ms.date: 11/07/2024
+ms.date: 04/01/2025
 ai-usage: ai-assisted
 
 #CustomerIntent: As an operator, I want to understand how to configure data flow endpoints for Azure Data Lake Storage Gen2 in Azure IoT Operations so that I can send data to Azure Data Lake Storage Gen2.
@@ -16,32 +16,32 @@ ai-usage: ai-assisted
 
 [!INCLUDE [kubernetes-management-preview-note](../includes/kubernetes-management-preview-note.md)]
 
-To send data to Azure Data Lake Storage Gen2 in Azure IoT Operations, you can configure a data flow endpoint. This configuration allows you to specify the destination endpoint, authentication method, table, and other settings.
+Send data to Azure Data Lake Storage Gen2 in Azure IoT Operations by configuring a data flow endpoint. This configuration allows you to specify the destination endpoint, authentication method, table, and other settings.
 
 ## Prerequisites
 
-- An instance of [Azure IoT Operations](../deploy-iot-ops/howto-deploy-iot-operations.md)
-- An [Azure Data Lake Storage Gen2 account](../../storage/blobs/create-data-lake-storage-account.md)
-- A pre-created storage container in the storage account
+- An instance of [Azure IoT Operations](../deploy-iot-ops/howto-deploy-iot-operations.md).
+- An [Azure Data Lake Storage Gen2 account](../../storage/blobs/create-data-lake-storage-account.md).
+- A storage container that is already created in the storage account.
 
 ## Assign permission to managed identity
 
-To configure a data flow endpoint for Azure Data Lake Storage Gen2, we recommend using either a user-assigned or system-assigned managed identity. This approach is secure and eliminates the need for managing credentials manually.
+To configure a data flow endpoint for Azure Data Lake Storage Gen2, use either a user-assigned or system-assigned managed identity. This approach is secure and removes the need to manage credentials manually.
 
 After the Azure Data Lake Storage Gen2 is created, you need to assign a role to the Azure IoT Operations managed identity that grants permission to write to the storage account.
 
-If using system-assigned managed identity, in Azure portal, go to your Azure IoT Operations instance and select **Overview**. Copy the name of the extension listed after **Azure IoT Operations Arc extension**. For example, *azure-iot-operations-xxxx7*. Your system-assigned managed identity can be found using the same name of the Azure IoT Operations Arc extension.
+If you're using a system-assigned managed identity, in the Azure portal, go to your Azure IoT Operations instance and select **Overview**. Copy the name of the extension listed after **Azure IoT Operations Arc extension**. For example, *azure-iot-operations-xxxx7*. Your system-assigned managed identity can be found using the same name of the Azure IoT Operations Arc extension.
 
 Then, go to the Azure Storage account > **Access control (IAM)** > **Add role assignment**.
 
-1. On the **Role** tab select an appropriate role like `Storage Blob Data Contributor`. This gives the managed identity the necessary permissions to write to the Azure Storage blob containers. To learn more, see [Authorize access to blobs using Microsoft Entra ID](../../storage/blobs/authorize-access-azure-active-directory.md).
+1. On the **Role** tab, select an appropriate role, such as `Storage Blob Data Contributor`. This gives the managed identity the necessary permissions to write to the Azure Storage blob containers. To learn more, see [Authorize access to blobs using Microsoft Entra ID](../../storage/blobs/authorize-access-azure-active-directory.md).
 1. On the **Members** tab:
-   1. If using system-assigned managed identity, for **Assign access to**, select **User, group, or service principal** option, then select **+ Select members** and search for the name of the Azure IoT Operations Arc extension.
-   1. If using user-assigned managed identity, for **Assign access to**, select **Managed identity** option, then select **+ Select members** and search for your [user-assigned managed identity set up for cloud connections](../deploy-iot-ops/howto-enable-secure-settings.md#set-up-a-user-assigned-managed-identity-for-cloud-connections).
+   1. If you're using a system-assigned managed identity, for **Assign access to**, select **User, group, or service principal**, then select **+ Select members** and search for the name of the Azure IoT Operations Arc extension.
+   1. If you're using a user-assigned managed identity, for **Assign access to**, select **Managed identity**, then select **+ Select members** and search for your [user-assigned managed identity set up for cloud connections](../deploy-iot-ops/howto-enable-secure-settings.md#set-up-a-user-assigned-managed-identity-for-cloud-connections).
 
 ## Create data flow endpoint for Azure Data Lake Storage Gen2
 
-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)
 
 1. In the IoT Operations portal, select the **Data flow endpoints** tab.
 1. Under **Create new data flow endpoint**, select **Azure Data Lake Storage (2nd generation)** > **New**.
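The role assignment described above can also be scripted instead of done in the portal. A hedged Azure CLI sketch, assuming you already know the managed identity's principal ID and the storage account's resource ID (every angle-bracketed value is a placeholder you supply):

```azurecli
# Grant the Azure IoT Operations managed identity write access to the storage account.
# <principal-id> is the identity's object (principal) ID; the scope is the storage
# account's full resource ID.
az role assignment create \
  --assignee "<principal-id>" \
  --role "Storage Blob Data Contributor" \
  --scope "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account>"
```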
@@ -57,6 +57,7 @@ Then, go to the Azure Storage account > **Access control (IAM)** > **Add role assignment**.
    | Authentication method | The method used for authentication. We recommend that you choose [*System assigned managed identity*](#system-assigned-managed-identity) or [*User assigned managed identity*](#user-assigned-managed-identity). |
    | Client ID | The client ID of the user-assigned managed identity. Required if using *User assigned managed identity*. |
    | Tenant ID | The tenant ID of the user-assigned managed identity. Required if using *User assigned managed identity*. |
+   | Synced secret name | The reference name for the secret in the data flow endpoint settings and Kubernetes cluster. Required if using *Access token*. |
    | Access token secret name | The name of the Kubernetes secret containing the SAS token. Required if using *Access token*. |
 
 1. Select **Apply** to provision the endpoint.
@@ -136,7 +137,7 @@ Follow the steps in the [access token](#access-token) section to get a SAS token
 
 Then, create the *DataflowEndpoint* resource and specify the access token authentication method. Here, replace `<SAS_SECRET_NAME>` with the name of the secret containing the SAS token and other placeholder values.
 
-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)
 
 See the [access token](#access-token) section for steps to create a secret in the operations experience web UI.
 
@@ -228,7 +229,7 @@ Before you configure the data flow endpoint, assign a role to the Azure IoT Oper
 
 Then, configure the data flow endpoint with system-assigned managed identity settings.
 
-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)
 
 In the operations experience data flow endpoint settings page, select the **Basic** tab then choose **Authentication method** > **System assigned managed identity**.
 
@@ -258,7 +259,7 @@ dataLakeStorageSettings:
 
 If you need to override the system-assigned managed identity audience, you can specify the `audience` setting.
 
-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)
 
 In most cases, you don't need to specify a service audience. Not specifying an audience creates a managed identity with the default audience scoped to your storage account.
 
@@ -299,7 +300,7 @@ Before you configure the data flow endpoint, assign a role to the user-assigned
 
 Then, configure the data flow endpoint with user-assigned managed identity settings.
 
-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)
 
 In the operations experience data flow endpoint settings page, select the **Basic** tab then choose **Authentication method** > **User assigned managed identity**.
 
@@ -352,7 +353,7 @@ Get a [SAS token](../../storage/common/storage-sas-overview.md) for an Azure Dat
 
 To enhance security and follow the principle of least privilege, you can generate a SAS token for a specific container. To prevent authentication errors, ensure that the container specified in the SAS token matches the data flow destination setting in the configuration.
 
-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)
 
 > [!IMPORTANT]
 > To use the operations experience web UI to manage secrets, Azure IoT Operations must first be enabled with secure settings by configuring an Azure Key Vault and enabling workload identities. To learn more, see [Enable secure settings in Azure IoT Operations deployment](../deploy-iot-ops/howto-enable-secure-settings.md).
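As a sketch of the container-scoped SAS generation described above, using the Azure CLI (account name, container name, and expiry are placeholders; `acw` grants add, create, and write):

```azurecli
# Generate a SAS token scoped to a single container, matching the data flow destination.
az storage container generate-sas \
  --account-name "<storage-account>" \
  --name "<container>" \
  --permissions acw \
  --expiry "<yyyy-mm-ddThh:mmZ>" \
  --auth-mode login --as-user \
  --output tsv
```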
@@ -419,7 +420,7 @@ Use the `batching` settings to configure the maximum number of messages and the
 
 For example, to configure the maximum number of messages to 1000 and the maximum latency to 100 seconds, use the following settings:
 
-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)
 
 In the operations experience, select the **Advanced** tab for the data flow endpoint.
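For the Kubernetes deployment path, the batching example those lines describe would look roughly like the following fragment of the *DataflowEndpoint* spec (a sketch; the key names follow the `dataLakeStorageSettings` block that appears elsewhere in this diff):

```yaml
dataLakeStorageSettings:
  # Flush a batch when either limit is hit: 100 seconds elapsed or 1000 messages queued.
  batching:
    latencySeconds: 100
    maxMessages: 1000
```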

articles/iot-operations/connect-to-cloud/howto-configure-adx-endpoint.md

Lines changed: 6 additions & 6 deletions
@@ -6,7 +6,7 @@ ms.author: patricka
 ms.service: azure-iot-operations
 ms.subservice: azure-data-flows
 ms.topic: how-to
-ms.date: 11/04/2024
+ms.date: 04/03/2025
 ai-usage: ai-assisted
 
 #CustomerIntent: As an operator, I want to understand how to configure data flow endpoints for Azure Data Explorer in Azure IoT Operations so that I can send data to Azure Data Explorer.
@@ -65,7 +65,7 @@ If using system-assigned managed identity, in Azure portal, go to your Azure IoT
 
 <!-- TODO: use the data ingest URI for host? -->
 
-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)
 
 1. In the operations experience, select the **Data flow endpoints** tab.
 1. Under **Create new data flow endpoint**, select **Azure Data Explorer** > **New**.
@@ -172,7 +172,7 @@ Before you configure the data flow endpoint, assign a role to the Azure IoT Oper
 
 Then, configure the data flow endpoint with system-assigned managed identity settings.
 
-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)
 
 In the operations experience data flow endpoint settings page, select the **Basic** tab then choose **Authentication method** > **System assigned managed identity**.
 
@@ -201,7 +201,7 @@ dataExplorerSettings:
 If you need to override the system-assigned managed identity audience, you can specify the `audience` setting.
 
 
-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)
 
 In most cases, you don't need to specify other settings. This configuration creates a managed identity with the default audience `https://api.kusto.windows.net`.
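As a sketch of the audience override mentioned above, assuming the `audience` key sits under the system-assigned identity settings as it does for the other endpoint types (the cluster URI is a placeholder):

```yaml
dataExplorerSettings:
  authentication:
    method: SystemAssignedManagedIdentity
    systemAssignedManagedIdentitySettings:
      # Replaces the default https://api.kusto.windows.net audience.
      audience: "https://<cluster>.<region>.kusto.windows.net"
```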

@@ -242,7 +242,7 @@ Before you configure the data flow endpoint, assign a role to the user-assigned
 
 Then, configure the data flow endpoint with user-assigned managed identity settings.
 
-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)
 
 In the operations experience data flow endpoint settings page, select the **Basic** tab then choose **Authentication method** > **User assigned managed identity**.
 
@@ -294,7 +294,7 @@ Use the `batching` settings to configure the maximum number of messages and the
 
 For example, to configure the maximum number of messages to 1000 and the maximum latency to 100 seconds, use the following settings:
 
-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)
 
 In the operations experience, select the **Advanced** tab for the data flow endpoint.

articles/iot-operations/connect-to-cloud/howto-configure-dataflow-endpoint.md

Lines changed: 3 additions & 3 deletions
@@ -6,7 +6,7 @@ ms.author: patricka
 ms.service: azure-iot-operations
 ms.subservice: azure-data-flows
 ms.topic: how-to
-ms.date: 11/01/2024
+ms.date: 04/03/2025
 
 #CustomerIntent: As an operator, I want to understand how to configure source and destination endpoints so that I can create a data flow.
 ---
@@ -59,7 +59,7 @@ To make it easier to reuse endpoints, the MQTT or Kafka topic filter isn't part
 
 For example, you can use the default MQTT broker data flow endpoint. You can use it for both the source and destination with different topic filters:
 
-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)
 
 :::image type="content" source="media/howto-configure-dataflow-endpoint/create-dataflow-mq-mq.png" alt-text="Screenshot using operations experience to create a data flow from MQTT to MQTT.":::
 
@@ -123,7 +123,7 @@ spec:
 
 Similarly, you can create multiple data flows that use the same MQTT endpoint for other endpoints and topics. For example, you can use the same MQTT endpoint for a data flow that sends data to an Event Hubs endpoint.
 
-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)
 
 :::image type="content" source="media/howto-configure-dataflow-endpoint/create-dataflow-mq-kafka.png" alt-text="Screenshot using operations experience to create a data flow from MQTT to Kafka.":::
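In the Kubernetes flavor of the endpoint-reuse pattern described above, a data flow that points both ends at the same MQTT broker endpoint might look like this sketch (endpoint name and topic filters are placeholders; only the topics differ between source and destination):

```yaml
sourceSettings:
  endpointRef: default            # the shared MQTT broker endpoint...
  dataSources:
    - thermostats/+/telemetry/#   # ...reading from one topic filter
destinationSettings:
  endpointRef: default            # ...reused as the destination
  dataDestination: factory/telemetry
```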

0 commit comments