articles/app-service/app-service-hybrid-connections.md (10 additions, 3 deletions)
@@ -4,7 +4,7 @@ description: Learn how to create and use hybrid connections in Azure App Service
 author: madsd
 ms.assetid: 66774bde-13f5-45d0-9a70-4e9536a4f619
 ms.topic: article
-ms.date: 04/10/2025
+ms.date: 04/29/2025
 ms.author: madsd
 ms.custom: "UpdateFrequency3, fasttrack-edit"
 #customer intent: As an app developer, I want to understand the usage of Hybrid Connections to provide access to apps in Azure App Service.
@@ -237,10 +237,10 @@ The status of **Connected** means that at least one HCM is configured with that

 :::image type="content" source="media/app-service-hybrid-connections/hybrid-connections-service-bus-endpoint.png" alt-text="Screenshot of Hybrid Connection Service Bus endpoint.":::

-- The Service Bus gateways are the resources that accept the request into the Hybrid Connection and pass it through the Azure Relay. You need to allowlist all 128 of the gateways. The gateways are in the format: `G#-prod-[stamp]-sb.servicebus.windows.net`. The number sign, `#`, is a number between 0 and 127 and `stamp` is the name of the instance within your Azure data center where your Service Bus endpoint exists.
+- The Service Bus gateways are the resources that accept the request into the Hybrid Connection and pass it through the Azure Relay. You need to allowlist all of the gateways. The gateways are in the format: `G#-prod-[stamp]-sb.servicebus.windows.net` and `GV#-prod-[stamp]-sb.servicebus.windows.net`. The number sign, `#`, is a number between 0 and 127 and `stamp` is the name of the instance within your Azure data center where your Service Bus endpoint exists.

 - If you can use a wildcard, you can allowlist *\*.servicebus.windows.net*.
-- If you can't use a wildcard, you must allowlist all 128 gateways.
+- If you can't use a wildcard, you must allowlist all 256 of the gateways.

 You can find out the stamp using *nslookup* on the Service Bus endpoint URL.
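The changed bullet above gives the full naming pattern, so the allowlist can be generated instead of typed out by hand. A minimal shell sketch, assuming the `sn3-010` stamp from the example output below and a firewall that accepts a plain list of hostnames; replace the stamp with whatever *nslookup* returns for your namespace:

```bash
# Expand the documented pattern into the 256 gateway hostnames for one stamp:
# G0 through G127 plus GV0 through GV127. The stamp value is illustrative.
STAMP="sn3-010"

for i in $(seq 0 127); do
  echo "G${i}-prod-${STAMP}-sb.servicebus.windows.net"
  echo "GV${i}-prod-${STAMP}-sb.servicebus.windows.net"
done
```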
@@ -255,6 +255,13 @@ The status of **Connected** means that at least one HCM is configured with that
 ...
 G126-prod-sn3-010-sb.servicebus.windows.net
 G127-prod-sn3-010-sb.servicebus.windows.net
+GV0-prod-sn3-010-sb.servicebus.windows.net
+GV1-prod-sn3-010-sb.servicebus.windows.net
+GV2-prod-sn3-010-sb.servicebus.windows.net
+GV3-prod-sn3-010-sb.servicebus.windows.net
+...
+GV126-prod-sn3-010-sb.servicebus.windows.net
+GV127-prod-sn3-010-sb.servicebus.windows.net

 If your status says **Connected** but your app can't reach your endpoint then:
articles/iot-operations/connect-to-cloud/howto-configure-adlsv2-endpoint.md (18 additions, 17 deletions)
@@ -6,7 +6,7 @@ ms.author: patricka
 ms.service: azure-iot-operations
 ms.subservice: azure-data-flows
 ms.topic: how-to
-ms.date: 11/07/2024
+ms.date: 04/01/2025
 ai-usage: ai-assisted

 #CustomerIntent: As an operator, I want to understand how to configure data flow endpoints for Azure Data Lake Storage Gen2 in Azure IoT Operations so that I can send data to Azure Data Lake Storage Gen2.

-To send data to Azure Data Lake Storage Gen2 in Azure IoT Operations, you can configure a data flow endpoint. This configuration allows you to specify the destination endpoint, authentication method, table, and other settings.
+Send data to Azure Data Lake Storage Gen2 in Azure IoT Operations by configuring a data flow endpoint. This configuration allows you to specify the destination endpoint, authentication method, table, and other settings.

 ## Prerequisites

-- An instance of [Azure IoT Operations](../deploy-iot-ops/howto-deploy-iot-operations.md)
-- An [Azure Data Lake Storage Gen2 account](../../storage/blobs/create-data-lake-storage-account.md)
-- A pre-created storage container in the storage account
+- An instance of [Azure IoT Operations](../deploy-iot-ops/howto-deploy-iot-operations.md).
+- An [Azure Data Lake Storage Gen2 account](../../storage/blobs/create-data-lake-storage-account.md).
+- A storage container that is already created in the storage account.

 ## Assign permission to managed identity

-To configure a data flow endpoint for Azure Data Lake Storage Gen2, we recommend using either a user-assigned or system-assigned managed identity. This approach is secure and eliminates the need for managing credentials manually.
+To configure a data flow endpoint for Azure Data Lake Storage Gen2, use either a user-assigned or system-assigned managed identity. This approach is secure and removes the need to manage credentials manually.

 After the Azure Data Lake Storage Gen2 is created, you need to assign a role to the Azure IoT Operations managed identity that grants permission to write to the storage account.

-If using system-assigned managed identity, in Azure portal, go to your Azure IoT Operations instance and select **Overview**. Copy the name of the extension listed after **Azure IoT Operations Arc extension**. For example, *azure-iot-operations-xxxx7*. Your system-assigned managed identity can be found using the same name of the Azure IoT Operations Arc extension.
+If you're using a system-assigned managed identity, in the Azure portal, go to your Azure IoT Operations instance and select **Overview**. Copy the name of the extension listed after **Azure IoT Operations Arc extension**. For example, *azure-iot-operations-xxxx7*. Your system-assigned managed identity can be found using the same name of the Azure IoT Operations Arc extension.

 Then, go to the Azure Storage account > **Access control (IAM)** > **Add role assignment**.

-1. On the **Role** tab select an appropriate role like`Storage Blob Data Contributor`. This gives the managed identity the necessary permissions to write to the Azure Storage blob containers. To learn more, see [Authorize access to blobs using Microsoft Entra ID](../../storage/blobs/authorize-access-azure-active-directory.md).
+1. On the **Role** tab, select an appropriate role, such as `Storage Blob Data Contributor`. This gives the managed identity the necessary permissions to write to the Azure Storage blob containers. To learn more, see [Authorize access to blobs using Microsoft Entra ID](../../storage/blobs/authorize-access-azure-active-directory.md).
 1. On the **Members** tab:
-    1. If using system-assigned managed identity, for **Assign access to**, select **User, group, or service principal** option, then select **+ Select members** and search for the name of the Azure IoT Operations Arc extension.
-    1. If using user-assigned managed identity, for **Assign access to**, select **Managed identity** option, then select **+ Select members** and search for your [user-assigned managed identity set up for cloud connections](../deploy-iot-ops/howto-enable-secure-settings.md#set-up-a-user-assigned-managed-identity-for-cloud-connections).
+    1. If you're using a system-assigned managed identity, for **Assign access to**, select **User, group, or service principal**, then select **+ Select members** and search for the name of the Azure IoT Operations Arc extension.
+    1. If you're using a user-assigned managed identity, for **Assign access to**, select **Managed identity**, then select **+ Select members** and search for your [user-assigned managed identity set up for cloud connections](../deploy-iot-ops/howto-enable-secure-settings.md#set-up-a-user-assigned-managed-identity-for-cloud-connections).

 ## Create data flow endpoint for Azure Data Lake Storage Gen2

-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)

 1. In the IoT Operations portal, select the **Data flow endpoints** tab.
 1. Under **Create new data flow endpoint**, select **Azure Data Lake Storage (2nd generation)** > **New**.
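The role assignment from the **Assign permission to managed identity** steps above can also be scripted. A hedged Azure CLI sketch; the storage account, resource group, and principal ID are placeholders you look up first (for a system-assigned identity, use the principal ID of the Azure IoT Operations Arc extension):

```bash
# Grant the Azure IoT Operations managed identity write access to the storage account.
STORAGE_ID=$(az storage account show \
  --name <storage-account-name> \
  --resource-group <resource-group> \
  --query id --output tsv)

az role assignment create \
  --role "Storage Blob Data Contributor" \
  --assignee-object-id <managed-identity-principal-id> \
  --assignee-principal-type ServicePrincipal \
  --scope "$STORAGE_ID"
```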
@@ -57,6 +57,7 @@ Then, go to the Azure Storage account > **Access control (IAM)** > **Add role as
 | Authentication method | The method used for authentication. We recommend that you choose [*System assigned managed identity*](#system-assigned-managed-identity) or [*User assigned managed identity*](#user-assigned-managed-identity). |
 | Client ID | The client ID of the user-assigned managed identity. Required if using *User assigned managed identity*. |
 | Tenant ID | The tenant ID of the user-assigned managed identity. Required if using *User assigned managed identity*. |
+| Synced secret name | The reference name for the secret in the data flow endpoint settings and Kubernetes cluster. Required if using *Access token*. |
 | Access token secret name | The name of the Kubernetes secret containing the SAS token. Required if using *Access token*. |

 1. Select **Apply** to provision the endpoint.
@@ -136,7 +137,7 @@ Follow the steps in the [access token](#access-token) section to get a SAS token

 Then, create the *DataflowEndpoint* resource and specify the access token authentication method. Here, replace `<SAS_SECRET_NAME>` with the name of the secret containing the SAS token and other placeholder values.

-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)

 See the [access token](#access-token) section for steps to create a secret in the operations experience web UI.
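For the Kubernetes path, the secret referenced by `<SAS_SECRET_NAME>` has to exist on the cluster before the endpoint can use it. A sketch with two assumptions called out in the comments: that Azure IoT Operations runs in the default `azure-iot-operations` namespace, and that the token is stored under a key named `accessToken`.

```bash
# Create the secret that <SAS_SECRET_NAME> refers to.
# Assumption: the azure-iot-operations namespace and the accessToken key name
# may differ in your deployment; adjust both to match your configuration.
kubectl create secret generic <SAS_SECRET_NAME> \
  --namespace azure-iot-operations \
  --from-literal=accessToken='sv=...&sig=...'
```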
@@ -228,7 +229,7 @@ Before you configure the data flow endpoint, assign a role to the Azure IoT Oper

 Then, configure the data flow endpoint with system-assigned managed identity settings.

-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)

 In the operations experience data flow endpoint settings page, select the **Basic** tab then choose **Authentication method** > **System assigned managed identity**.
@@ -258,7 +259,7 @@ dataLakeStorageSettings:

 If you need to override the system-assigned managed identity audience, you can specify the `audience` setting.

-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)

 In most cases, you don't need to specify a service audience. Not specifying an audience creates a managed identity with the default audience scoped to your storage account.
@@ -299,7 +300,7 @@ Before you configure the data flow endpoint, assign a role to the user-assigned

 Then, configure the data flow endpoint with user-assigned managed identity settings.

-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)

 In the operations experience data flow endpoint settings page, select the **Basic** tab then choose **Authentication method** > **User assigned managed identity**.
@@ -352,7 +353,7 @@ Get a [SAS token](../../storage/common/storage-sas-overview.md) for an Azure Dat

 To enhance security and follow the principle of least privilege, you can generate a SAS token for a specific container. To prevent authentication errors, ensure that the container specified in the SAS token matches the data flow destination setting in the configuration.

-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)

 > [!IMPORTANT]
 > To use the operations experience web UI to manage secrets, Azure IoT Operations must first be enabled with secure settings by configuring an Azure Key Vault and enabling workload identities. To learn more, see [Enable secure settings in Azure IoT Operations deployment](../deploy-iot-ops/howto-enable-secure-settings.md).
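The container-scoped SAS token itself can be generated with the Azure CLI. A sketch only; the permission set and expiry are illustrative, and the container name must match the data flow destination setting as noted above.

```bash
# Generate a user delegation SAS scoped to a single container.
az storage container generate-sas \
  --account-name <storage-account-name> \
  --name <container-name> \
  --permissions acdlrw \
  --expiry 2026-01-01T00:00Z \
  --auth-mode login \
  --as-user \
  --output tsv
```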
@@ -419,7 +420,7 @@ Use the `batching` settings to configure the maximum number of messages and the

 For example, to configure the maximum number of messages to 1000 and the maximum latency to 100 seconds, use the following settings:

-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)

 In the operations experience, select the **Advanced** tab for the data flow endpoint.
articles/iot-operations/connect-to-cloud/howto-configure-adx-endpoint.md (6 additions, 6 deletions)
@@ -6,7 +6,7 @@ ms.author: patricka
 ms.service: azure-iot-operations
 ms.subservice: azure-data-flows
 ms.topic: how-to
-ms.date: 11/04/2024
+ms.date: 04/03/2025
 ai-usage: ai-assisted

 #CustomerIntent: As an operator, I want to understand how to configure data flow endpoints for Azure Data Explorer in Azure IoT Operations so that I can send data to Azure Data Explorer.
@@ -65,7 +65,7 @@ If using system-assigned managed identity, in Azure portal, go to your Azure IoT

 <!-- TODO: use the data ingest URI for host? -->

-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)

 1. In the operations experience, select the **Data flow endpoints** tab.
 1. Under **Create new data flow endpoint**, select **Azure Data Explorer** > **New**.
@@ -172,7 +172,7 @@ Before you configure the data flow endpoint, assign a role to the Azure IoT Oper

 Then, configure the data flow endpoint with system-assigned managed identity settings.

-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)

 In the operations experience data flow endpoint settings page, select the **Basic** tab then choose **Authentication method** > **System assigned managed identity**.
@@ -201,7 +201,7 @@ dataExplorerSettings:
 If you need to override the system-assigned managed identity audience, you can specify the `audience` setting.

-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)

 In most cases, you don't need to specify other settings. This configuration creates a managed identity with the default audience `https://api.kusto.windows.net`.
@@ -242,7 +242,7 @@ Before you configure the data flow endpoint, assign a role to the user-assigned

 Then, configure the data flow endpoint with user-assigned managed identity settings.

-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)

 In the operations experience data flow endpoint settings page, select the **Basic** tab then choose **Authentication method** > **User assigned managed identity**.
@@ -294,7 +294,7 @@ Use the `batching` settings to configure the maximum number of messages and the

 For example, to configure the maximum number of messages to 1000 and the maximum latency to 100 seconds, use the following settings:

-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)

 In the operations experience, select the **Advanced** tab for the data flow endpoint.
articles/iot-operations/connect-to-cloud/howto-configure-dataflow-endpoint.md (3 additions, 3 deletions)
@@ -6,7 +6,7 @@ ms.author: patricka
 ms.service: azure-iot-operations
 ms.subservice: azure-data-flows
 ms.topic: how-to
-ms.date: 11/01/2024
+ms.date: 04/03/2025

 #CustomerIntent: As an operator, I want to understand how to configure source and destination endpoints so that I can create a data flow.
 ---
@@ -59,7 +59,7 @@ To make it easier to reuse endpoints, the MQTT or Kafka topic filter isn't part

 For example, you can use the default MQTT broker data flow endpoint. You can use it for both the source and destination with different topic filters:

-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)

 :::image type="content" source="media/howto-configure-dataflow-endpoint/create-dataflow-mq-mq.png" alt-text="Screenshot using operations experience to create a data flow from MQTT to MQTT.":::
@@ -123,7 +123,7 @@ spec:

 Similarly, you can create multiple data flows that use the same MQTT endpoint for other endpoints and topics. For example, you can use the same MQTT endpoint for a data flow that sends data to an Event Hubs endpoint.

-# [Portal](#tab/portal)
+# [Operations experience](#tab/portal)

 :::image type="content" source="media/howto-configure-dataflow-endpoint/create-dataflow-mq-kafka.png" alt-text="Screenshot using operations experience to create a data flow from MQTT to Kafka.":::