
Commit 57ae0b4

Merge pull request #289830 from PatAltimore/patricka-m3-dataflow
Add dataflow endpoint system identity steps
2 parents: cd396b4 + 5cf93f5

File tree

5 files changed: +75 −65 lines changed

articles/iot-operations/connect-to-cloud/howto-configure-adlsv2-endpoint.md

Lines changed: 15 additions & 21 deletions
@@ -29,14 +29,6 @@ To send data to Azure Data Lake Storage Gen2 in Azure IoT Operations Preview, yo
 
 To configure a dataflow endpoint for Azure Data Lake Storage Gen2, we suggest using the managed identity of the Azure Arc-enabled Kubernetes cluster. This approach is secure and eliminates the need for secret management. Alternatively, you can authenticate with the storage account using an access token. When using an access token, you would need to create a Kubernetes secret containing the SAS token.
 
-### Use managed identity authentication
-
-First, in Azure portal, go to the Arc-connected Kubernetes cluster and select **Settings** > **Extensions**. In the extension list, find the name of your Azure IoT Operations extension. Copy the name of the extension.
-
-Then, assign a role to the managed identity that grants permission to write to the storage account, such as *Storage Blob Data Contributor*. To learn more, see [Authorize access to blobs using Microsoft Entra ID](../../storage/blobs/authorize-access-azure-active-directory.md).
-
-Finally, create the *DataflowEndpoint* resource and specify the managed identity authentication method. Replace the placeholder values like `<ENDPOINT_NAME>` with your own.
-
 # [Portal](#tab/portal)
 
 1. In the IoT Operations portal, select the **Dataflow endpoints** tab.

@@ -230,11 +222,15 @@ For more information about enabling secure settings by configuring an Azure Key
 
 ### System-assigned managed identity
 
-Using the system-assigned managed identity is the recommended authentication method for Azure IoT Operations. Azure IoT Operations creates the managed identity automatically and assigns it to the Azure Arc-enabled Kubernetes cluster. It eliminates the need for secret management and allows for seamless authentication with the Azure Data Lake Storage Gen2 account.
+Using the system-assigned managed identity is the recommended authentication method for Azure IoT Operations. Azure IoT Operations creates the managed identity automatically and assigns it to the Azure Arc-enabled Kubernetes cluster. It eliminates the need for secret management and allows for seamless authentication.
 
 Before creating the dataflow endpoint, assign a role to the managed identity that has write permission to the storage account. For example, you can assign the *Storage Blob Data Contributor* role. To learn more about assigning roles to blobs, see [Authorize access to blobs using Microsoft Entra ID](../../storage/blobs/authorize-access-azure-active-directory.md).
 
-To use system-assigned managed identity, specify the managed identity authentication method in the *DataflowEndpoint* resource. In most cases, you don't need to specify other settings. Not specifying an audience creates a managed identity with the default audience scoped to your storage account.
+1. In Azure portal, go to your Azure IoT Operations instance and select **Overview**.
+1. Copy the name of the extension listed after **Azure IoT Operations Arc extension**. For example, *azure-iot-operations-xxxx7*.
+1. Search for the managed identity in the Azure portal by using the name of the extension. For example, search for *azure-iot-operations-xxxx7*.
+1. Assign a role to the Azure IoT Operations Arc extension managed identity that grants permission to write to the storage account, such as *Storage Blob Data Contributor*. To learn more, see [Authorize access to blobs using Microsoft Entra ID](../../storage/blobs/authorize-access-azure-active-directory.md).
+1. Create the *DataflowEndpoint* resource and specify the managed identity authentication method.

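A rough Azure CLI equivalent of the role-assignment steps above might look like the following sketch. The extension name, subscription, resource group, and storage account are placeholders, and looking the identity up by its display name is an assumption about how it appears in Microsoft Entra ID.

```bash
# Find the object ID of the Azure IoT Operations Arc extension's managed identity.
# The display name is assumed to match the extension name, for example azure-iot-operations-xxxx7.
PRINCIPAL_ID=$(az ad sp list --display-name "<AIO_EXTENSION_NAME>" --query "[0].id" -o tsv)

# Grant write access on the storage account, for example Storage Blob Data Contributor.
az role assignment create \
  --assignee "$PRINCIPAL_ID" \
  --role "Storage Blob Data Contributor" \
  --scope "/subscriptions/<SUBSCRIPTION_ID>/resourceGroups/<RESOURCE_GROUP>/providers/Microsoft.Storage/storageAccounts/<STORAGE_ACCOUNT>"
```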

 # [Portal](#tab/portal)

@@ -309,23 +305,14 @@ Get a [SAS token](../../storage/common/storage-sas-overview.md) for an Azure Dat
 
 To enhance security and follow the principle of least privilege, you can generate a SAS token for a specific container. To prevent authentication errors, ensure that the container specified in the SAS token matches the dataflow destination setting in the configuration.
 
-Create a Kubernetes secret with the SAS token.
-
-```bash
-kubectl create secret generic <SAS_SECRET_NAME> -n azure-iot-operations \
-  --from-literal=accessToken='sv=2022-11-02&ss=b&srt=c&sp=rwdlax&se=2023-07-22T05:47:40Z&st=2023-07-21T21:47:40Z&spr=https&sig=<signature>'
-```
-
-You can also use the IoT Operations portal to create and manage the secret. To learn more, see [Create and manage secrets in Azure IoT Operations Preview](../deploy-iot-ops/howto-manage-secrets.md).
-
-Finally, create the *DataflowEndpoint* resource with the secret reference.
-
 # [Portal](#tab/portal)
 
 In the operations experience dataflow endpoint settings page, select the **Basic** tab then choose **Authentication method** > **Access token**.
 
 Enter the access token secret name you created in **Access token secret name**.
 
+To learn more about secrets, see [Create and manage secrets in Azure IoT Operations Preview](../secure-iot-ops/howto-manage-secrets.md).
+
 # [Bicep](#tab/bicep)
 
 ```bicep

@@ -341,6 +328,13 @@ dataLakeStorageSettings: {
 
 # [Kubernetes](#tab/kubernetes)
 
+Create a Kubernetes secret with the SAS token.
+
+```bash
+kubectl create secret generic <SAS_SECRET_NAME> -n azure-iot-operations \
+  --from-literal=accessToken='sv=2022-11-02&ss=b&srt=c&sp=rwdlax&se=2023-07-22T05:47:40Z&st=2023-07-21T21:47:40Z&spr=https&sig=<signature>'
+```
+
 ```yaml
 dataLakeStorageSettings:
   authentication:

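The SAS token stored in that secret still has to be generated somewhere; a hedged Azure CLI sketch for a container-scoped token follows, with the account, container, and expiry as placeholders.

```bash
# Generate a SAS token limited to one container. Match the container to the
# dataflow destination setting to avoid authentication errors.
az storage container generate-sas \
  --account-name "<STORAGE_ACCOUNT>" \
  --name "<CONTAINER_NAME>" \
  --permissions racwdl \
  --expiry "<YYYY-MM-DDTHH:MMZ>" \
  --auth-mode login --as-user \
  -o tsv
```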

articles/iot-operations/connect-to-cloud/howto-configure-adx-endpoint.md

Lines changed: 2 additions & 2 deletions
@@ -6,7 +6,7 @@ ms.author: patricka
 ms.service: azure-iot-operations
 ms.subservice: azure-data-flows
 ms.topic: how-to
-ms.date: 10/30/2024
+ms.date: 11/04/2024
 ai-usage: ai-assisted
 
 #CustomerIntent: As an operator, I want to understand how to configure dataflow endpoints for Azure Data Explorer in Azure IoT Operations so that I can send data to Azure Data Explorer.

@@ -159,7 +159,7 @@ To use these authentication methods, the Azure IoT Operations Arc extension must
 
 ### System-assigned managed identity
 
-Using the system-assigned managed identity is the recommended authentication method for Azure IoT Operations. Azure IoT Operations creates the managed identity automatically and assigns it to the Azure Arc-enabled Kubernetes cluster. It eliminates the need for secret management and allows for seamless authentication with Azure Data Explorer.
+Using the system-assigned managed identity is the recommended authentication method for Azure IoT Operations. Azure IoT Operations creates the managed identity automatically and assigns it to the Azure Arc-enabled Kubernetes cluster. It eliminates the need for secret management and allows for seamless authentication.
 
 In the *DataflowEndpoint* resource, specify the managed identity authentication method. In most cases, you don't need to specify other settings. This configuration creates a managed identity with the default audience `https://api.kusto.windows.net`.

articles/iot-operations/connect-to-cloud/howto-configure-fabric-endpoint.md

Lines changed: 6 additions & 5 deletions
@@ -6,7 +6,7 @@ ms.author: patricka
 ms.service: azure-iot-operations
 ms.subservice: azure-data-flows
 ms.topic: how-to
-ms.date: 10/30/2024
+ms.date: 11/04/2024
 ai-usage: ai-assisted
 
 #CustomerIntent: As an operator, I want to understand how to configure dataflow endpoints for Microsoft Fabric OneLake in Azure IoT Operations so that I can send data to Microsoft Fabric OneLake.

@@ -32,9 +32,10 @@ To send data to Microsoft Fabric OneLake in Azure IoT Operations Preview, you ca
 
 To configure a dataflow endpoint for Microsoft Fabric OneLake, we suggest using the managed identity of the Azure Arc-enabled Kubernetes cluster. This approach is secure and eliminates the need for secret management.
 
-1. In Azure portal, go to the Arc-connected Kubernetes cluster and select **Settings** > **Extensions**. In the extension list, find the name of your Azure IoT Operations extension. Copy the name of the extension.
-1. In the Microsoft Fabric workspace you created, select **Manage access** > **+ Add people or groups**. Search for the Azure IoT Operations Preview Arc extension by its name and select it. Select **Contributor** as the role, then select **Add**.
-1. reate the *DataflowEndpoint* resource and specify the managed identity authentication method. Replace the placeholder values like `<ENDPOINT_NAME>` with your own.
+1. In Azure portal, go to your Azure IoT Operations instance and select **Overview**.
+1. Copy the name of the extension listed after **Azure IoT Operations Arc extension**. For example, *azure-iot-operations-xxxx7*.
+1. In the Microsoft Fabric workspace you created, select **Manage access** > **+ Add people or groups**. Search for the Azure IoT Operations Arc extension by its name and select it. Select **Contributor** as the role, then select **Add**.
+1. Create the *DataflowEndpoint* resource and specify the managed identity authentication method.

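To find the extension's managed identity outside the portal before adding it to the Fabric workspace, something like the following Azure CLI lookup could work; the display-name match is an assumption and the extension name is a placeholder.

```bash
# List service principals whose display name matches the Azure IoT Operations Arc extension,
# showing the object ID to add as a Contributor in the Fabric workspace.
az ad sp list --display-name "<AIO_EXTENSION_NAME>" \
  --query "[].{name:displayName, objectId:id}" -o table
```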

 # [Portal](#tab/portal)

@@ -175,7 +176,7 @@ To learn more, see [Give access to a workspace](/fabric/get-started/give-access-
 
 ### System-assigned managed identity
 
-Using the system-assigned managed identity is the recommended authentication method for Azure IoT Operations. Azure IoT Operations creates the managed identity automatically and assigns it to the Azure Arc-enabled Kubernetes cluster. It eliminates the need for secret management and allows for seamless authentication with Azure Data Explorer.
+Using the system-assigned managed identity is the recommended authentication method for Azure IoT Operations. Azure IoT Operations creates the managed identity automatically and assigns it to the Azure Arc-enabled Kubernetes cluster. It eliminates the need for secret management and allows for seamless authentication.
 
 In the *DataflowEndpoint* resource, specify the managed identity authentication method. In most cases, you don't need to specify other settings. This configuration creates a managed identity with the default audience.

articles/iot-operations/connect-to-cloud/howto-configure-kafka-endpoint.md

Lines changed: 38 additions & 27 deletions
@@ -6,7 +6,7 @@ ms.author: patricka
 ms.service: azure-iot-operations
 ms.subservice: azure-data-flows
 ms.topic: how-to
-ms.date: 10/30/2024
+ms.date: 11/04/2024
 ai-usage: ai-assisted
 
 #CustomerIntent: As an operator, I want to understand how to configure dataflow endpoints for Kafka in Azure IoT Operations so that I can send data to and from Kafka endpoints.

@@ -35,7 +35,12 @@ Next, [create an event hub in the namespace](../../event-hubs/event-hubs-create.
 
 ### Assign the managed identity to the Event Hubs namespace
 
-To configure a dataflow endpoint for a Kafka endpoint, we recommend using the managed identity of the Azure Arc-enabled Kubernetes cluster. This approach is secure and eliminates the need for secret management. In Azure portal, go to the Arc-connected Kubernetes cluster and select **Settings** > **Extensions**. In the extension list, find the name of your Azure IoT Operations extension. Copy the name of the extension. Then, assign the managed identity to the Event Hubs namespace with the `Azure Event Hubs Data Sender` or `Azure Event Hubs Data Receiver` role using the name of the extension.
+To configure a dataflow endpoint for a Kafka endpoint, we recommend using the managed identity of the Azure Arc-enabled Kubernetes cluster. This approach is secure and eliminates the need for secret management.
+
+1. In Azure portal, go to your Azure IoT Operations instance and select **Overview**.
+1. Copy the name of the extension listed after **Azure IoT Operations Arc extension**. For example, *azure-iot-operations-xxxx7*.
+1. Search for the managed identity in the Azure portal by using the name of the extension. For example, search for *azure-iot-operations-xxxx7*.
+1. Assign the Azure IoT Operations Arc extension managed identity to the Event Hubs namespace with the `Azure Event Hubs Data Sender` or `Azure Event Hubs Data Receiver` role.

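A possible Azure CLI form of that role assignment, for reference; the namespace scope and identity lookup are placeholders and assumptions, and the role should match the direction of your dataflow.

```bash
# Look up the extension's managed identity and grant it send or receive rights
# on the Event Hubs namespace.
PRINCIPAL_ID=$(az ad sp list --display-name "<AIO_EXTENSION_NAME>" --query "[0].id" -o tsv)

az role assignment create \
  --assignee "$PRINCIPAL_ID" \
  --role "Azure Event Hubs Data Sender" \
  --scope "/subscriptions/<SUBSCRIPTION_ID>/resourceGroups/<RESOURCE_GROUP>/providers/Microsoft.EventHub/namespaces/<EVENT_HUBS_NAMESPACE>"
```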

 ### Create dataflow endpoint

@@ -137,18 +142,6 @@ kubectl apply -f <FILE>.yaml
 
 ### Use connection string for authentication to Event Hubs
 
-To use connection string for authentication to Event Hubs, use the SASL authentication method and configure with SASL type as "Plain" and configure name of the secret that contains the connection string.
-
-First, create a Kubernetes secret that contains the connection string. The secret must be in the same namespace as the Kafka dataflow endpoint. The secret must have both the username and password as key-value pairs. For example:
-
-```bash
-kubectl create secret generic <SECRET_NAME> -n azure-iot-operations \
-  --from-literal=username='$ConnectionString' \
-  --from-literal=password='Endpoint=sb://<NAMESPACE>.servicebus.windows.net/;SharedAccessKeyName=<KEY-NAME>;SharedAccessKey=<KEY>'
-```
-> [!TIP]
-> Scoping the connection string to the namespace (as opposed to individual event hubs) allows a dataflow to send and receive messages from multiple different event hubs and Kafka topics.
-
 # [Portal](#tab/portal)
 
 In the operations experience dataflow endpoint settings page, select the **Basic** tab then choose **Authentication method** > **SASL**.

@@ -181,6 +174,18 @@ kafkaSettings: {
 
 # [Kubernetes](#tab/kubernetes)
 
+To use connection string for authentication to Event Hubs, use the SASL authentication method and configure with SASL type as "Plain" and configure name of the secret that contains the connection string.
+
+First, create a Kubernetes secret that contains the connection string. The secret must be in the same namespace as the Kafka dataflow endpoint. The secret must have both the username and password as key-value pairs. For example:
+
+```bash
+kubectl create secret generic <SECRET_NAME> -n azure-iot-operations \
+  --from-literal=username='$ConnectionString' \
+  --from-literal=password='Endpoint=sb://<NAMESPACE>.servicebus.windows.net/;SharedAccessKeyName=<KEY-NAME>;SharedAccessKey=<KEY>'
+```
+> [!TIP]
+> Scoping the connection string to the namespace (as opposed to individual event hubs) allows a dataflow to send and receive messages from multiple different event hubs and Kafka topics.
+
 ```yaml
 kafkaSettings:
   authentication:

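The connection string placed in that secret has to come from the Event Hubs namespace itself; one hedged way to fetch it with the Azure CLI follows, with the resource group, namespace, and authorization-rule name as placeholders.

```bash
# Read the primary connection string for a shared access policy on the namespace.
az eventhubs namespace authorization-rule keys list \
  --resource-group "<RESOURCE_GROUP>" \
  --namespace-name "<EVENT_HUBS_NAMESPACE>" \
  --name RootManageSharedAccessKey \
  --query primaryConnectionString -o tsv
```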
@@ -320,6 +325,11 @@ kafkaSettings: {
 
 # [Kubernetes](#tab/kubernetes)
 
+```bash
+kubectl create secret generic sasl-secret -n azure-iot-operations \
+  --from-literal=token='<YOUR_SASL_TOKEN>'
+```
+
 ```yaml
 kafkaSettings:
   authentication:

@@ -339,10 +349,6 @@ The supported SASL types are:
 
 The secret must be in the same namespace as the Kafka dataflow endpoint. The secret must have the SASL token as a key-value pair. For example:
 
-```bash
-kubectl create secret generic sasl-secret -n azure-iot-operations \
-  --from-literal=token='<YOUR_SASL_TOKEN>'
-```
 
 <!-- TODO: double check! -->

@@ -379,6 +385,14 @@ kafkaSettings: {
 
 # [Kubernetes](#tab/kubernetes)
 
+The secret must be in the same namespace as the Kafka dataflow endpoint. Use Kubernetes TLS secret containing the public certificate and private key. For example:
+
+```bash
+kubectl create secret tls my-tls-secret -n azure-iot-operations \
+  --cert=path/to/cert/file \
+  --key=path/to/key/file
+```
+
 ```yaml
 kafkaSettings:
   authentication:

@@ -389,19 +403,16 @@ kafkaSettings:
 
 ---
 
-The secret must be in the same namespace as the Kafka dataflow endpoint. Use Kubernetes TLS secret containing the public certificate and private key. For example:
-
-```bash
-kubectl create secret tls my-tls-secret -n azure-iot-operations \
-  --cert=path/to/cert/file \
-  --key=path/to/key/file
-```
 
 ### System-assigned managed identity
 
-To use system-assigned managed identity for authentication, first assign a role to the Azure IoT Operation managed identity that grants permission to send and receive messages from Event Hubs, such as Azure Event Hubs Data Owner or Azure Event Hubs Data Sender/Receiver. To learn more, see [Authenticate an application with Microsoft Entra ID to access Event Hubs resources](../../event-hubs/authenticate-application.md#built-in-roles-for-azure-event-hubs).
+To use system-assigned managed identity for authentication, assign a role to the Azure IoT Operation managed identity that grants permission to send and receive messages from Event Hubs.
 
-Then, specify the managed identity authentication method in the Kafka settings. In most cases, you don't need to specify other settings.
+1. In Azure portal, go to your Azure IoT Operations instance and select **Overview**.
+1. Copy the name of the extension listed after **Azure IoT Operations Arc extension**. For example, *azure-iot-operations-xxxx7*.
+1. Search for the managed identity in the Azure portal by using the name of the extension. For example, search for *azure-iot-operations-xxxx7*.
+1. Assign a role to the Azure IoT Operations Arc extension managed identity that grants permission to send and receive messages such as *Azure Event Hubs Data Owner*, *Azure Event Hubs Data Sender*, or *Azure Event Hubs Data Receiver*. To learn more, see [Authenticate an application with Microsoft Entra ID to access Event Hubs resources](../../event-hubs/authenticate-application.md#built-in-roles-for-azure-event-hubs).
+1. Specify the managed identity authentication method in the Kafka settings. In most cases, you don't need to specify other settings.
 
 # [Portal](#tab/portal)

articles/iot-operations/connect-to-cloud/howto-configure-mqtt-endpoint.md

Lines changed: 14 additions & 10 deletions
@@ -415,14 +415,6 @@ The following authentication methods are available for MQTT broker dataflow endp
 
 Many MQTT brokers, like Event Grid, support X.509 authentication. Dataflows can present a client X.509 certificate and negotiate the TLS communication.
 
-To use X.509 certificate authentication, you need to create a secret with the certificate and private key. Use the Kubernetes TLS secret containing the public certificate and private key. For example:
-
-```bash
-kubectl create secret tls my-tls-secret -n azure-iot-operations \
-  --cert=path/to/cert/file \
-  --key=path/to/key/file
-```
-
 # [Portal](#tab/portal)
 
 In the operations experience dataflow endpoint settings page, select the **Basic** tab then choose **Authentication method** > **X509 certificate**.

@@ -449,6 +441,14 @@ mqttSettings: {
 
 # [Kubernetes](#tab/kubernetes)
 
+To use X.509 certificate authentication, you need to create a secret with the certificate and private key. Use the Kubernetes TLS secret containing the public certificate and private key. For example:
+
+```bash
+kubectl create secret tls my-tls-secret -n azure-iot-operations \
+  --cert=path/to/cert/file \
+  --key=path/to/key/file
+```
+
 ```yaml
 mqttSettings:
   authentication:

@@ -463,9 +463,13 @@ mqttSettings:
 
 To use system-assigned managed identity for authentication, you don't need to create a secret. The system-assigned managed identity is used to authenticate with the MQTT broker.
 
-Before you configure the endpoint, make sure that the Azure IoT Operations managed identity has the necessary permissions to connect to the MQTT broker. For example, with Azure Event Grid MQTT broker, assign the managed identity to the Event Grid namespace or topic space with [an appropriate role](../../event-grid/mqtt-client-microsoft-entra-token-and-rbac.md#authorization-to-grant-access-permissions).
+Before you configure the endpoint, make sure that the Azure IoT Operations managed identity has the necessary permissions to connect to the MQTT broker.
 
-Then, configure the endpoint with system-assigned managed identity settings.
+1. In Azure portal, go to your Azure IoT Operations instance and select **Overview**.
+1. Copy the name of the extension listed after **Azure IoT Operations Arc extension**. For example, *azure-iot-operations-xxxx7*.
+1. Search for the managed identity in the Azure portal by using the name of the extension. For example, search for *azure-iot-operations-xxxx7*.
+1. Assign a role to the Azure IoT Operations Arc extension managed identity that grants permission to connect to the MQTT broker. For example, with Azure Event Grid MQTT broker, assign the managed identity to the Event Grid namespace or topic space with [an appropriate role](../../event-grid/mqtt-client-microsoft-entra-token-and-rbac.md#authorization-to-grant-access-permissions).
+1. Configure the endpoint with system-assigned managed identity settings.

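For the Event Grid MQTT broker case mentioned in those steps, the role assignment could be scripted roughly as follows; the role name, identity lookup, and namespace scope are assumptions to adapt to your setup (a subscriber role would apply if the dataflow only receives messages).

```bash
# Grant the extension's managed identity publish rights on the Event Grid namespace.
PRINCIPAL_ID=$(az ad sp list --display-name "<AIO_EXTENSION_NAME>" --query "[0].id" -o tsv)

az role assignment create \
  --assignee "$PRINCIPAL_ID" \
  --role "EventGrid TopicSpaces Publisher" \
  --scope "/subscriptions/<SUBSCRIPTION_ID>/resourceGroups/<RESOURCE_GROUP>/providers/Microsoft.EventGrid/namespaces/<EVENT_GRID_NAMESPACE>"
```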

 # [Portal](#tab/portal)
