Commit f8ea37a
Author: Pat Altimore
Commit message: Add CLI tab
1 parent c48dce6 commit f8ea37a

2 files changed: +267 additions, -33 deletions

articles/iot-operations/connect-to-cloud/howto-configure-fabric-real-time-intelligence.md

Lines changed: 252 additions & 8 deletions
@@ -6,7 +6,7 @@ ms.author: patricka
 ms.service: azure-iot-operations
 ms.subservice: azure-data-flows
 ms.topic: how-to
-ms.date: 06/12/2025
+ms.date: 06/17/2025
 ai-usage: ai-assisted

 #CustomerIntent: As an operator, I want to understand how to configure data flow endpoints for Microsoft Fabric Real-Time Intelligence in Azure IoT Operations so that I can send real-time data to Microsoft Fabric.
@@ -48,16 +48,12 @@ Retrieve the [Kafka-compatible connection details for the custom endpoint](/fabr

 ## Create a Microsoft Fabric Real-Time Intelligence data flow endpoint

-To configure a data flow endpoint for Microsoft Fabric Real-Time Intelligence, you need to use Simple Authentication and Security Layer (SASL) based authentication.
-
-Azure Key Vault is the recommended way to sync the connection string to the Kubernetes cluster so that it can be referenced in the data flow. [Secure settings](../deploy-iot-ops/howto-enable-secure-settings.md) must be enabled to configure this endpoint using the operations experience web UI.
+Microsoft Fabric Real-Time Intelligence supports the Simple Authentication and Security Layer (SASL), system-assigned managed identity, and user-assigned managed identity authentication methods. The following sections describe how to configure a data flow endpoint for Microsoft Fabric Real-Time Intelligence with each of these methods. For details, see [Available authentication methods](#available-authentication-methods).

 # [Operations experience](#tab/portal)

 1. In the IoT Operations experience portal, select the **Data flow endpoints** tab.
 1. Under **Create new data flow endpoint**, select **Microsoft Fabric Real-Time Intelligence** > **New**.
-
-
 1. Enter the following settings for the endpoint.

    :::image type="content" source="media/howto-configure-fabric-real-time-intelligence/event-stream-sasl.png" alt-text="Screenshot using operations experience to create a new Fabric Real-Time Intelligence data flow endpoint.":::
@@ -66,7 +62,7 @@ Azure Key Vault is the recommended way to sync the connection string to the Kube
 | --------------------- | ----------------------------------------------------------------- |
 | Name                  | The name of the data flow endpoint. |
 | Host                  | The hostname of the event stream custom endpoint in the format `*.servicebus.windows.net:9093`. Use the bootstrap server address noted previously. |
-| Authentication method | *SASL* is currently the only supported authentication method. |
+| Authentication method | *SASL* is currently the only supported authentication method in the operations experience. Use the Azure CLI, Bicep, or Kubernetes manifests to configure other authentication methods. |
 | SASL type             | Choose *Plain* |
 | Synced secret name    | Enter a name for the synced secret. A Kubernetes secret with this name is created on the cluster. |

@@ -89,7 +85,7 @@ Azure Key Vault is the recommended way to sync the connection string to the Kube
 Use the [az iot ops dataflow endpoint create fabric-realtime](/cli/azure/iot/ops/dataflow/endpoint/apply#az-iot-ops-dataflow-endpoint-create-fabric-realtime) command to create or replace a Microsoft Fabric Real-Time Intelligence data flow endpoint.

 ```azurecli
-az iot ops dataflow endpoint create fabric-realtime --resource-group <ResourceGroupName> --instance <AioInstanceName> --name <EndpointName> --workspace <WorkspaceName> --host <BootstrapServerAddress>
+az iot ops dataflow endpoint create fabric-realtime --resource-group <ResourceGroupName> --instance <AioInstanceName> --name <EndpointName> --host "<BootstrapServerAddress>"
 ```

 Here's an example command to create or replace a Microsoft Fabric Real-Time Intelligence data flow endpoint named `fabric-realtime-endpoint`:
@@ -185,6 +181,254 @@ kubectl apply -f <FILE>.yaml

 ---

+## Available authentication methods
+
+The following authentication methods are available for Fabric Real-Time Intelligence data flow endpoints.
+
+### System-assigned managed identity
+
+Before you configure the data flow endpoint, assign a role to the Azure IoT Operations managed identity that grants permission to connect to the Kafka broker:
+
+1. In the Azure portal, go to your Azure IoT Operations instance and select **Overview**.
+1. Copy the name of the extension listed after **Azure IoT Operations Arc extension**. For example, *azure-iot-operations-xxxx7*.
+1. Go to the cloud resource you need to grant permissions. For example, go to the Event Hubs namespace > **Access control (IAM)** > **Add role assignment**.
+1. On the **Role** tab, select an appropriate role.
+1. On the **Members** tab, for **Assign access to**, select the **User, group, or service principal** option, then select **+ Select members** and search for the Azure IoT Operations managed identity. For example, *azure-iot-operations-xxxx7*.
+
+Then, configure the data flow endpoint with system-assigned managed identity settings.
+
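The portal role assignment above can also be scripted. A minimal sketch, assuming the *Azure Event Hubs Data Sender* role fits your scenario; the principal ID and scope are placeholders you supply:

```azurecli
az role assignment create \
  --role "Azure Event Hubs Data Sender" \
  --assignee "<ManagedIdentityPrincipalId>" \
  --scope "<EventHubsNamespaceResourceId>"
```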
+# [Operations experience](#tab/portal)
+
+In the operations experience data flow endpoint settings page, select the **Basic** tab, then choose **Authentication method** > **System assigned managed identity**.
+
+# [Azure CLI](#tab/cli)
+
+#### Create or replace
+
+Use the [az iot ops dataflow endpoint create](/cli/azure/iot/ops/dataflow/endpoint/apply#az-iot-ops-dataflow-endpoint-create) command with the `--auth-type` parameter set to `SystemAssignedManagedIdentity` for system-assigned managed identity authentication.
+
+```azurecli
+az iot ops dataflow endpoint create <Command> --auth-type SystemAssignedManagedIdentity --audience <Audience> --resource-group <ResourceGroupName> --instance <AioInstanceName> --name <EndpointName>
+```
+
+#### Create or change
+
+Use the [az iot ops dataflow endpoint apply](/cli/azure/iot/ops/dataflow/endpoint/apply#az-iot-ops-dataflow-endpoint-apply) command with the `--config-file` parameter.
+
+In this example, assume a configuration file with the following content:
+
+```json
+{
+  "endpointType": "Kafka",
+  "kafkaSettings": {
+    "host": "fabricrealtime.servicebus.windows.net:9093",
+    "authentication": {
+      "method": "SystemAssignedManagedIdentity",
+      "systemAssignedManagedIdentitySettings": {}
+    },
+    "tls": {
+      "mode": "Enabled"
+    }
+  }
+}
+```
+
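For example, if the configuration above is saved as `fabric-realtime-endpoint.json` (a hypothetical file name), the apply call looks like:

```azurecli
az iot ops dataflow endpoint apply --resource-group <ResourceGroupName> --instance <AioInstanceName> --name <EndpointName> --config-file fabric-realtime-endpoint.json
```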
+# [Bicep](#tab/bicep)
+
+```bicep
+kafkaSettings: {
+  authentication: {
+    method: 'SystemAssignedManagedIdentity'
+    systemAssignedManagedIdentitySettings: {}
+  }
+}
+```
+
+# [Kubernetes (preview)](#tab/kubernetes)
+
+```yaml
+kafkaSettings:
+  authentication:
+    method: SystemAssignedManagedIdentity
+    systemAssignedManagedIdentitySettings:
+      {}
+```
+
+---
+
+### User-assigned managed identity
+
+To use user-assigned managed identity for authentication, you must first deploy Azure IoT Operations with secure settings enabled. Then you need to [set up a user-assigned managed identity for cloud connections](../deploy-iot-ops/howto-enable-secure-settings.md#set-up-a-user-assigned-managed-identity-for-cloud-connections). To learn more, see [Enable secure settings in Azure IoT Operations deployment](../deploy-iot-ops/howto-enable-secure-settings.md).
+
+Before you configure the data flow endpoint, assign a role to the user-assigned managed identity that grants permission to connect to the Kafka broker:
+
+1. In the Azure portal, go to the cloud resource you need to grant permissions. For example, go to the Event Hubs namespace > **Access control (IAM)** > **Add role assignment**.
+1. On the **Role** tab, select an appropriate role.
+1. On the **Members** tab, for **Assign access to**, select the **Managed identity** option, then select **+ Select members** and search for your user-assigned managed identity.
+
+Then, configure the data flow endpoint with user-assigned managed identity settings.
+
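The client ID and tenant ID that the following configurations require can be read from the identity resource. A hedged sketch with placeholder names:

```azurecli
az identity show --name <IdentityName> --resource-group <ResourceGroupName> --query "{clientId:clientId, tenantId:tenantId}"
```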
+# [Operations experience](#tab/portal)
+
+In the operations experience data flow endpoint settings page, select the **Basic** tab, then choose **Authentication method** > **User assigned managed identity**.
+
+# [Azure CLI](#tab/cli)
+
+#### Create or replace
+
+Use the [az iot ops dataflow endpoint create](/cli/azure/iot/ops/dataflow/endpoint/apply#az-iot-ops-dataflow-endpoint-create) command with the `--auth-type` parameter set to `UserAssignedManagedIdentity` for user-assigned managed identity authentication.
+
+```azurecli
+az iot ops dataflow endpoint create <Command> --auth-type UserAssignedManagedIdentity --client-id <ClientId> --tenant-id <TenantId> --resource-group <ResourceGroupName> --instance <AioInstanceName> --name <EndpointName>
+```
+
+#### Create or change
+
+Use the [az iot ops dataflow endpoint apply](/cli/azure/iot/ops/dataflow/endpoint/apply#az-iot-ops-dataflow-endpoint-apply) command with the `--config-file` parameter.
+
+In this example, assume a configuration file with the following content:
+
+```json
+{
+  "endpointType": "Kafka",
+  "kafkaSettings": {
+    "authentication": {
+      "method": "UserAssignedManagedIdentity",
+      "userAssignedManagedIdentitySettings": {
+        "clientId": "<ID>",
+        "tenantId": "<ID>",
+        // Optional
+        "scope": "https://<Scope_Url>"
+      }
+    }
+  }
+}
+```
+
+# [Bicep](#tab/bicep)
+
+```bicep
+kafkaSettings: {
+  authentication: {
+    method: 'UserAssignedManagedIdentity'
+    userAssignedManagedIdentitySettings: {
+      clientId: '<CLIENT_ID>'
+      tenantId: '<TENANT_ID>'
+      // Optional
+      // scope: 'https://<SCOPE_URL>'
+    }
+  }
+  ...
+}
+```
+
+# [Kubernetes (preview)](#tab/kubernetes)
+
+```yaml
+kafkaSettings:
+  authentication:
+    method: UserAssignedManagedIdentity
+    userAssignedManagedIdentitySettings:
+      clientId: <CLIENT_ID>
+      tenantId: <TENANT_ID>
+      # Optional
+      # scope: https://<SCOPE_URL>
+```
+
+---
+
+### SASL
+
+To use SASL for authentication, specify the SASL authentication method and configure the SASL type and a secret reference with the name of the secret that contains the SASL token.
+
+Azure Key Vault is the recommended way to sync the connection string to the Kubernetes cluster so that it can be referenced in the data flow. [Secure settings](../deploy-iot-ops/howto-enable-secure-settings.md) must be enabled to configure this endpoint using the operations experience web UI.
+
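One way to place the connection string in Key Vault before it's synced to the cluster; the vault and secret names here are illustrative:

```azurecli
az keyvault secret set --vault-name <KeyVaultName> --name fabric-connection-string --value "<ConnectionString>"
```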
+# [Operations experience](#tab/portal)
+
+In the operations experience data flow endpoint settings page, select the **Basic** tab, then choose **Authentication method** > **SASL**.
+
+Enter the following settings for the endpoint:
+
+| Setting                            | Description |
+| ---------------------------------- | ------------------------------------------------------------------------------------------------- |
+| SASL type                          | The type of SASL authentication to use. Supported types are `Plain`, `ScramSha256`, and `ScramSha512`. |
+| Synced secret name                 | The name of the Kubernetes secret that contains the SASL token. |
+| Username reference or token secret | The reference to the username or token secret used for SASL authentication. |
+| Password reference or token secret | The reference to the password or token secret used for SASL authentication. |
+
+# [Azure CLI](#tab/cli)
+
+#### Create or replace
+
+Use the [az iot ops dataflow endpoint create](/cli/azure/iot/ops/dataflow/endpoint/apply#az-iot-ops-dataflow-endpoint-create) command with the `--auth-type` parameter set to `Sasl` for SASL authentication.
+
+```azurecli
+az iot ops dataflow endpoint create <Command> --auth-type Sasl --sasl-type <SaslType> --secret-name <SecretName> --resource-group <ResourceGroupName> --instance <AioInstanceName> --name <EndpointName>
+```
+
+#### Create or change
+
+Use the [az iot ops dataflow endpoint apply](/cli/azure/iot/ops/dataflow/endpoint/apply#az-iot-ops-dataflow-endpoint-apply) command with the `--config-file` parameter.
+
+In this example, assume a configuration file with the following content:
+
+```json
+{
+  "endpointType": "Kafka",
+  "kafkaSettings": {
+    "authentication": {
+      "method": "Sasl",
+      "saslSettings": {
+        "saslType": "<SaslType>",
+        "secretRef": "<SecretName>"
+      }
+    }
+  }
+}
+```
+
+# [Bicep](#tab/bicep)
+
+```bicep
+kafkaSettings: {
+  authentication: {
+    method: 'Sasl'
+    saslSettings: {
+      saslType: 'Plain' // Or ScramSha256, ScramSha512
+      secretRef: '<SECRET_NAME>'
+    }
+  }
+}
+```
+
+# [Kubernetes (preview)](#tab/kubernetes)
+
+```bash
+kubectl create secret generic sasl-secret -n azure-iot-operations \
+  --from-literal=token='<YOUR_SASL_TOKEN>'
+```
+
+```yaml
+kafkaSettings:
+  authentication:
+    method: Sasl
+    saslSettings:
+      saslType: Plain # Or ScramSha256, ScramSha512
+      secretRef: <SECRET_NAME>
+```
+
+---
+
+The supported SASL types are:
+
+- `Plain`
+- `ScramSha256`
+- `ScramSha512`
+
+The secret must be in the same namespace as the Kafka data flow endpoint. The secret must have the SASL token as a key-value pair.
+
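For reference, a Secret that satisfies this key-value requirement might look like the following manifest; the secret name, namespace, and token value are illustrative:

```yaml
apiVersion: v1
kind: Secret
metadata:
  name: sasl-secret
  namespace: azure-iot-operations
type: Opaque
stringData:
  token: "<YOUR_SASL_TOKEN>"
```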
 ## Advanced settings

 The advanced settings for this endpoint are identical to the [advanced settings for Azure Event Hubs endpoints](howto-configure-kafka-endpoint.md#advanced-settings).

articles/iot-operations/connect-to-cloud/howto-configure-kafka-endpoint.md

Lines changed: 15 additions & 25 deletions
@@ -543,18 +543,6 @@ This configuration creates a managed identity with the default audience, which i

 Not supported in the operations experience.

-# [Bicep](#tab/bicep)
-
-```bicep
-kafkaSettings: {
-  authentication: {
-    method: 'SystemAssignedManagedIdentity'
-    systemAssignedManagedIdentitySettings: {
-      audience: '<YOUR_AUDIENCE_OVERRIDE_VALUE>'
-    }
-  }
-}
-```

 # [Azure CLI](#tab/cli)

@@ -572,6 +560,19 @@
   }
 }
 ```

+# [Bicep](#tab/bicep)
+
+```bicep
+kafkaSettings: {
+  authentication: {
+    method: 'SystemAssignedManagedIdentity'
+    systemAssignedManagedIdentitySettings: {
+      audience: '<YOUR_AUDIENCE_OVERRIDE_VALUE>'
+    }
+  }
+}
+```
+
 # [Kubernetes (preview)](#tab/kubernetes)

 ```yaml
@@ -686,19 +687,9 @@ Enter the following settings for the endpoint:
 | Username reference or token secret | The reference to the username or token secret used for SASL authentication. |
 | Password reference of token secret | The reference to the password or token secret used for SASL authentication. |

-# [Bicep](#tab/bicep)
+# [Azure CLI](#tab/cli)

-```bicep
-kafkaSettings: {
-  authentication: {
-    method: 'Sasl' // Or ScramSha256, ScramSha512
-    saslSettings: {
-      saslType: 'Plain' // Or ScramSha256, ScramSha512
-      secretRef: '<SECRET_NAME>'
-    }
-  }
-}
-```
+#### Create or replace

 Use the [az iot ops dataflow endpoint create](/cli/azure/iot/ops/dataflow/endpoint/apply#az-iot-ops-dataflow-endpoint-create) command with the `--auth-type` parameter set to `Sasl` for SASL authentication.

@@ -752,7 +743,6 @@ The supported SASL types are:
 - `ScramSha512`

 The secret must be in the same namespace as the Kafka data flow endpoint. The secret must have the SASL token as a key-value pair.
-<!-- TODO: double check! Provide an example? -->

 ### Anonymous
