
Commit b07241f

Remove portal
1 parent bb3c217 commit b07241f

5 files changed: +5 additions, -269 deletions


articles/iot-operations/connect-to-cloud/howto-configure-adlsv2-endpoint.md

Lines changed: 1 addition & 77 deletions
@@ -5,7 +5,7 @@ author: PatAltimore
 ms.author: patricka
 ms.subservice: azure-data-flows
 ms.topic: how-to
-ms.date: 09/23/2024
+ms.date: 10/02/2024
 ai-usage: ai-assisted
 
 #CustomerIntent: As an operator, I want to understand how to configure dataflow endpoints for Azure Data Lake Storage Gen2 in Azure IoT Operations so that I can send data to Azure Data Lake Storage Gen2.
@@ -27,28 +27,6 @@ To send data to Azure Data Lake Storage Gen2 in Azure IoT Operations Preview, yo
 
 To configure a dataflow endpoint for Azure Data Lake Storage Gen2, we suggest using the managed identity of the Azure Arc-enabled Kubernetes cluster. This approach is secure and eliminates the need for secret management. Alternatively, you can authenticate with the storage account using an access token. When using an access token, you would need to create a Kubernetes secret containing the SAS token.
 
-# [Portal](#tab/portal)
-
-1. In the operations experience portal, select the **Dataflow endpoints** tab.
-1. Under **Create new dataflow endpoint**, select **Azure Data Lake Storage (2nd generation)** > **New**.
-
-:::image type="content" source="media/howto-configure-adlsv2-endpoint/create-adls-endpoint.png" alt-text="Screenshot using operations experience portal to create a new ADLS V2 dataflow endpoint.":::
-
-1. Enter the following settings for the endpoint:
-
-| Setting | Description |
-| --------------------- | ------------------------------------------------------------------------------------------------- |
-| Name | The name of the dataflow endpoint. |
-| Host | The hostname of the Azure Data Lake Storage Gen2 endpoint in the format `<account>.blob.core.windows.net`. Replace the account placeholder with the endpoint account name. |
-| Authentication method | The method used for authentication. Choose *System assigned managed identity*, *User assigned managed identity*, or *Access token*. |
-| Client ID | The client ID of the user-assigned managed identity. Required if using *User assigned managed identity*. |
-| Tenant ID | The tenant ID of the user-assigned managed identity. Required if using *User assigned managed identity*. |
-| Access token secret name | The name of the Kubernetes secret containing the SAS token. Required if using *Access token*. |
-
-1. Select **Apply** to provision the endpoint.
-
-# [Kubernetes](#tab/kubernetes)
-
 ### Use managed identity authentication
 
 1. Get the managed identity of the Azure IoT Operations Preview Arc extension.
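
For context, step 1 above is typically an `az` CLI lookup. A minimal sketch, assuming the extension type value `microsoft.iotoperations` and placeholder resource names:

```bash
# Sketch: find the principal ID of the Azure IoT Operations Arc extension's
# system-assigned managed identity. All names are placeholders.
az k8s-extension list \
  --resource-group <RESOURCE_GROUP> \
  --cluster-name <CLUSTER_NAME> \
  --cluster-type connectedClusters \
  --query "[?extensionType=='microsoft.iotoperations'].identity.principalId" \
  --output tsv
```
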
@@ -94,22 +72,10 @@ If you need to override the system-assigned managed identity audience, see the [
 secretRef: my-sas
 ```
 
----
-
 ## Configure dataflow destination
 
 Once the endpoint is created, you can use it in a dataflow by specifying the endpoint name in the dataflow's destination settings. The following example is a dataflow configuration that uses the MQTT endpoint for the source and Azure Data Lake Storage Gen2 as the destination. The source data is from the MQTT topics `thermostats/+/telemetry/temperature/#` and `humidifiers/+/telemetry/humidity/#`. The destination sends the data to Azure Data Lake Storage table `telemetryTable`.
 
-# [Portal](#tab/portal)
-
-1. In the Azure IoT Operations Preview portal, create a new dataflow or edit an existing dataflow by selecting the **Dataflows** tab on the left. If creating a new dataflow, select a source for the dataflow.
-1. In the editor, select the destination dataflow endpoint.
-1. Choose the Azure Data Lake Storage Gen2 endpoint that you created previously.
-
-:::image type="content" source="media/howto-configure-adlsv2-endpoint/dataflow-mq-adls.png" alt-text="Screenshot using operations experience portal to create a dataflow with an MQTT source and ADLS V2 destination.":::
-
-# [Kubernetes](#tab/kubernetes)
-
 ```yaml
 apiVersion: connectivity.iotoperations.azure.com/v1beta1
 kind: Dataflow
@@ -132,8 +98,6 @@ spec:
 dataDestination: telemetryTable
 ```
 
----
-
 For more information about dataflow destination settings, see [Create a dataflow](howto-create-dataflow.md).
 
 > [!NOTE]
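
Piecing together the YAML fragments in the hunks above, the complete Dataflow resource behind this example would look roughly like the following sketch. The resource name, `profileRef`, and the endpoint names (`default` for the MQTT broker, `adls` for the storage endpoint) are assumptions, not values taken from the diff:

```bash
# Sketch: apply the MQTT-to-ADLS dataflow described in the surrounding text.
kubectl apply -f - <<'EOF'
apiVersion: connectivity.iotoperations.azure.com/v1beta1
kind: Dataflow
metadata:
  name: mqtt-to-adls              # assumed name
  namespace: azure-iot-operations
spec:
  profileRef: default             # assumed dataflow profile
  operations:
    - operationType: Source
      sourceSettings:
        endpointRef: default      # assumed MQTT broker endpoint name
        dataSources:
          - thermostats/+/telemetry/temperature/#
          - humidifiers/+/telemetry/humidity/#
    - operationType: Destination
      destinationSettings:
        endpointRef: adls         # assumed ADLS Gen2 endpoint name
        dataDestination: telemetryTable
EOF
```
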
@@ -151,14 +115,6 @@ Using the system-assigned managed identity is the recommended authentication met
 
 Before creating the dataflow endpoint, assign a role to the managed identity that has write permission to the storage account. For example, you can assign the *Storage Blob Data Contributor* role. To learn more about assigning roles to blobs, see [Authorize access to blobs using Microsoft Entra ID](../../storage/blobs/authorize-access-azure-active-directory.md).
 
-# [Portal](#tab/portal)
-
-In the operations experience portal dataflow endpoint settings page, select the **Basic** tab then choose **Authentication method** > **System assigned managed identity**.
-
-In most cases, you don't need to specify a service audience. Not specifying an audience creates a managed identity with the default audience scoped to your storage account.
-
-# [Kubernetes](#tab/kubernetes)
-
 In the *DataflowEndpoint* resource, specify the managed identity authentication method. In most cases, you don't need to specify other settings. Not specifying an audience creates a managed identity with the default audience scoped to your storage account.
 
 ```yaml
@@ -178,8 +134,6 @@ datalakeStorageSettings:
 audience: https://<account>.blob.core.windows.net
 ```
 
----
-
 #### Access token
 
 Using an access token is an alternative authentication method. This method requires you to create a Kubernetes secret with the SAS token and reference the secret in the *DataflowEndpoint* resource.
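
For the access-token path, generating the SAS and wrapping it in the secret might look like the following sketch. The SAS parameters, expiry, and the secret key name `accessToken` are assumptions; the doc's own `kubectl create secret` command is visible in the next hunk:

```bash
# Sketch: create an account-level SAS for the Blob service and store it in a
# Kubernetes secret. Permissions, expiry, and the key name are illustrative.
SAS=$(az storage account generate-sas \
  --account-name <ACCOUNT> \
  --services b \
  --resource-types co \
  --permissions acdlrw \
  --expiry '<YYYY-MM-DDTHH:MMZ>' \
  --output tsv)

kubectl create secret generic my-sas \
  --from-literal=accessToken="$SAS" \
  -n azure-iot-operations
```
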
@@ -202,14 +156,6 @@ kubectl create secret generic my-sas \
 -n azure-iot-operations
 ```
 
-# [Portal](#tab/portal)
-
-In the operations experience portal dataflow endpoint settings page, select the **Basic** tab then choose **Authentication method** > **Access token**.
-
-Enter the access token secret name you created in **Access token secret name**.
-
-# [Kubernetes](#tab/kubernetes)
-
 Create the *DataflowEndpoint* resource with the secret reference.
 
 ```yaml
@@ -220,18 +166,8 @@ datalakeStorageSettings:
 secretRef: my-sas
 ```
 
----
-
 #### User-assigned managed identity
 
-# [Portal](#tab/portal)
-
-In the operations experience portal dataflow endpoint settings page, select the **Basic** tab then choose **Authentication method** > **User assigned managed identity**.
-
-Enter the user assigned managed identity client ID and tenant ID in the appropriate fields.
-
-# [Kubernetes](#tab/kubernetes)
-
 To use a user-assigned managed identity, specify the `UserAssignedManagedIdentity` authentication method and provide the `clientId` and `tenantId` of the managed identity.
 
 ```yaml
@@ -243,8 +179,6 @@ datalakeStorageSettings:
 tenantId: <ID>
 ```
 
----
-
 ## Advanced settings
 
 You can set advanced settings for the Azure Data Lake Storage Gen2 endpoint, such as the batching latency and message count.
@@ -258,14 +192,6 @@ Use the `batching` settings to configure the maximum number of messages and the
 
 For example, to configure the maximum number of messages to 1000 and the maximum latency to 100 seconds, use the following settings:
 
-# [Portal](#tab/portal)
-
-In the operations experience portal, select the **Advanced** tab for the dataflow endpoint.
-
-:::image type="content" source="media/howto-configure-adlsv2-endpoint/adls-advanced.png" alt-text="Screenshot using operations experience portal to set ADLS V2 advanced settings.":::
-
-# [Kubernetes](#tab/kubernetes)
-
 Set the values in the dataflow endpoint custom resource.
 
 ```yaml
@@ -274,5 +200,3 @@ datalakeStorageSettings:
 latencySeconds: 100
 maxMessages: 1000
 ```
-
----
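
The *Storage Blob Data Contributor* assignment mentioned in this file's diff is a one-time step done outside the cluster. A minimal sketch with the Azure CLI, assuming the principal ID retrieved earlier and placeholder resource IDs:

```bash
# Sketch: grant the extension's managed identity write access to the account.
az role assignment create \
  --assignee <PRINCIPAL_ID> \
  --role "Storage Blob Data Contributor" \
  --scope "/subscriptions/<SUB_ID>/resourceGroups/<RESOURCE_GROUP>/providers/Microsoft.Storage/storageAccounts/<ACCOUNT>"
```
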

articles/iot-operations/connect-to-cloud/howto-configure-adx-endpoint.md

Lines changed: 0 additions & 66 deletions
@@ -56,25 +56,6 @@ To send data to Azure Data Explorer in Azure IoT Operations Preview, you can con
 
 Create the dataflow endpoint resource with your cluster and database information. We suggest using the managed identity of the Azure Arc-enabled Kubernetes cluster. This approach is secure and eliminates the need for secret management.
 
-# [Portal](#tab/portal)
-
-1. In the operations experience portal, select the **Dataflow endpoints** tab.
-1. Under **Create new dataflow endpoint**, select **Azure Data Explorer** > **New**.
-
-:::image type="content" source="media/howto-configure-adx-endpoint/create-adx-endpoint.png" alt-text="Screenshot using operations experience portal to create an Azure Data Explorer dataflow endpoint.":::
-
-1. Enter the following settings for the endpoint:
-
-| Setting | Description |
-| --------------------- | ------------------------------------------------------------------------------------------------- |
-| Name | The name of the dataflow endpoint. |
-| Host | The hostname of the Azure Data Explorer endpoint in the format `<cluster>.<region>.kusto.windows.net`. |
-| Authentication method | The method used for authentication. Choose *System assigned managed identity* or *User assigned managed identity* |
-| Client ID | The client ID of the user-assigned managed identity. Required if using *User assigned managed identity*. |
-| Tenant ID | The tenant ID of the user-assigned managed identity. Required if using *User assigned managed identity*. |
-
-# [Kubernetes](#tab/kubernetes)
-
 ```yaml
 apiVersion: connectivity.iotoperations.azure.com/v1beta1
 kind: DataflowEndpoint
@@ -91,25 +72,10 @@ spec:
 systemAssignedManagedIdentitySettings: {}
 ```
 
----
-
 ## Configure dataflow destination
 
 Once the endpoint is created, you can use it in a dataflow by specifying the endpoint name in the dataflow's destination settings.
 
-# [Portal](#tab/portal)
-
-1. In the Azure IoT Operations Preview portal, create a new dataflow or edit an existing dataflow by selecting the **Dataflows** tab on the left. If creating a new dataflow, select a source for the dataflow.
-1. In the editor, select the destination dataflow endpoint.
-1. Choose the Azure Data Explorer endpoint that you created previously.
-
-:::image type="content" source="media/howto-configure-adx-endpoint/dataflow-mq-adx.png" alt-text="Screenshot using operations experience portal to create a dataflow with an MQTT source and a Azure Data Explorer destination.":::
-
-1. Specify an output schema for the data. The schema must match the table schema in Azure Data Explorer. You can select an existing schema or upload a new schema to the schema registry.
-1. Select **Apply** to provision the dataflow.
-
-# [Kubernetes](#tab/kubernetes)
-
 ```yaml
 apiVersion: connectivity.iotoperations.azure.com/v1beta1
 kind: Dataflow
@@ -132,8 +98,6 @@ spec:
 dataDestination: database-name
 ```
 
----
-
 For more information about dataflow destination settings, see [Create a dataflow](howto-create-dataflow.md).
 
 > [!NOTE]
@@ -151,14 +115,6 @@ Using the system-assigned managed identity is the recommended authentication met
 
 Before you create the dataflow endpoint, assign a role to the managed identity that grants permission to write to the Azure Data Explorer database. For more information on adding permissions, see [Manage Azure Data Explorer cluster permissions](/azure/data-explorer/manage-cluster-permissions).
 
-# [Portal](#tab/portal)
-
-In the operations experience portal dataflow endpoint settings page, select the **Basic** tab then choose **Authentication method** > **System assigned managed identity**.
-
-In most cases, you don't need to specify a service audience. Not specifying an audience creates a managed identity with the default audience scoped to your storage account.
-
-# [Kubernetes](#tab/kubernetes)
-
 In the *DataflowEndpoint* resource, specify the managed identity authentication method. In most cases, you don't need to specify other settings. This configuration creates a managed identity with the default audience `https://api.kusto.windows.net`.
 
 ```yaml
@@ -178,18 +134,8 @@ dataExplorerSettings:
 audience: https://<audience URL>
 ```
 
----
-
 #### User-assigned managed identity
 
-# [Portal](#tab/portal)
-
-In the operations experience portal dataflow endpoint settings page, select the **Basic** tab then choose **Authentication method** > **User assigned managed identity**.
-
-Enter the user assigned managed identity client ID and tenant ID in the appropriate fields.
-
-# [Kubernetes](#tab/kubernetes)
-
 To use a user-assigned managed identity, specify the `UserAssignedManagedIdentity` authentication method and provide the `clientId` and `tenantId` of the managed identity.
 
 ```yaml
@@ -201,8 +147,6 @@ dataExplorerSettings:
 tenantId: <ID>
 ```
 
----
-
 ## Advanced settings
 
 You can set advanced settings for the Azure Data Explorer endpoint, such as the batching latency and message count.
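
Because these settings live on the endpoint custom resource, they can also be tuned in place on a running cluster. A sketch with `kubectl patch`, assuming an endpoint named `adx` in the `azure-iot-operations` namespace:

```bash
# Sketch: set batching limits on an existing Azure Data Explorer endpoint.
kubectl patch dataflowendpoint adx -n azure-iot-operations --type merge \
  -p '{"spec":{"dataExplorerSettings":{"batching":{"latencySeconds":100,"maxMessages":1000}}}}'
```
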
@@ -216,14 +160,6 @@ Use the `batching` settings to configure the maximum number of messages and the
 
 For example, to configure the maximum number of messages to 1000 and the maximum latency to 100 seconds, use the following settings:
 
-# [Portal](#tab/portal)
-
-In the operations experience portal, select the **Advanced** tab for the dataflow endpoint.
-
-:::image type="content" source="media/howto-configure-adx-endpoint/adx-advanced.png" alt-text="Screenshot using operations experience portal to set Azure Data Explorer advanced settings.":::
-
-# [Kubernetes](#tab/kubernetes)
-
 Set the values in the dataflow endpoint custom resource.
 
 ```yaml
@@ -232,5 +168,3 @@ dataExplorerSettings:
 latencySeconds: 100
 maxMessages: 1000
 ```
-
----
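
The database write permission mentioned in this file's diff can be granted from the Azure CLI. A sketch using the `kusto` extension, where the `Ingestor` role, the `App` principal type for the managed identity, and all names are assumptions; check the linked permissions article for the exact form your setup needs:

```bash
# Sketch: add the managed identity as a database principal on the target
# Azure Data Explorer database. Requires the az "kusto" extension.
az kusto database-principal-assignment create \
  --cluster-name <CLUSTER> \
  --database-name <DATABASE> \
  --resource-group <RESOURCE_GROUP> \
  --principal-assignment-name aio-dataflow \
  --principal-id <PRINCIPAL_ID> \
  --principal-type App \
  --role Ingestor
```
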

articles/iot-operations/connect-to-cloud/howto-configure-dataflow-endpoint.md

Lines changed: 2 additions & 35 deletions
@@ -38,7 +38,7 @@ For example, you can use the default MQTT broker dataflow endpoint. You can use
 
 # [Portal](#tab/portal)
 
-:::image type="content" source="media/howto-configure-dataflow-endpoint/create-dataflow-mq-mq.png" alt-text="Screenshot using operations experience portalportal to create a dataflow from MQTT to MQTT.":::
+:::image type="content" source="media/howto-configure-dataflow-endpoint/create-dataflow-mq-mq.png" alt-text="Screenshot using operations experience portal to create a dataflow from MQTT to MQTT.":::
 
 # [Kubernetes](#tab/kubernetes)
 
@@ -94,40 +94,7 @@ spec:
 
 ---
 
-Similar to the MQTT example, you can create multiple dataflows that use the same Kafka endpoint for different topics, or the same Data Lake endpoint for different tables, and so on.
+Similar to the MQTT example, you can create multiple dataflows that use the same Kafka endpoint for different topics, or the same Data Lake endpoint for different tables.
 
-## Manage dataflow endpoints
-
-You can manage dataflow endpoints in the operations experience portal or by using the Kubernetes CLI.
-
-:::image type="content" source="media/howto-configure-dataflow-endpoint/manage-dataflow-endpoints.png" alt-text="Screenshot using operations experience portal to view dataflow endpoint list.":::
-
-
-### View
-
-You can view the health, metrics, configuration, and associated dataflows of an endpoint in the operations experience portal.
-
-
-<!-- TODO: link to relevant observability docs -->
-
-### Edit
-
-You can edit an endpoint in the operations experience portal. Be cautious if the endpoint is in use by a dataflow.
-
-:::image type="content" source="media/howto-configure-dataflow-endpoint/edit-dataflow-endpoint.png" alt-text="Screenshot using operations experience portal to modify a dataflow":::
-
-### Delete
-
-You can delete an endpoint in the operations experience portal or using the `kubectl` command. Be cautious if the endpoint is in use by a dataflow.
-
-# [Portal](#tab/portal)
-
-:::image type="content" source="media/howto-configure-dataflow-endpoint/delete-dataflow-endpoint.png" alt-text="Screenshot using operations experience portal to delete a dataflow endpoint.":::
-
-# [Kubernetes](#tab/kubernetes)
-
-```bash
-kubectl delete dataflowendpoint my-endpoint
-```
 
 ---
