Commit 68db5db

operations experience naming consistency
1 parent 35f5adc commit 68db5db

10 files changed: +18 −18

articles/iot-operations/connect-to-cloud/concept-schema-registry.md

Lines changed: 3 additions & 3 deletions
@@ -99,7 +99,7 @@ Asset sources have a predefined message schema that was created by the connector
 
 Schemas can be uploaded for message broker sources. Currently, Azure IoT Operations supports JSON for source schemas, also known as input schemas. In the operations experience, you can select an existing schema or upload one while defining a message broker source:
 
-:::image type="content" source="./media/concept-schema-registry/upload-schema.png" alt-text="Screenshot that shows uploading a message schema in the operations experience portal.":::
+:::image type="content" source="./media/concept-schema-registry/upload-schema.png" alt-text="Screenshot that shows uploading a message schema in the operations experience web UI.":::
 
 ### Transformation
 
@@ -109,7 +109,7 @@ The operations experience uses the input schema as a starting point for your dat
 
 Output schemas are associated with data flow destinations.
 
-In the operations experience portal, you can configure output schemas for the following destination endpoints that support Parquet output:
+In the operations experience web UI, you can configure output schemas for the following destination endpoints that support Parquet output:
 
 * local storage
 * Fabric OneLake
@@ -126,7 +126,7 @@ To upload an output schema, see [Upload schema](#upload-schema).
 
 ## Upload schema
 
-Input schema can be uploaded in the operations experience portal as described in the [Input schema](#input-schema) section of this article. You can also upload a schema using the Azure CLI or a Bicep template.
+Input schema can be uploaded in the operations experience web UI as described in the [Input schema](#input-schema) section of this article. You can also upload a schema using the Azure CLI or a Bicep template.
 
 ### Upload schema with the CLI
 
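The hunk above notes that, besides the operations experience web UI, an input schema can be registered with the Azure CLI or a Bicep template. As a minimal Bicep sketch of the template route: the resource type path, API version, names, and property values below are illustrative assumptions, not taken from the commit.

```bicep
// Hypothetical sketch: registry name, API version, and schema content are assumptions.
param registryName string = 'myschemaregistry'

resource schemaRegistry 'Microsoft.DeviceRegistry/schemaRegistries@2024-09-01-preview' existing = {
  name: registryName
}

// Register a JSON message schema (JSON is the supported input-schema format).
resource messageSchema 'Microsoft.DeviceRegistry/schemaRegistries/schemas@2024-09-01-preview' = {
  parent: schemaRegistry
  name: 'thermostat-input'
  properties: {
    format: 'JsonSchema/draft-07'
    schemaType: 'MessageSchema'
  }
}

// Schema content is versioned; each version carries the actual JSON Schema document.
resource schemaVersion 'Microsoft.DeviceRegistry/schemaRegistries/schemas/schemaVersions@2024-09-01-preview' = {
  parent: messageSchema
  name: '1'
  properties: {
    schemaContent: '{"$schema": "http://json-schema.org/draft-07/schema#", "type": "object", "properties": {"temperature": {"type": "number"}}}'
  }
}
```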

articles/iot-operations/connect-to-cloud/howto-configure-adlsv2-endpoint.md

Lines changed: 2 additions & 2 deletions
@@ -138,7 +138,7 @@ Then, create the *DataflowEndpoint* resource and specify the access token authen
 
 # [Portal](#tab/portal)
 
-See the [access token](#access-token) section for steps to create a secret in the operations experience portal.
+See the [access token](#access-token) section for steps to create a secret in the operations experience web UI.
 
 # [Bicep](#tab/bicep)
 
@@ -355,7 +355,7 @@ To enhance security and follow the principle of least privilege, you can generat
 # [Portal](#tab/portal)
 
 > [!IMPORTANT]
-> To use the operations experience portal to manage secrets, Azure IoT Operations must first be enabled with secure settings by configuring an Azure Key Vault and enabling workload identities. To learn more, see [Enable secure settings in Azure IoT Operations deployment](../deploy-iot-ops/howto-enable-secure-settings.md).
+> To use the operations experience web UI to manage secrets, Azure IoT Operations must first be enabled with secure settings by configuring an Azure Key Vault and enabling workload identities. To learn more, see [Enable secure settings in Azure IoT Operations deployment](../deploy-iot-ops/howto-enable-secure-settings.md).
 
 In the operations experience data flow endpoint settings page, select the **Basic** tab then choose **Authentication method** > **Access token**.
 

articles/iot-operations/connect-to-cloud/howto-configure-dataflow-profile.md

Lines changed: 1 addition & 1 deletion
@@ -24,7 +24,7 @@ The most important setting is the instance count, which determines the number of
 By default, a data flow profile named *default* is created when Azure IoT Operations is deployed. This data flow profile has a single instance count. You can use this data flow profile to get started with Azure IoT Operations.
 
 > [!IMPORTANT]
-> Currently, the default data flow profile is the only profile supported by the [operations experience portal](https://iotoperations.azure.com/). All data flows created using the operations experience portal use the default data flow profile.
+> Currently, the default data flow profile is the only profile supported by the [operations experience web UI](https://iotoperations.azure.com/). All data flows created using the operations experience use the default data flow profile.
 
 # [Bicep](#tab/bicep)
 

articles/iot-operations/connect-to-cloud/howto-configure-fabric-real-time-intelligence.md

Lines changed: 1 addition & 1 deletion
@@ -45,7 +45,7 @@ The connection string with the primary key.
 
 To configure a data flow endpoint for Microsoft Fabric Real-Time Intelligence, you need to use Simple Authentication and Security Layer (SASL) based authentication.
 
-Azure Key Vault is the recommended way to sync the connection string to the Kubernetes cluster so that it can be referenced in the data flow. [Secure settings](../deploy-iot-ops/howto-enable-secure-settings.md) must be enabled to configure this endpoint using the operations experience Portal.
+Azure Key Vault is the recommended way to sync the connection string to the Kubernetes cluster so that it can be referenced in the data flow. [Secure settings](../deploy-iot-ops/howto-enable-secure-settings.md) must be enabled to configure this endpoint using the operations experience web UI.
 
 # [Portal](#tab/portal)
 

articles/iot-operations/connect-to-cloud/howto-configure-kafka-endpoint.md

Lines changed: 1 addition & 1 deletion
@@ -151,7 +151,7 @@ kubectl apply -f <FILE>.yaml
 # [Portal](#tab/portal)
 
 > [!IMPORTANT]
-> To use the operations experience portal to manage secrets, Azure IoT Operations must first be enabled with secure settings by configuring an Azure Key Vault and enabling workload identities. To learn more, see [Enable secure settings in Azure IoT Operations deployment](../deploy-iot-ops/howto-enable-secure-settings.md).
+> To use the operations experience web UI to manage secrets, Azure IoT Operations must first be enabled with secure settings by configuring an Azure Key Vault and enabling workload identities. To learn more, see [Enable secure settings in Azure IoT Operations deployment](../deploy-iot-ops/howto-enable-secure-settings.md).
 
 In the operations experience data flow endpoint settings page, select the **Basic** tab then choose **Authentication method** > **SASL**.
 

articles/iot-operations/connect-to-cloud/howto-configure-mqtt-endpoint.md

Lines changed: 2 additions & 2 deletions
@@ -598,10 +598,10 @@ Before configuring the data flow endpoint, create a secret with the certificate
 # [Portal](#tab/portal)
 
 > [!IMPORTANT]
-> To use the operations experience portal to manage secrets, Azure IoT Operations must first be enabled with secure settings by configuring an Azure Key Vault and enabling workload identities. To learn more, see [Enable secure settings in Azure IoT Operations deployment](../deploy-iot-ops/howto-enable-secure-settings.md).
+> To use the operations experience web UI to manage secrets, Azure IoT Operations must first be enabled with secure settings by configuring an Azure Key Vault and enabling workload identities. To learn more, see [Enable secure settings in Azure IoT Operations deployment](../deploy-iot-ops/howto-enable-secure-settings.md).
 
 > [!IMPORTANT]
-> The operations experience portal currently has a known issue where creating an X.509 secret results in a secret with incorrectly encoded data. To learn more and the workaround, see [known issues](../troubleshoot/known-issues.md).
+> The operations experience web UI currently has a known issue where creating an X.509 secret results in a secret with incorrectly encoded data. To learn more and the workaround, see [known issues](../troubleshoot/known-issues.md).
 
 In the operations experience data flow endpoint settings page, select the **Basic** tab then choose **Authentication method** > **X509 certificate**.
 

articles/iot-operations/connect-to-cloud/howto-create-dataflow.md

Lines changed: 2 additions & 2 deletions
@@ -16,7 +16,7 @@ ai-usage: ai-assisted
 
 [!INCLUDE [kubernetes-management-preview-note](../includes/kubernetes-management-preview-note.md)]
 
-A data flow is the path that data takes from the source to the destination with optional transformations. You can configure the data flow by creating a *Data flow* custom resource or using the Azure IoT Operations Studio portal. A data flow is made up of three parts: the **source**, the **transformation**, and the **destination**.
+A data flow is the path that data takes from the source to the destination with optional transformations. You can configure the data flow by creating a *Data flow* custom resource or using the operations experience web UI. A data flow is made up of three parts: the **source**, the **transformation**, and the **destination**.
 
 <!--
 ```mermaid
@@ -447,7 +447,7 @@ sourceSettings:
 
 ### Specify source schema
 
-When using MQTT or Kafka as the source, you can specify a [schema](concept-schema-registry.md) to display the list of data points in the operations experience portal. Using a schema to deserialize and validate incoming messages [isn't currently supported](../troubleshoot/known-issues.md#data-flows).
+When using MQTT or Kafka as the source, you can specify a [schema](concept-schema-registry.md) to display the list of data points in the operations experience web UI. Using a schema to deserialize and validate incoming messages [isn't currently supported](../troubleshoot/known-issues.md#data-flows).
 
 If the source is an asset, the schema is automatically inferred from the asset definition.
 

articles/iot-operations/end-to-end-tutorials/tutorial-upload-telemetry-to-cloud.md

Lines changed: 2 additions & 2 deletions
@@ -98,15 +98,15 @@ az role assignment create --role "Azure Event Hubs Data Sender" --assignee $PRIN
 
 ## Create a data flow to send telemetry to an event hub
 
-Use the operations experience UI to create and configure a data flow in your cluster that:
+Use the operations experience web UI to create and configure a data flow in your cluster that:
 
 - Renames the `temperature` field in the incoming message to `TemperatureF`.
 - Adds a field called `AssetId` that contains the name of the asset.
 - Forwards the transformed messages from the MQTT topic to the event hub you created.
 
 To create the data flow:
 
-1. Browse to the operations experience UI and locate your instance. Then select **Data flow endpoints** and select **+ New** in the **Azure Event Hubs** tile:
+1. Browse to the operations experience web UI and locate your instance. Then select **Data flow endpoints** and select **+ New** in the **Azure Event Hubs** tile:
 
 :::image type="content" source="media/tutorial-upload-telemetry-to-cloud/new-event-hubs-endpoint.png" alt-text="Screenshot of the data flow endpoints page.":::
 

articles/iot-operations/secure-iot-ops/howto-manage-secrets.md

Lines changed: 1 addition & 1 deletion
@@ -50,6 +50,6 @@ You can use **Manage secrets** for asset endpoints and data flow endpoints to ma
 You can delete synced secrets as well in manage secrets. When you delete a synced secret, it only deletes the synced secret from the edge, and doesn't delete the contained secret reference from key vault.
 
 > [!WARNING]
-> Directly editing **SecretProviderClass** and **SecretSync** custom resources in your Kubernetes cluster can break the secrets flow in Azure IoT Operations. For any operations related to secrets, use the operations experience UI.
+> Directly editing **SecretProviderClass** and **SecretSync** custom resources in your Kubernetes cluster can break the secrets flow in Azure IoT Operations. For any operations related to secrets, use the operations experience web UI.
 >
 > Before deleting a synced secret, make sure that all references to the secret from Azure IoT Operations components are removed.

articles/iot-operations/troubleshoot/known-issues.md

Lines changed: 3 additions & 3 deletions
@@ -111,14 +111,14 @@ kubectl delete pod aio-opc-opc.tcp-1-f95d76c54-w9v9c -n azure-iot-operations
 
 ## Data flows
 
-- Data flow custom resources created in your cluster aren't visible in the operations experience UI. This is expected because [managing Azure IoT Operations components using Kubernetes is in preview](../deploy-iot-ops/howto-manage-update-uninstall.md#preview-manage-components-using-kubernetes-deployment-manifests), and synchronizing resources from the edge to the cloud isn't currently supported.
+- Data flow custom resources created in your cluster aren't visible in the operations experience web UI. This is expected because [managing Azure IoT Operations components using Kubernetes is in preview](../deploy-iot-ops/howto-manage-update-uninstall.md#preview-manage-components-using-kubernetes-deployment-manifests), and synchronizing resources from the edge to the cloud isn't currently supported.
 
 - X.509 authentication for custom Kafka endpoints isn't supported yet.
 
-- Deserializing and validating messages using a schema isn't supported yet. Specifying a schema in the source configuration only allows the operations experience portal to display the list of data points, but the data points aren't validated against the schema.
+- Deserializing and validating messages using a schema isn't supported yet. Specifying a schema in the source configuration only allows the operations experience to display the list of data points, but the data points aren't validated against the schema.
 
 <!-- TODO: double check -->
-- Creating an X.509 secret in the operations experience portal results in a secret with incorrectly encoded data. To work around this issue, create the [multi-line secrets through Azure Key Vault](/azure/key-vault/secrets/multiline-secrets), then select it from the list of secrets in the operations experience portal.
+- Creating an X.509 secret in the operations experience results in a secret with incorrectly encoded data. To work around this issue, create the [multi-line secrets through Azure Key Vault](/azure/key-vault/secrets/multiline-secrets), then select it from the list of secrets in the operations experience.
 
 - When connecting multiple IoT Operations instances to the same Event Grid MQTT namespace, connection failures may occur due to client ID conflicts. Client IDs are currently derived from data flow resource names, and when using Infrastructure as Code (IaC) patterns for deployment, the generated client IDs may be identical. As a temporary workaround, add randomness to the data flow names in your deployment templates.
 
