
Commit aba79d0

Manual Acrolinx improvements
1 parent 7d1a84b commit aba79d0

4 files changed: +14 -14 lines changed

articles/digital-twins/concepts-ontologies-adopt.md

Lines changed: 2 additions & 2 deletions
@@ -25,8 +25,8 @@ Microsoft partnered with domain experts to create DTDL model sets based on indus
  | --- | --- | --- | --- |
  | Smart buildings | [Digital Twins Definition Language-based RealEstateCore ontology for smart buildings](https://github.com/Azure/opendigitaltwins-building) | Microsoft partnered with [RealEstateCore](https://www.realestatecore.io/) to deliver this open-source DTDL ontology for the real estate industry. [RealEstateCore](https://www.realestatecore.io/) is a consortium of real estate owners, software vendors, and research institutions.<br><br>This smart buildings ontology provides common ground for modeling smart buildings, using industry standards (like [BRICK Schema](https://ontology.brickschema.org/) or [W3C Building Topology Ontology](https://w3c-lbd-cg.github.io/bot/index.html)) to avoid reinvention. The ontology also comes with best practices for how to consume and properly extend it. | To learn more about the partnership with RealEstateCore and goals for this initiative, see the following blog post and embedded video: [RealEstateCore, a smart building ontology for digital twins, is now available](https://techcommunity.microsoft.com/t5/internet-of-things/realestatecore-a-smart-building-ontology-for-digital-twins-is/ba-p/1914794). |
  | Smart cities | [Digital Twins Definition Language (DTDL) ontology for Smart Cities](https://github.com/Azure/opendigitaltwins-smartcities) | Microsoft collaborated with [Open Agile Smart Cities (OASC)](https://oascities.org/) and [Sirus](https://sirus.be/) to provide a DTDL-based ontology for smart cities, starting with [ETSI CIM NGSI-LD](https://www.etsi.org/committee/cim). | To learn more about the partnerships and approach for smart cities, see the following blog post and embedded video: [Smart Cities Ontology for Digital Twins](https://techcommunity.microsoft.com/t5/internet-of-things/smart-cities-ontology-for-digital-twins/ba-p/2166585). |
- | Energy grids | [Digital Twins Definition Language (DTDL) ontology for Energy Grid](https://github.com/Azure/opendigitaltwins-energygrid/) | This ontology helps solution providers accelerate development of digital twin solutions for energy use cases like monitoring grid assets, outage and impact analysis, simulation, and predictive maintenance. Additionally, the ontology can be used to enable the digital transformation and modernization of the energy grid. It's adapted from the [Common Information Model (CIM)](https://cimug.ucaiug.org/), a global standard for energy grid assets management, power system operations modeling, and physical energy commodity market. | To learn more about the partnerships and approach for energy grids, see the following blog post: [Energy Grid Ontology for Digital Twins](https://techcommunity.microsoft.com/t5/internet-of-things/energy-grid-ontology-for-digital-twins-is-now-available/ba-p/2325134). |
- | Manufacturing | [Manufacturing Ontologies](https://github.com/digitaltwinconsortium/ManufacturingOntologies) | These ontologies help solution providers accelerate development of digital twin solutions for manufacturing use cases like asset condition monitoring, simulation, OEE calculation, and predictive maintenance. Additionally, the ontologies can be used to enable the digital transformation and modernization of factories and plants. They're adapted from [OPC UA](https://opcfoundation.org), [ISA95](https://www.isa.org/standards-and-publications/isa-standards/isa-standards-committees/isa95) and the [Asset Administration Shell](https://reference.opcfoundation.org/I4AAS/v100/docs/4.1), three global standards widely used in the manufacturing space. | Visit the repository to learn more about this ontology and explore a sample solution for ingesting OPC UA data into Azure Digital Twins. |
+ | Energy grids | [Digital Twins Definition Language (DTDL) ontology for Energy Grid](https://github.com/Azure/opendigitaltwins-energygrid/) | This ontology helps solution providers accelerate development of digital twin solutions for energy use cases like monitoring grid assets, outage and impact analysis, simulation, and predictive maintenance. Additionally, the ontology can be used to enable the digital transformation and modernization of the energy grid. It's adapted from the [Common Information Model (CIM)](https://cimug.ucaiug.org/), which is a global standard for energy grid assets management, power system operations modeling, and physical energy commodity market. | To learn more about the partnerships and approach for energy grids, see the following blog post: [Energy Grid Ontology for Digital Twins](https://techcommunity.microsoft.com/t5/internet-of-things/energy-grid-ontology-for-digital-twins-is-now-available/ba-p/2325134). |
+ | Manufacturing | [Manufacturing Ontologies](https://github.com/digitaltwinconsortium/ManufacturingOntologies) | These ontologies help solution providers accelerate development of digital twin solutions for manufacturing use cases like asset condition monitoring, simulation, OEE calculation, and predictive maintenance. Additionally, the ontologies can be used to enable the digital transformation and modernization of factories and plants. They're adapted from [OPC UA](https://opcfoundation.org), [ISA95](https://www.isa.org/standards-and-publications/isa-standards/isa-standards-committees/isa95), and the [Asset Administration Shell](https://reference.opcfoundation.org/I4AAS/v100/docs/4.1), three global standards widely used in the manufacturing space. | Visit the repository to learn more about this ontology and explore a sample solution for ingesting OPC UA data into Azure Digital Twins. |

  ## Next steps
articles/digital-twins/how-to-authenticate-client.md

Lines changed: 1 addition & 1 deletion
@@ -102,7 +102,7 @@ Here's an example of the code to create an authenticated SDK client using `Inter
  :::code language="csharp" source="~/digital-twins-docs-samples/sdks/csharp/authentication.cs" id="InteractiveBrowserCredential":::

  >[!NOTE]
- > While you can place the client ID, tenant ID, and instance URL directly into the code as shown in the previous example, it's a good idea to have your code get these values from a configuration file or environment variable instead.
+ > Although the example above places the client ID, tenant ID, and instance URL directly into the code, it's a good idea to get these values from a configuration file or environment variable instead.
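For illustration only (this sketch isn't part of the commit), reading those values from environment variables might look like the following. The variable names are hypothetical:

```csharp
// A minimal sketch, assuming hypothetical environment variable names
// (AZURE_TENANT_ID, AZURE_CLIENT_ID, ADT_INSTANCE_URL).
using System;
using Azure.DigitalTwins.Core;
using Azure.Identity;

string tenantId = Environment.GetEnvironmentVariable("AZURE_TENANT_ID");
string clientId = Environment.GetEnvironmentVariable("AZURE_CLIENT_ID");
string adtInstanceUrl = Environment.GetEnvironmentVariable("ADT_INSTANCE_URL");

// Build the interactive browser credential from the environment values.
var credential = new InteractiveBrowserCredential(
    new InteractiveBrowserCredentialOptions { TenantId = tenantId, ClientId = clientId });

// Create the Azure Digital Twins SDK client.
var client = new DigitalTwinsClient(new Uri(adtInstanceUrl), credential);
```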

  ## Authenticate Azure Functions

articles/digital-twins/how-to-create-endpoints.md

Lines changed: 7 additions & 7 deletions
@@ -18,7 +18,7 @@ ms.custom: devx-track-azurecli
  This article explains how to create an *endpoint* for Azure Digital Twin events using the [Azure portal](https://portal.azure.com) or the [Azure CLI](/cli/azure/dt/endpoint). You can also manage endpoints with the [DigitalTwinsEndpoint control plane APIs](/rest/api/digital-twins/controlplane/endpoints).

- Routing [event notifications](concepts-event-notifications.md) from Azure Digital Twins to downstream services or connected compute resources is a two-step process: create endpoints, then create event routes to send data to those endpoints. This article covers the first step, setting up endpoints that can receive the events. Later, you can create [event routes](how-to-create-routes.md) that specify which events generated by Azure Digital Twins are delivered to which endpoints.
+ Routing [event notifications](concepts-event-notifications.md) from Azure Digital Twins to downstream services or connected compute resources is a two-step process: create endpoints, then create event routes that send data to those endpoints. This article covers the first step, setting up endpoints that can receive the events. Later, you can create [event routes](how-to-create-routes.md) that specify which events generated by Azure Digital Twins are delivered to which endpoints.
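As a hedged illustration of that two-step flow (not part of this commit; all resource names are placeholders), the CLI commands might look like this:

```azurecli
# Step 1: Create an endpoint for an existing event hub (all names are placeholders).
az dt endpoint create eventhub --dt-name <instance-name> --endpoint-name <endpoint-name> --eventhub <event-hub-name> --eventhub-namespace <event-hub-namespace> --eventhub-resource-group <resource-group> --eventhub-policy <policy-name>

# Step 2: Create an event route that sends events to that endpoint.
az dt route create --dt-name <instance-name> --endpoint-name <endpoint-name> --route-name <route-name>
```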

  ## Prerequisites

@@ -69,7 +69,7 @@ To create a new endpoint, go to your instance's page in the [Azure portal](https
  1. Enter a **Name** for your endpoint and choose the **Endpoint type**.

  1. Complete the other details that are required for your endpoint type, including your subscription and the endpoint resources described [earlier](#create-required-resources).
- 1. For Event Hubs and Service Bus endpoints only, select an **Authentication type**. You can use key-based authentication with a pre-created authorization rule, or a system-assigned or user-assigned managed identity. For more information about using the identity authentication options, see [Endpoint options: Identity-based authentication](#endpoint-options-identity-based-authentication).
+ 1. For Event Hubs and Service Bus endpoints only, select an **Authentication type**. You can use key-based authentication with a precreated authorization rule, or a system-assigned or user-assigned managed identity. For more information about using the identity authentication options, see [Endpoint options: Identity-based authentication](#endpoint-options-identity-based-authentication).

  :::image type="content" source="media/how-to-create-endpoints/create-endpoint-event-hub-authentication.png" alt-text="Screenshot of creating an endpoint of type Event Hubs in the Azure portal." lightbox="media/how-to-create-endpoints/create-endpoint-event-hub-authentication.png":::
  1. Finish creating your endpoint by selecting **Save**.
@@ -229,13 +229,13 @@ az dt endpoint create eventhub --endpoint-name <endpoint-name> --eventhub-resour
  ### Considerations for disabling managed identities

- Because an identity is managed separately from the endpoints that use it, it's important to consider the effects that any changes to the identity or its roles can have on the endpoints in your Azure Digital Twins instance. If you disable the identity or remove a necessary role for an endpoint, the endpoint becomes inaccessible and the flow of events is disrupted.
+ An identity is managed separately from the endpoints that use it. Because of this fact, it's important to consider how any change to the identity or its roles can affect the endpoints in your Azure Digital Twins instance. If you disable the identity or remove a necessary role for an endpoint, the endpoint becomes inaccessible, and the flow of events is disrupted.

  To continue using an endpoint that was set up with a managed identity that you disabled, delete the endpoint and [re-create it](#create-the-endpoint) with a different authentication type. It might take up to an hour for events to resume delivery to the endpoint after this change.

  ## Endpoint options: Dead-lettering

- When an endpoint can't deliver an event within a certain time period or after trying to deliver the event a certain number of times, it can send the undelivered event to a storage account. This process is known as *dead-lettering*.
+ When an endpoint can't deliver an event within a certain time period or number of tries, it can send the undelivered event to a storage account. This process is known as *dead-lettering*.

  You can set up the necessary storage resources using the [Azure portal](https://portal.azure.com/#home) or the [Azure Digital Twins CLI](/cli/azure/dt). However, to create an endpoint with dead-lettering enabled, you need to use the [Azure Digital Twins CLI](/cli/azure/dt) or [control plane APIs](concepts-apis-sdks.md#control-plane-overview).
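For instance (an illustrative sketch, not from this commit), a CLI invocation with dead-lettering enabled might resemble the following, assuming the `--deadletter-sas-uri` parameter and placeholder resource names:

```azurecli
# Placeholder names; the SAS URI follows the format described in the dead-letter section.
az dt endpoint create eventhub --dt-name <instance-name> --endpoint-name <endpoint-name> --eventhub <event-hub-name> --eventhub-namespace <event-hub-namespace> --eventhub-resource-group <resource-group> --eventhub-policy <policy-name> --deadletter-sas-uri "https://<storage-account-name>.blob.core.windows.net/<container-name>?<SAS-token>"
```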

@@ -247,10 +247,10 @@ Before setting the dead-letter location, you must have a [storage account](../st
  Provide the URI for this container when creating the endpoint. The dead-letter location is provided to the endpoint as a container URI with a [SAS token](../storage/common/storage-sas-overview.md). That token needs `write` permission for the destination container within the storage account. The fully formed dead letter SAS URI is in the format of: `https://<storage-account-name>.blob.core.windows.net/<container-name>?<SAS-token>`.

- Follow the steps in the following section to set up these storage resources in your Azure account. After setting up the storage resources, you can set up the endpoint connection.
+ To set up these storage resources in your Azure account, follow the steps in the following section. After setting up the storage resources, you can set up the endpoint connection.

- 1. Follow the steps in [Create a storage account](../storage/common/storage-account-create.md?tabs=azure-portal) to create a storage account in your Azure subscription. Make a note of the storage account name to use later.
- 1. Follow the steps in [Create a container](../storage/blobs/storage-quickstart-blobs-portal.md#create-a-container) to create a container within the new storage account. Make a note of the container name to use later.
+ 1. To create a storage account in your Azure subscription, follow the steps in [Create a storage account](../storage/common/storage-account-create.md?tabs=azure-portal). Make a note of the storage account name so you can use it later.
+ 1. To create a container within the new storage account, follow the steps in [Create a container](../storage/blobs/storage-quickstart-blobs-portal.md#create-a-container). Make a note of the container name so you can use it later.
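If you prefer the CLI over the portal for those two steps, a minimal sketch (not part of this commit; all names are placeholders) might be:

```azurecli
# Create the storage account, then a container within it (all names are placeholders).
az storage account create --name <storage-account-name> --resource-group <resource-group> --location <location>
az storage container create --account-name <storage-account-name> --name <container-name>
```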
  #### Create a SAS token

articles/iot-operations/get-started-end-to-end-sample/quickstart-get-insights.md

Lines changed: 4 additions & 4 deletions
@@ -27,7 +27,7 @@ You also need to meet the following Fabric requirements:
  ## What problem will we solve?

- When your OPC UA data arrives in the cloud, you have a lot of information available to analyze. You might want to organize that data and create reports containing graphs and visualizations to derive insights from the data. The steps in this quickstart illustrate how to connect that data to Real-Time Intelligence and create a Real-Time Dashboard.
+ When your OPC UA data arrives in the cloud, there's a lot of information available to analyze. You might want to organize that data and create reports containing graphs and visualizations to derive insights from the data. The steps in this quickstart illustrate how to connect that data to Real-Time Intelligence and create a Real-Time Dashboard.

  ## Ingest data into Real-Time Intelligence
@@ -86,7 +86,7 @@ In this section, you create a KQL database in your Microsoft Fabric workspace to
  :::image type="content" source="media/quickstart-get-insights/query-with-code.png" alt-text="Screenshot showing the Query with code button.":::

- 1. Clear the sample query, and run the following KQL query to create a data mapping for your table. The data mapping is called *opcua_mapping*.
+ 1. Clear the sample query, and run the following KQL query that creates a data mapping for your table. The data mapping is called *opcua_mapping*.

  ```kql
  .create table ['OPCUA'] ingestion json mapping 'opcua_mapping' '[{"column":"AssetId", "Properties":{"Path":"$[\'AssetId\']"}},{"column":"Spike", "Properties":{"Path":"$.Spike"}},{"column":"Temperature", "Properties":{"Path":"$.TemperatureF"}},{"column":"FillWeight", "Properties":{"Path":"$.FillWeight"}},{"column":"EnergyUse", "Properties":{"Path":"$.EnergyUse.Value"}},{"column":"Timestamp", "Properties":{"Path":"$[\'EventProcessedUtcTime\']"}}]'
@@ -128,7 +128,7 @@ Then, follow the steps to upload the dashboard template and connect it to your d
  1. In your Real-Time Dashboard, switch to the **Manage** tab and select **Replace with file**.
     :::image type="content" source="media/quickstart-get-insights/dashboard-upload-replace.png" alt-text="Screenshot of the buttons to upload a file template.":::
  1. Select the template file that you downloaded to your machine.
- 1. The template file populates the dashboard with multiple tiles, although the tiles can't get data since you haven't yet connected your data source.
+ 1. The template file populates the dashboard with multiple tiles, although the tiles can't get data because you haven't connected a data source yet.
     :::image type="content" source="media/quickstart-get-insights/dashboard-upload-errors.png" alt-text="Screenshot of the dashboard with errors in the visuals.":::
  1. From the **Manage** tab, select **Data sources**. This action opens the **Data sources** pane with a sample source for your AIO data. Select the pencil icon to edit the *AIOdata* data source.
     :::image type="content" source="media/quickstart-get-insights/dashboard-data-sources.png" alt-text="Screenshot of the buttons to connect a data source.":::
@@ -143,7 +143,7 @@ On the **Home** tab, select **Save** to save your dashboard.
  ### Explore dashboard

  You now have a dashboard that displays different types of visuals for the asset data in these quickstarts. The visuals included with the template are:
- * Parameters for your dashboard that allow all visuals to be filtered by timestamp (included by default) and asset ID.
+ * Parameters for your dashboard that enable filtering of all visuals by timestamp (included by default) and asset ID.
  * A line chart tile showing temperature and its spikes over time.
  * A stat tile showing a real-time spike indicator for temperature. The tile displays the most recent temperature value, and if that value is a spike, conditional formatting displays it as a warning.
  * A stat tile showing max temperature.
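As a hedged illustration (not part of this commit), a KQL query behind a tile like those temperature visuals might resemble the following, assuming the *OPCUA* table and columns created earlier and that *Spike* is a boolean column:

```kql
// Illustrative only: average temperature and spike count per asset over the last hour.
OPCUA
| where Timestamp > ago(1h)
| summarize AvgTemperature = avg(Temperature), SpikeCount = countif(Spike == true) by AssetId
| order by AvgTemperature desc
```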
