articles/digital-twins/concepts-ontologies-adopt.md
2 additions & 2 deletions
@@ -25,8 +25,8 @@ Microsoft partnered with domain experts to create DTDL model sets based on indus
| --- | --- | --- | --- |
| Smart buildings | [Digital Twins Definition Language-based RealEstateCore ontology for smart buildings](https://github.com/Azure/opendigitaltwins-building) | Microsoft partnered with [RealEstateCore](https://www.realestatecore.io/) to deliver this open-source DTDL ontology for the real estate industry. [RealEstateCore](https://www.realestatecore.io/) is a consortium of real estate owners, software vendors, and research institutions.<br><br>This smart buildings ontology provides common ground for modeling smart buildings, using industry standards (like [BRICK Schema](https://ontology.brickschema.org/) or [W3C Building Topology Ontology](https://w3c-lbd-cg.github.io/bot/index.html)) to avoid reinvention. The ontology also comes with best practices for how to consume and properly extend it. | To learn more about the partnership with RealEstateCore and goals for this initiative, see the following blog post and embedded video: [RealEstateCore, a smart building ontology for digital twins, is now available](https://techcommunity.microsoft.com/t5/internet-of-things/realestatecore-a-smart-building-ontology-for-digital-twins-is/ba-p/1914794). |
| Smart cities |[Digital Twins Definition Language (DTDL) ontology for Smart Cities](https://github.com/Azure/opendigitaltwins-smartcities)| Microsoft collaborated with [Open Agile Smart Cities (OASC)](https://oascities.org/) and [Sirus](https://sirus.be/) to provide a DTDL-based ontology for smart cities, starting with [ETSI CIM NGSI-LD](https://www.etsi.org/committee/cim). | To learn more about the partnerships and approach for smart cities, see the following blog post and embedded video: [Smart Cities Ontology for Digital Twins](https://techcommunity.microsoft.com/t5/internet-of-things/smart-cities-ontology-for-digital-twins/ba-p/2166585). |
-| Energy grids |[Digital Twins Definition Language (DTDL) ontology for Energy Grid](https://github.com/Azure/opendigitaltwins-energygrid/)| This ontology helps solution providers accelerate development of digital twin solutions for energy use cases like monitoring grid assets, outage and impact analysis, simulation, and predictive maintenance. Additionally, the ontology can be used to enable the digital transformation and modernization of the energy grid. It's adapted from the [Common Information Model (CIM)](https://cimug.ucaiug.org/), a global standard for energy grid assets management, power system operations modeling, and physical energy commodity market. | To learn more about the partnerships and approach for energy grids, see the following blog post: [Energy Grid Ontology for Digital Twins](https://techcommunity.microsoft.com/t5/internet-of-things/energy-grid-ontology-for-digital-twins-is-now-available/ba-p/2325134). |
-| Manufacturing |[Manufacturing Ontologies](https://github.com/digitaltwinconsortium/ManufacturingOntologies)| These ontologies help solution providers accelerate development of digital twin solutions for manufacturing use cases like asset condition monitoring, simulation, OEE calculation, and predictive maintenance. Additionally, the ontologies can be used to enable the digital transformation and modernization of factories and plants. They're adapted from [OPC UA](https://opcfoundation.org), [ISA95](https://www.isa.org/standards-and-publications/isa-standards/isa-standards-committees/isa95) and the [Asset Administration Shell](https://reference.opcfoundation.org/I4AAS/v100/docs/4.1), three global standards widely used in the manufacturing space. | Visit the repository to learn more about this ontology and explore a sample solution for ingesting OPC UA data into Azure Digital Twins. |
+| Energy grids |[Digital Twins Definition Language (DTDL) ontology for Energy Grid](https://github.com/Azure/opendigitaltwins-energygrid/)| This ontology helps solution providers accelerate development of digital twin solutions for energy use cases like monitoring grid assets, outage and impact analysis, simulation, and predictive maintenance. Additionally, the ontology can be used to enable the digital transformation and modernization of the energy grid. It's adapted from the [Common Information Model (CIM)](https://cimug.ucaiug.org/), which is a global standard for energy grid assets management, power system operations modeling, and physical energy commodity market. | To learn more about the partnerships and approach for energy grids, see the following blog post: [Energy Grid Ontology for Digital Twins](https://techcommunity.microsoft.com/t5/internet-of-things/energy-grid-ontology-for-digital-twins-is-now-available/ba-p/2325134). |
+| Manufacturing |[Manufacturing Ontologies](https://github.com/digitaltwinconsortium/ManufacturingOntologies)| These ontologies help solution providers accelerate development of digital twin solutions for manufacturing use cases like asset condition monitoring, simulation, OEE calculation, and predictive maintenance. Additionally, the ontologies can be used to enable the digital transformation and modernization of factories and plants. They're adapted from [OPC UA](https://opcfoundation.org), [ISA95](https://www.isa.org/standards-and-publications/isa-standards/isa-standards-committees/isa95), and the [Asset Administration Shell](https://reference.opcfoundation.org/I4AAS/v100/docs/4.1), three global standards widely used in the manufacturing space. | Visit the repository to learn more about this ontology and explore a sample solution for ingesting OPC UA data into Azure Digital Twins. |
-> While you can place the client ID, tenant ID, and instance URL directly into the code as shown in the previous example, it's a good idea to have your code get these values from a configuration file or environment variable instead.
+> Although the example above places the client ID, tenant ID, and instance URL directly into the code, it's a good idea to get these values from a configuration file or environment variable instead.
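The advice in this note can be sketched as follows — a minimal Python sketch, with hypothetical environment-variable names (`AZURE_CLIENT_ID`, `AZURE_TENANT_ID`, `ADT_INSTANCE_URL`) and config-file keys; substitute whatever names your deployment actually defines:

```python
import json
import os


def load_adt_config(path="adt_config.json"):
    """Resolve client ID, tenant ID, and instance URL from environment
    variables first, falling back to a JSON configuration file.
    Variable and key names here are illustrative, not required by
    Azure Digital Twins."""
    # Optional file-based fallback values.
    file_cfg = {}
    if os.path.exists(path):
        with open(path) as f:
            file_cfg = json.load(f)

    def resolve(env_name, file_key):
        value = os.environ.get(env_name) or file_cfg.get(file_key)
        if value is None:
            raise RuntimeError(
                f"Missing setting: set {env_name} or add '{file_key}' to {path}"
            )
        return value

    return {
        "client_id": resolve("AZURE_CLIENT_ID", "clientId"),
        "tenant_id": resolve("AZURE_TENANT_ID", "tenantId"),
        "instance_url": resolve("ADT_INSTANCE_URL", "instanceUrl"),
    }
```

The returned values can then be passed to your credential and client objects instead of being hard-coded into source files.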
articles/digital-twins/how-to-create-endpoints.md
7 additions & 7 deletions
@@ -18,7 +18,7 @@ ms.custom: devx-track-azurecli
This article explains how to create an *endpoint* for Azure Digital Twins events using the [Azure portal](https://portal.azure.com) or the [Azure CLI](/cli/azure/dt/endpoint). You can also manage endpoints with the [DigitalTwinsEndpoint control plane APIs](/rest/api/digital-twins/controlplane/endpoints).
-Routing [event notifications](concepts-event-notifications.md) from Azure Digital Twins to downstream services or connected compute resources is a two-step process: create endpoints, then create event routes to send data to those endpoints. This article covers the first step, setting up endpoints that can receive the events. Later, you can create [event routes](how-to-create-routes.md) that specify which events generated by Azure Digital Twins are delivered to which endpoints.
+Routing [event notifications](concepts-event-notifications.md) from Azure Digital Twins to downstream services or connected compute resources is a two-step process: create endpoints, then create event routes that send data to those endpoints. This article covers the first step, setting up endpoints that can receive the events. Later, you can create [event routes](how-to-create-routes.md) that specify which events generated by Azure Digital Twins are delivered to which endpoints.
## Prerequisites
@@ -69,7 +69,7 @@ To create a new endpoint, go to your instance's page in the [Azure portal](https
1. Enter a **Name** for your endpoint and choose the **Endpoint type**.
1. Complete the other details that are required for your endpoint type, including your subscription and the endpoint resources described [earlier](#create-required-resources).
-1. For Event Hubs and Service Bus endpoints only, select an **Authentication type**. You can use key-based authentication with a pre-created authorization rule, or a system-assigned or user-assigned managed identity. For more information about using the identity authentication options, see [Endpoint options: Identity-based authentication](#endpoint-options-identity-based-authentication).
+1. For Event Hubs and Service Bus endpoints only, select an **Authentication type**. You can use key-based authentication with a precreated authorization rule, or a system-assigned or user-assigned managed identity. For more information about using the identity authentication options, see [Endpoint options: Identity-based authentication](#endpoint-options-identity-based-authentication).
:::image type="content" source="media/how-to-create-endpoints/create-endpoint-event-hub-authentication.png" alt-text="Screenshot of creating an endpoint of type Event Hubs in the Azure portal." lightbox="media/how-to-create-endpoints/create-endpoint-event-hub-authentication.png":::
1. Finish creating your endpoint by selecting **Save**.
### Considerations for disabling managed identities
-Because an identity is managed separately from the endpoints that use it, it's important to consider the effects that any changes to the identity or its roles can have on the endpoints in your Azure Digital Twins instance. If you disable the identity or remove a necessary role for an endpoint, the endpoint becomes inaccessible and the flow of events is disrupted.
+An identity is managed separately from the endpoints that use it. Because of this fact, it's important to consider how any change to the identity or its roles can affect the endpoints in your Azure Digital Twins instance. If you disable the identity or remove a necessary role for an endpoint, the endpoint becomes inaccessible, and the flow of events is disrupted.
To continue using an endpoint that was set up with a managed identity that you disabled, delete the endpoint and [re-create it](#create-the-endpoint) with a different authentication type. It might take up to an hour for events to resume delivery to the endpoint after this change.
## Endpoint options: Dead-lettering
-When an endpoint can't deliver an event within a certain time period or after trying to deliver the event a certain number of times, it can send the undelivered event to a storage account. This process is known as *dead-lettering*.
+When an endpoint can't deliver an event within a certain time period or number of tries, it can send the undelivered event to a storage account. This process is known as *dead-lettering*.
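As a toy illustration of the dead-lettering pattern this paragraph describes (not Azure Digital Twins' actual implementation — the function and parameter names are our own), a delivery loop can bound its attempts and divert the event on final failure instead of dropping it:

```python
def deliver_with_deadletter(event, send, dead_letter_store, max_attempts=3):
    """Toy sketch of the dead-lettering pattern: try to deliver an event
    a bounded number of times; if every attempt fails, divert the event
    to a dead-letter store instead of dropping it.
    (Illustration only -- not how Azure Digital Twins is implemented.)"""
    for attempt in range(1, max_attempts + 1):
        try:
            send(event)
            return True  # delivered successfully
        except Exception:
            if attempt == max_attempts:
                # All attempts exhausted: dead-letter the undelivered event.
                dead_letter_store.append(event)
                return False
    return False  # unreachable; keeps the control flow explicit
```

In the real service, the role of `dead_letter_store` is played by the storage container you configure in the following sections.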
You can set up the necessary storage resources using the [Azure portal](https://portal.azure.com/#home) or the [Azure Digital Twins CLI](/cli/azure/dt). However, to create an endpoint with dead-lettering enabled, you need to use the [Azure Digital Twins CLI](/cli/azure/dt) or [control plane APIs](concepts-apis-sdks.md#control-plane-overview).
@@ -247,10 +247,10 @@ Before setting the dead-letter location, you must have a [storage account](../st
Provide the URI for this container when creating the endpoint. The dead-letter location is provided to the endpoint as a container URI with a [SAS token](../storage/common/storage-sas-overview.md). That token needs `write` permission for the destination container within the storage account. The fully formed dead-letter SAS URI has the format `https://<storage-account-name>.blob.core.windows.net/<container-name>?<SAS-token>`.
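For illustration, the URI format above can be assembled with a small helper (the function name and the token-normalization step are our own, not part of any Azure SDK):

```python
def deadletter_sas_uri(storage_account_name, container_name, sas_token):
    """Assemble a dead-letter container URI in the documented format:
    https://<storage-account-name>.blob.core.windows.net/<container-name>?<SAS-token>
    The SAS token must grant `write` permission on the container."""
    # Tolerate tokens copied from the portal with a leading '?'.
    sas_token = sas_token.lstrip("?")
    return (
        f"https://{storage_account_name}.blob.core.windows.net/"
        f"{container_name}?{sas_token}"
    )
```

You would pass the resulting string as the dead-letter SAS URI when creating the endpoint with the Azure Digital Twins CLI or control plane APIs.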
-Follow the steps in the following section to set up these storage resources in your Azure account. After setting up the storage resources, you can set up the endpoint connection.
+To set up these storage resources in your Azure account, follow the steps in the following section. After setting up the storage resources, you can set up the endpoint connection.
-1.Follow the steps in [Create a storage account](../storage/common/storage-account-create.md?tabs=azure-portal) to create a storage account in your Azure subscription. Make a note of the storage account name to use later.
-1.Follow the steps in [Create a container](../storage/blobs/storage-quickstart-blobs-portal.md#create-a-container) to create a container within the new storage account. Make a note of the container name to use later.
+1. To create a storage account in your Azure subscription, follow the steps in [Create a storage account](../storage/common/storage-account-create.md?tabs=azure-portal). Make a note of the storage account name so you can use it later.
+1. To create a container within the new storage account, follow the steps in [Create a container](../storage/blobs/storage-quickstart-blobs-portal.md#create-a-container). Make a note of the container name so you can use it later.
articles/iot-operations/get-started-end-to-end-sample/quickstart-get-insights.md
4 additions & 4 deletions
@@ -27,7 +27,7 @@ You also need to meet the following Fabric requirements:
## What problem will we solve?
-When your OPC UA data arrives in the cloud, you have a lot of information available to analyze. You might want to organize that data and create reports containing graphs and visualizations to derive insights from the data. The steps in this quickstart illustrate how to connect that data to Real-Time Intelligence and create a Real-Time Dashboard.
+When your OPC UA data arrives in the cloud, there's a lot of information available to analyze. You might want to organize that data and create reports containing graphs and visualizations to derive insights from the data. The steps in this quickstart illustrate how to connect that data to Real-Time Intelligence and create a Real-Time Dashboard.
## Ingest data into Real-Time Intelligence
@@ -86,7 +86,7 @@ In this section, you create a KQL database in your Microsoft Fabric workspace to
:::image type="content" source="media/quickstart-get-insights/query-with-code.png" alt-text="Screenshot showing the Query with code button.":::
-1. Clear the sample query, and run the following KQL query to create a data mapping for your table. The data mapping is called *opcua_mapping*.
+1. Clear the sample query, and run the following KQL query that creates a data mapping for your table. The data mapping is called *opcua_mapping*.
@@ -128,7 +128,7 @@ Then, follow the steps to upload the dashboard template and connect it to your d
1. In your Real-Time Dashboard, switch to the **Manage** tab and select **Replace with file**.
:::image type="content" source="media/quickstart-get-insights/dashboard-upload-replace.png" alt-text="Screenshot of the buttons to upload a file template.":::
1. Select the template file that you downloaded to your machine.
-1. The template file populates the dashboard with multiple tiles, although the tiles can't get data since you haven't yet connected your data source.
+1. The template file populates the dashboard with multiple tiles, although the tiles can't get data because you haven't connected a data source yet.
:::image type="content" source="media/quickstart-get-insights/dashboard-upload-errors.png" alt-text="Screenshot of the dashboard with errors in the visuals.":::
1. From the **Manage** tab, select **Data sources**. This action opens the **Data sources** pane with a sample source for your AIO data. Select the pencil icon to edit the *AIOdata* data source.
:::image type="content" source="media/quickstart-get-insights/dashboard-data-sources.png" alt-text="Screenshot of the buttons to connect a data source.":::
@@ -143,7 +143,7 @@ On the **Home** tab, select **Save** to save your dashboard.
### Explore dashboard
You now have a dashboard that displays different types of visuals for the asset data in these quickstarts. The visuals included with the template are:
-* Parameters for your dashboard that allow all visuals to be filtered by timestamp (included by default) and asset ID.
+* Parameters for your dashboard that enable filtering of all visuals by timestamp (included by default) and asset ID.
* A line chart tile showing temperature and its spikes over time.
* A stat tile showing a real-time spike indicator for temperature. The tile displays the most recent temperature value, and if that value is a spike, conditional formatting displays it as a warning.