
Commit 6c17a21

Merge pull request #293684 from baanders/1-27-aio
AIO: Update Fabric flow
2 parents 1b205ba + f7a1977 commit 6c17a21

9 files changed: +29 −38 lines changed


articles/iot-operations/get-started-end-to-end-sample/quickstart-get-insights.md

Lines changed: 29 additions & 38 deletions
@@ -6,7 +6,7 @@ ms.author: baanders
 ms.topic: quickstart
 ms.custom:
   - ignite-2023
-ms.date: 01/24/2025
+ms.date: 01/28/2025

 #CustomerIntent: As an OT user, I want to create a visual report for my processed OPC UA data that I can use to analyze and derive insights from it.
 ---
@@ -27,50 +27,51 @@ You also need to meet the following Fabric requirements:

 ## What problem will we solve?

-Once your OPC UA data has arrived in the cloud, you'll have a lot of information available to analyze. You might want to organize that data and create reports containing graphs and visualizations to derive insights from the data. The steps in this quickstart illustrate how you can connect that data to Real-Time Intelligence and create a Real-Time Dashboard.
+Once your OPC UA data has arrived in the cloud, you have a lot of information available to analyze. You might want to organize that data and create reports containing graphs and visualizations to derive insights from the data. The steps in this quickstart illustrate how you can connect that data to Real-Time Intelligence and create a Real-Time Dashboard.

 ## Ingest data into Real-Time Intelligence

 In this section, you set up a Microsoft Fabric *eventstream* to connect your event hub to a KQL database in Real-Time Intelligence. This process includes setting up a data mapping to transform the payload data from its JSON format to columns in KQL.

 ### Create an eventstream

-In this section, you create an eventstream that will be used to bring your data from Event Hubs into Microsoft Fabric Real-Time Intelligence, and eventually into a KQL database.
+In this section, you create an eventstream that's used to bring your data from Event Hubs into Microsoft Fabric Real-Time Intelligence, and eventually into a KQL database.

-Start by navigating to the [Real-Time hub in Microsoft Fabric](https://app.powerbi.com/workloads/oneriver/hub?experience=fabric-developer).
+Start by navigating to the [Real-Time hub in Microsoft Fabric](https://app.powerbi.com/workloads/oneriver/hub?experience=fabric-developer).

-Follow the steps in [Get events from Azure Event Hubs into Real-time hub](/fabric/real-time-hub/add-source-azure-event-hubs) to add your event hub as a data source for a new eventstream in your Fabric workspace. Keep the following notes in mind:
+Add your event hub as a data source for a new eventstream (for detailed instructions, see [Get events from Azure Event Hubs into Real-time hub](/fabric/real-time-hub/add-source-azure-event-hubs#microsoft-sources-page)). As you add the data source, keep the following notes in mind:

+* For **Azure Event Hub Key**, use the default selection (*RootManageSharedAccessKey*).
 * You can edit the **Eventstream name** to something friendly in the **Stream details** pane.
 * For **Connection**, create a new connection with Shared Access Key authentication.
 * Make sure local authentication is enabled on your Event Hubs namespace. You can set this from its Overview page in the Azure portal.
 * For **Consumer group**, use the default selection (*$Default*).
-* For **Data format**, choose *Json* (it might be selected already by default).
+* For **Data format**, use the default selection (*Json*).

-After creating the eventstream, open it to see it in the authoring canvas. Your Azure event hub is visible as a source for the eventstream.
+After connecting the eventstream, use the **Open Eventstream** button to see it in the authoring canvas. The stream from your Azure event hub is visible as an eventstream source.

 :::image type="content" source="media/quickstart-get-insights/source-added.png" alt-text="Screenshot of the eventstream with an AzureEventHub source.":::

 #### Verify dataflow

 Follow these steps to check your work so far, and make sure data is flowing into the eventstream.

-1. Start your cluster where you deployed Azure IoT Operations in earlier quickstarts. The OPC PLC simulator you deployed with your Azure IoT Operations instance should begin running and sending data. You can [verify that your event hub is receiving messages](quickstart-configure.md#verify-data-is-flowing-to-event-hubs) in the Azure portal.
+1. Start your cluster where you deployed Azure IoT Operations in earlier quickstarts. The OPC PLC simulator you deployed with your Azure IoT Operations instance should begin running and sending data. You can confirm this by [verifying that your event hub is receiving messages](quickstart-configure.md#verify-data-is-flowing-to-event-hubs) in the Azure portal.

-1. Wait a few minutes for data to propagate. Then, in the eventstream live view, select the Azure event hub source and refresh the **Data preview**. You should see JSON data from the simulator begin to appear in the table.
+1. Wait a few minutes for data to propagate. Then, in the eventstream live view, select the eventstream source and refresh the **Data preview**. You should see JSON data from the simulator begin to appear in the table.

 :::image type="content" source="media/quickstart-get-insights/source-added-data.png" alt-text="Screenshot of the eventstream with data from the AzureEventHub source.":::

 >[!TIP]
->If data has not arrived in your eventstream, you may want to check your event hub activity to This will help you isolate which section of the flow to debug.
+>If data isn't arriving in your eventstream, you may want to check your event hub activity to help you isolate which section of the flow to debug.

 ### Prepare KQL resources

 In this section, you create a KQL database in your Microsoft Fabric workspace to use as a destination for your data.

-1. Follow the steps in [Create an eventhouse](/fabric/real-time-intelligence/create-eventhouse#create-an-eventhouse-1) to create a Real-Time Intelligence eventhouse with a child KQL database. You only need to complete the section entitled **Create an eventhouse**.
+1. First, create a Real-Time Intelligence eventhouse (for detailed instructions, see [Create an eventhouse](/fabric/real-time-intelligence/create-eventhouse#create-an-eventhouse-1)). When the eventhouse is created, it automatically contains a default KQL database of the same name.

-1. Next, create a table in your database. Call it *OPCUA* and use the following columns.
+1. Next, create a new table in the default database in your eventhouse (for detailed instructions, see [Create an empty table in your KQL database](/fabric/real-time-intelligence/create-empty-table#create-an-empty-table-in-your-kql-database)). Name it *OPCUA* and manually enter the following schema.

 | Column name | Data type |
 | --- | --- |
@@ -81,51 +82,41 @@ In this section, you create a KQL database in your Microsoft Fabric workspace to
 | EnergyUse | decimal |
 | Timestamp | datetime |

-1. After the *OPCUA* table has been created, select it and use the **Explore your data** button to open a query window for the table.
+1. After the *OPCUA* table is created, select it and use the **Query with code** button to open any sample query in a new query window for the table.

-:::image type="content" source="media/quickstart-get-insights/explore-your-data.png" alt-text="Screenshot showing the Explore your data button.":::
+:::image type="content" source="media/quickstart-get-insights/query-with-code.png" alt-text="Screenshot showing the Query with code button.":::

-1. Run the following KQL query to create a data mapping for your table. The data mapping will be called *opcua_mapping*.
+1. Clear the sample query, and run the following KQL query to create a data mapping for your table. The data mapping is called *opcua_mapping*.

 ```kql
 .create table ['OPCUA'] ingestion json mapping 'opcua_mapping' '[{"column":"AssetId", "Properties":{"Path":"$[\'AssetId\']"}},{"column":"Spike", "Properties":{"Path":"$.Spike"}},{"column":"Temperature", "Properties":{"Path":"$.TemperatureF"}},{"column":"FillWeight", "Properties":{"Path":"$.FillWeight.Value"}},{"column":"EnergyUse", "Properties":{"Path":"$.EnergyUse.Value"}},{"column":"Timestamp", "Properties":{"Path":"$[\'EventProcessedUtcTime\']"}}]'
 ```
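
If you'd rather stay in the query window than use the table-creation UI, a management-command sketch along the following lines could create the *OPCUA* table and then confirm the mapping. The column types for *AssetId*, *Spike*, *Temperature*, and *FillWeight* aren't shown in this hunk, so the types below are assumptions, and each command should be run on its own.

```kql
// Sketch only: types for AssetId, Spike, Temperature, and FillWeight are assumed
// (string, bool, decimal, decimal); EnergyUse and Timestamp come from the schema above.
.create table ['OPCUA'] (AssetId: string, Spike: bool, Temperature: decimal, FillWeight: decimal, EnergyUse: decimal, Timestamp: datetime)

// Confirm that opcua_mapping is attached to the table (run as a separate command).
.show table ['OPCUA'] ingestion json mappings
```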

-### Add data table as a destination
+### Add eventstream data to KQL database

-Next, return to your eventstream view, where you can add your new KQL table as an eventstream destination.
+Next, add your eventstream as a data source for your KQL table (for detailed instructions, see [Get data from Eventstream](/fabric/real-time-intelligence/get-data-eventstream#source)). As you add the data source, keep the following notes in mind:

-Follow the steps in [Add a KQL Database destination to an eventstream](/fabric/real-time-intelligence/event-streams/add-destination-kql-database?pivots=standard-capabilities#direct-ingestion-mode) to add the destination. Keep the following notes in mind:
-
-- Use direct ingestion mode.
-- On the **Configure** step, select the *OPCUA* table that you created earlier.
-- On the **Inspect** step, open the **Advanced** options. Under **Mapping**, select **Existing mapping** and choose *opcua_mapping*.
+* Use the *OPCUA* table as the destination table and your eventstream as the source.
+* On the **Inspect** step, open the **Advanced** options. Under **Mapping**, select **Existing mapping** and choose *opcua_mapping*.

 :::image type="content" source="media/quickstart-get-insights/existing-mapping.png" alt-text="Screenshot adding an existing mapping.":::

->[!TIP]
->If no existing mappings are found, try refreshing the eventstream editor and restarting the steps to add the destination. Alternatively, you can initiate this same configuration process from the KQL table instead of from the eventstream, as described in [Get data from Eventstream](/fabric/real-time-intelligence/get-data-eventstream).
-
-After completing this flow, the KQL table is visible in the eventstream live view as a destination.
-
-Wait a few minutes for data to propagate. Then, select the KQL destination and refresh the **Data preview** to see the processed JSON data from the eventstream appearing in the table.
-
-:::image type="content" source="media/quickstart-get-insights/destination-added-data.png" alt-text="Screenshot of the eventstream with data in the KQL database destination.":::
+After you complete this setup, data begins to flow through your eventstream and is processed into your KQL table.

-If you want, you can also view and query this data in your KQL database directly.
+Wait a few minutes for data to propagate. Then, select the *OPCUA* table to see a preview of the data from the eventstream appearing in the table.

-:::image type="content" source="media/quickstart-get-insights/query-kql.png" alt-text="Screenshot of the same data being queried from the KQL database.":::
+:::image type="content" source="media/quickstart-get-insights/kql-data-preview.png" alt-text="Screenshot of the OPCUA table with data.":::
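
To double-check the ingestion with a query instead of the table preview, a quick KQL spot check along these lines works against the *OPCUA* table (the 15-minute window is an arbitrary choice):

```kql
// Spot check: confirm recent rows are landing and see which assets are reporting.
OPCUA
| where Timestamp > ago(15m)
| summarize Rows = count(), LatestReading = max(Timestamp) by AssetId
```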

 ## Create a Real-Time Dashboard

-In this section, you'll create a new [Real-Time Dashboard](/fabric/real-time-intelligence/dashboard-real-time-create) to visualize your quickstart data, and import a set of tiles from a sample dashboard template. The dashboard will allow filtering by asset ID and timestamp, and will display visual summaries of temperature, spike frequency, and other data.
+In this section, you create a new [Real-Time Dashboard](/fabric/real-time-intelligence/dashboard-real-time-create) to visualize your quickstart data, and import a set of tiles from a sample dashboard template. The dashboard allows filtering by asset ID and timestamp, and displays visual summaries of temperature, spike frequency, and other data.

 >[!NOTE]
 >You can only create Real-Time Dashboards if your tenant admin has enabled the creation of Real-Time Dashboards in your Fabric tenant. For more information, see [Enable tenant settings in the admin portal](/fabric/real-time-intelligence/dashboard-real-time-create#enable-tenant-settings-in-the-admin-portal).

 ### Create dashboard

-Follow the steps in the [Create a new dashboard](/fabric/real-time-intelligence/dashboard-real-time-create#create-a-new-dashboard) section to create a new Real-Time Dashboard from the Real-Time Intelligence capabilities.
+Navigate to your workspace and create a new Real-Time Dashboard from the Real-Time Intelligence capabilities (for detailed instructions, see [Create a new dashboard](/fabric/real-time-intelligence/dashboard-real-time-create#create-a-new-dashboard)).

 ### Upload template and connect data source

@@ -136,10 +127,10 @@ Then, follow the steps below to upload the dashboard template and connect it to
 :::image type="content" source="media/quickstart-get-insights/dashboard-upload-replace.png" alt-text="Screenshot of the buttons to upload a file template.":::
 1. Select the template file that you downloaded to your machine.
 1. The template file populates the dashboard with multiple tiles, although the tiles can't get data since you haven't yet connected your data source.
-:::image type="content" source="media/quickstart-get-insights/dashboard-upload-errors.png" alt-text="Screenshot of the dashboard with erroring visuals.":::
+:::image type="content" source="media/quickstart-get-insights/dashboard-upload-errors.png" alt-text="Screenshot of the dashboard with errors in the visuals.":::
 1. From the **Manage** tab, select **Data sources**. This opens the **Data sources** pane with a sample source for your AIO data. Select the pencil icon to edit the *AIOdata* data source.
 :::image type="content" source="media/quickstart-get-insights/dashboard-data-sources.png" alt-text="Screenshot of the buttons to connect a data source.":::
-1. Choose your database (it will be under **OneLake data hub**). When you're finished selecting your data source, select **Apply** and close the **Data sources** pane.
+1. Choose your database (it's under **OneLake data hub**). When you're finished selecting your data source, select **Apply** and close the **Data sources** pane.

 The visuals should populate with the data from your KQL database.

@@ -150,7 +141,7 @@ The visuals should populate with the data from your KQL database.
 Now you have a dashboard that displays different types of visuals for the asset data in these quickstarts. Here are the visuals included with the template:
 * Parameters for your dashboard that allow all visuals to be filtered by timestamp (included by default) and asset ID.
 * A line chart tile showing temperature and its spikes over time.
-* A stat tile showing a real-time spike indicator for temperature. The tile displays the most recent temperature value, and if that value is a spike, conditional formatting will display it as a warning.
+* A stat tile showing a real-time spike indicator for temperature. The tile displays the most recent temperature value, and if that value is a spike, conditional formatting displays it as a warning.
 * A stat tile showing max temperature.
 * A stat tile showing the number of spikes in the selected time frame.
 * A line chart tile showing temperature versus fill weight over time.
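
The queries behind these tiles ship inside the dashboard template and aren't part of this diff. As a rough illustration only, a spike-count tile could be backed by KQL along these lines, where `_startTime` and `_endTime` are assumed to be the dashboard's time range parameters:

```kql
// Illustrative only - the real query is defined in the dashboard template.
// _startTime and _endTime are assumed dashboard time range parameters.
OPCUA
| where Timestamp between (_startTime .. _endTime)
| where Spike == true
| summarize SpikeCount = count()
```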
@@ -169,4 +160,4 @@ Now that you're finished with the quickstart experience, this section contains i
 > [!NOTE]
 > The resource group contains the Event Hubs namespace you created in this quickstart.

-You can also delete your Microsoft Fabric workspace and/or all the resources within it associated with this quickstart, including the eventstream, eventhouse, and Real-Time Dashboard. Additionally, you may want to delete the dashboard template file that you downloaded to your computer.
+You can also delete your Microsoft Fabric workspace and/or all the resources within it associated with this quickstart, including the eventstream, eventhouse, and Real-Time Dashboard. Additionally, you might want to delete the dashboard template file that you downloaded to your computer.
