articles/iot-operations/connect-to-cloud/tutorial-adlsv2.md
6 additions & 2 deletions
@@ -7,15 +7,15 @@ ms.author: patricka
ms.topic: how-to
ms.date: 11/15/2024
-#CustomerIntent: As an operator, I want to learn how to build a dashboard in Microsoft Fabric using OPC UA data Azure IoT Operations.
+#CustomerIntent: As an operator, I want to send data from an OPC UA server to Azure Data Lake Storage Gen 2 using Azure IoT Operations so that I can store the data for further analysis and processing.
ms.service: azure-iot-operations
---
# Send data from an OPC UA server to Azure Data Lake Storage Gen 2
In the quickstart, you created a dataflow that sends data from Azure IoT Operations to Event Hubs, and then to Microsoft Fabric via EventStreams.
-However, it's also possible to send the data directly to a storage endpoint without using Event Hubs.
+However, it's also possible to send the data directly to a storage endpoint without using Event Hubs. This approach requires creating a Delta Lake schema that represents the data, uploading the schema to Azure IoT Operations, and then creating a dataflow that reads the data from the OPC UA server and writes it to the storage endpoint.
This tutorial builds on the quickstart setup and demonstrates how to bifurcate the data to Azure Data Lake Storage Gen 2. This approach allows you to store the data directly in a scalable and secure data lake, which can be used for further analysis and processing.
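The diff doesn't show the Delta Lake schema itself. As a purely illustrative sketch (the field names below are hypothetical, not taken from the article), a Delta Lake schema is expressed as Spark StructType JSON, which can be embedded in a Bicep multiline string for upload to the schema registry:

```bicep
// Hypothetical example of a Delta Lake schema (Spark StructType JSON).
// Field names are placeholders; use the fields your OPC UA data actually has.
var opcuaSchemaContent = '''
{
  "type": "struct",
  "fields": [
    { "name": "Timestamp", "type": "string", "nullable": true, "metadata": {} },
    { "name": "Temperature", "type": "double", "nullable": true, "metadata": {} }
  ]
}
'''
```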
@@ -203,6 +203,8 @@ az iot ops schema version list -g <RESOURCE_GROUP> --schema opcua-schema --regis
## Create dataflow endpoint
+The dataflow endpoint is the destination where the data is sent. In this case, the data is sent to Azure Data Lake Storage Gen 2. The authentication method is system-assigned managed identity, which you set up to have the right permissions to write to the storage account.
+
Create a dataflow endpoint using Bicep. Replace the placeholders with your values.
```bicep
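// The diff elides the Bicep body here. What follows is a minimal, hypothetical
// sketch, not the article's exact code: the resource types, API versions, and
// property names are assumptions about the Microsoft.IoTOperations dataflow
// endpoint API and may differ from the real template.
param aioInstanceName string = '<AIO_INSTANCE_NAME>'
param customLocationName string = '<CUSTOM_LOCATION_NAME>'
param host string = 'https://<ACCOUNT>.blob.core.windows.net'

resource aioInstance 'Microsoft.IoTOperations/instances@2024-11-01' existing = {
  name: aioInstanceName
}

resource customLocation 'Microsoft.ExtendedLocation/customLocations@2021-08-31-preview' existing = {
  name: customLocationName
}

resource adlsGen2Endpoint 'Microsoft.IoTOperations/instances/dataflowEndpoints@2024-11-01' = {
  parent: aioInstance
  name: 'adls-gen2-endpoint'
  extendedLocation: {
    name: customLocation.id
    type: 'CustomLocation'
  }
  properties: {
    endpointType: 'DataLakeStorage'
    dataLakeStorageSettings: {
      host: host
      authentication: {
        // System-assigned managed identity, as described above; the identity
        // needs write access to the storage account.
        method: 'SystemAssignedManagedIdentity'
        systemAssignedManagedIdentitySettings: {}
      }
    }
  }
}
```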
@@ -250,6 +252,8 @@ az deployment group create -g <RESOURCE_GROUP> --template-file adls-gen2-endpoin
## Create a dataflow
+To send data to Azure Data Lake Storage Gen 2, you need to create a dataflow that reads data from the OPC UA server and writes it to the storage account. No transformation is needed in this case, so the data is written as-is.
+
Create a dataflow using Bicep. Replace the placeholders with your values.
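The diff ends before showing this Bicep. As a hedged sketch of what such a dataflow might look like (the resource types, property names, and the topic, schema, and container values below are assumptions, not the article's actual code):

```bicep
param aioInstanceName string = '<AIO_INSTANCE_NAME>'
param customLocationName string = '<CUSTOM_LOCATION_NAME>'

resource aioInstance 'Microsoft.IoTOperations/instances@2024-11-01' existing = {
  name: aioInstanceName
}

resource customLocation 'Microsoft.ExtendedLocation/customLocations@2021-08-31-preview' existing = {
  name: customLocationName
}

resource defaultProfile 'Microsoft.IoTOperations/instances/dataflowProfiles@2024-11-01' existing = {
  parent: aioInstance
  name: 'default'
}

resource adlsDataflow 'Microsoft.IoTOperations/instances/dataflowProfiles/dataflows@2024-11-01' = {
  parent: defaultProfile
  name: 'adls-dataflow'
  extendedLocation: {
    name: customLocation.id
    type: 'CustomLocation'
  }
  properties: {
    mode: 'Enabled'
    operations: [
      {
        // Read OPC UA telemetry from the local MQTT broker endpoint.
        operationType: 'Source'
        sourceSettings: {
          endpointRef: 'default'
          dataSources: ['azure-iot-operations/data/<ASSET_NAME>']
        }
      }
      {
        // No mapping is applied; this stage only sets Delta serialization
        // against the uploaded schema, so the data passes through as-is.
        operationType: 'BuiltInTransformation'
        builtInTransformationSettings: {
          serializationFormat: 'Delta'
          schemaRef: 'aio-sr://<SCHEMA_NAMESPACE>/opcua-schema:1'
        }
      }
      {
        // Write to the ADLS Gen2 endpoint created in the previous section.
        operationType: 'Destination'
        destinationSettings: {
          endpointRef: 'adls-gen2-endpoint'
          dataDestination: '<CONTAINER_NAME>'
        }
      }
    ]
  }
}
```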