
Commit d9c7970

Some additional text
1 parent 4e11177 commit d9c7970

1 file changed: +6 −2 lines changed

articles/iot-operations/connect-to-cloud/tutorial-adlsv2.md

Lines changed: 6 additions & 2 deletions
@@ -7,15 +7,15 @@ ms.author: patricka
 ms.topic: how-to
 ms.date: 11/15/2024
 
-#CustomerIntent: As an operator, I want to learn how to build a dashboard in Microsoft Fabric using OPC UA data Azure IoT Operations.
+#CustomerIntent: As an operator, I want to send data from an OPC UA server to Azure Data Lake Storage Gen 2 using Azure IoT Operations so that I can store the data for further analysis and processing.
 ms.service: azure-iot-operations
 ---
 
 # Send data from an OPC UA server to Azure Data Lake Storage Gen 2
 
 In the quickstart, you created a dataflow that sends data from Azure IoT Operations to Event Hubs, and then to Microsoft Fabric via EventStreams.
 
-However, it's also possible to send the data directly to a storage endpoint without using Event Hubs.
+However, it's also possible to send the data directly to a storage endpoint without using Event Hubs. This approach requires creating a Delta Lake schema that represents the data, uploading the schema to Azure IoT Operations, and then creating a dataflow that reads the data from the OPC UA server and writes it to the storage endpoint.
 
 This tutorial builds on the quickstart setup and demonstrates how to bifurcate the data to Azure Data Lake Storage Gen 2. This approach allows you to store the data directly in a scalable and secure data lake, which can be used for further analysis and processing.
 
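The paragraph added in this hunk names three steps: define a Delta Lake schema for the data, upload it to the schema registry, and then create the dataflow. For orientation only, here is a minimal sketch of what such a schema definition can look like in Bicep; the telemetry field names, registry name, and API version are assumptions, not this commit's code:

```bicep
// Sketch only: field names, registry name, and API version are assumptions.
param schemaRegistryName string = '<SCHEMA_REGISTRY_NAME>'

// Delta Lake schema describing the OPC UA telemetry payload.
var opcuaSchemaContent = '''
{
  "$schema": "Delta/1.0",
  "type": "object",
  "properties": {
    "type": "struct",
    "fields": [
      { "name": "Temperature", "type": "double", "nullable": true, "metadata": {} },
      { "name": "Timestamp", "type": "string", "nullable": true, "metadata": {} }
    ]
  }
}
'''

resource schemaRegistry 'Microsoft.DeviceRegistry/schemaRegistries@2024-09-01-preview' existing = {
  name: schemaRegistryName
}

// Register the schema, then pin a concrete version that a dataflow can reference.
resource opcuaSchema 'Microsoft.DeviceRegistry/schemaRegistries/schemas@2024-09-01-preview' = {
  parent: schemaRegistry
  name: 'opcua-schema'
  properties: {
    format: 'Delta/1.0'
    schemaType: 'MessageSchema'
  }
}

resource opcuaSchemaVersion 'Microsoft.DeviceRegistry/schemaRegistries/schemas/schemaVersions@2024-09-01-preview' = {
  parent: opcuaSchema
  name: '1'
  properties: {
    schemaContent: opcuaSchemaContent
  }
}
```

The `az iot ops schema version list` command visible in the next hunk's context line is the kind of check that confirms the version landed in the registry.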

@@ -203,6 +203,8 @@ az iot ops schema version list -g <RESOURCE_GROUP> --schema opcua-schema --regis
 
 ## Create dataflow endpoint
 
+The dataflow endpoint is the destination where the data is sent. In this case, the data is sent to Azure Data Lake Storage Gen 2. The authentication method is a system-assigned managed identity, which you set up to have the right permissions to write to the storage account.
+
 Create a dataflow endpoint using Bicep. Replace the placeholders with your values.
 
 ```bicep
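The Bicep listing for the endpoint is cut off at the fence above by the diff context. As a minimal sketch of what an ADLS Gen 2 dataflow endpoint with system-assigned managed identity authentication can look like (the resource names, API versions, and host format are assumptions, not the file's actual listing):

```bicep
param aioInstanceName string = '<AIO_INSTANCE_NAME>'
param customLocationName string = '<CUSTOM_LOCATION_NAME>'
param storageAccountName string = '<STORAGE_ACCOUNT_NAME>'

resource aioInstance 'Microsoft.IoTOperations/instances@2024-11-01' existing = {
  name: aioInstanceName
}

resource customLocation 'Microsoft.ExtendedLocation/customLocations@2021-08-31-preview' existing = {
  name: customLocationName
}

// Endpoint pointing at the storage account. The instance's system-assigned
// managed identity must hold a write role (for example, Storage Blob Data
// Contributor) on the account, as the added paragraph notes.
resource adlsEndpoint 'Microsoft.IoTOperations/instances/dataflowEndpoints@2024-11-01' = {
  parent: aioInstance
  name: 'adls-gen2-endpoint'
  extendedLocation: {
    name: customLocation.id
    type: 'CustomLocation'
  }
  properties: {
    endpointType: 'DataLakeStorage'
    dataLakeStorageSettings: {
      host: 'https://${storageAccountName}.blob.core.windows.net'
      authentication: {
        method: 'SystemAssignedManagedIdentity'
        systemAssignedManagedIdentitySettings: {}
      }
    }
  }
}
```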
@@ -250,6 +252,8 @@ az deployment group create -g <RESOURCE_GROUP> --template-file adls-gen2-endpoin
 
 ## Create a dataflow
 
+To send data to Azure Data Lake Storage Gen 2, you need to create a dataflow that reads data from the OPC UA server and writes it to the storage account. No transformation is needed in this case, so the data is written as-is.
+
 Create a dataflow using Bicep. Replace the placeholders with your values.
 
 ```bicep
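The dataflow listing is likewise truncated at the fence above. A minimal sketch matching the added paragraph, reading from the default MQTT broker endpoint and writing the data as-is to the ADLS Gen 2 endpoint with the Delta schema applied as a pass-through, might look like this (profile name, topic path, schema reference format, and API versions are assumptions):

```bicep
param aioInstanceName string = '<AIO_INSTANCE_NAME>'
param customLocationName string = '<CUSTOM_LOCATION_NAME>'

resource aioInstance 'Microsoft.IoTOperations/instances@2024-11-01' existing = {
  name: aioInstanceName
}

resource customLocation 'Microsoft.ExtendedLocation/customLocations@2021-08-31-preview' existing = {
  name: customLocationName
}

resource defaultProfile 'Microsoft.IoTOperations/instances/dataflowProfiles@2024-11-01' existing = {
  parent: aioInstance
  name: 'default'
}

resource adlsDataflow 'Microsoft.IoTOperations/instances/dataflowProfiles/dataflows@2024-11-01' = {
  parent: defaultProfile
  name: 'adls-gen2-dataflow'
  extendedLocation: {
    name: customLocation.id
    type: 'CustomLocation'
  }
  properties: {
    mode: 'Enabled'
    operations: [
      {
        operationType: 'Source'
        sourceSettings: {
          endpointRef: 'default' // built-in MQTT broker endpoint
          dataSources: ['azure-iot-operations/data/<ASSET_NAME>']
        }
      }
      {
        // Pass-through map: applies the Delta schema without altering the payload.
        operationType: 'BuiltInTransformation'
        builtInTransformationSettings: {
          serializationFormat: 'Delta'
          schemaRef: 'aio-sr://<SCHEMA_NAMESPACE>/opcua-schema:1'
          map: [
            {
              inputs: ['*']
              output: '*'
            }
          ]
        }
      }
      {
        operationType: 'Destination'
        destinationSettings: {
          endpointRef: 'adls-gen2-endpoint'
          dataDestination: '<CONTAINER_NAME>' // target blob container for the Delta table
        }
      }
    ]
  }
}
```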
