
Commit 734c2d1

Merge pull request #290611 from jlian/fabric-tutorial
ADLSv2 tutorial
2 parents 24b723a + 5b55849

20 files changed: +402 -107 lines

articles/iot-operations/.openpublishing.redirection.iot-operations.json

Lines changed: 5 additions & 0 deletions
@@ -539,6 +539,11 @@
       "source_path_from_root": "/articles/iot-operations/reference/observability-metrics-mq.md",
       "redirect_url": "/azure/iot-operations/reference/observability-metrics-mqtt-broker",
       "redirect_document_id": false
+    },
+    {
+      "source_path_from_root": "/articles/iot-operations/view-analyze-telemetry/tutorial-real-time-dashboard-fabric.md",
+      "redirect_url": "/azure/iot-operations/end-to-end-tutorials/tutorial-add-assets",
+      "redirect_document_id": false
     }
   ]
 }

articles/iot-operations/connect-to-cloud/concept-schema-registry.md

Lines changed: 6 additions & 0 deletions
@@ -81,6 +81,12 @@ Delta:
 }
 ```
 
+### Generate a schema
+
+To generate the schema from a sample data file, use the [Schema Gen Helper](https://azure-samples.github.io/explore-iot-operations/schema-gen-helper/).
+
+For a tutorial that uses the schema generator, see [Tutorial: Send data from an OPC UA server to Azure Data Lake Storage Gen 2](./tutorial-opc-ua-to-data-lake.md).
+
 ## How dataflows use message schemas
 
 Message schemas are used in all three phases of a dataflow: defining the source input, applying data transformations, and creating the destination output.

articles/iot-operations/connect-to-cloud/howto-configure-adlsv2-endpoint.md

Lines changed: 2 additions & 1 deletion
@@ -449,4 +449,5 @@ dataLakeStorageSettings:
 
 ## Next steps
 
-To learn more about dataflows, see [Create a dataflow](howto-create-dataflow.md).
+- To learn more about dataflows, see [Create a dataflow](howto-create-dataflow.md).
+- To see a tutorial on how to use a dataflow to send data to Azure Data Lake Storage Gen2, see [Tutorial: Send data to Azure Data Lake Storage Gen2](./tutorial-opc-ua-to-data-lake.md).
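The new tutorial link above builds on the Azure Data Lake Storage Gen2 dataflow endpoint that this article configures (its `dataLakeStorageSettings` block is visible in the hunk header). As a rough sketch only (the API version, resource names, and system-assigned managed identity authentication below are illustrative assumptions, not part of this commit), such an endpoint looks roughly like this:

```yaml
apiVersion: connectivity.iotoperations.azure.com/v1
kind: DataflowEndpoint
metadata:
  name: adls                          # hypothetical endpoint name
  namespace: azure-iot-operations
spec:
  endpointType: DataLakeStorage
  dataLakeStorageSettings:
    # Placeholder storage account host; replace <account> with your own.
    host: https://<account>.blob.core.windows.net
    authentication:
      # Assumes the Azure IoT Operations managed identity already has
      # write access to the target container.
      method: SystemAssignedManagedIdentity
      systemAssignedManagedIdentitySettings: {}
```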

articles/iot-operations/connect-to-cloud/howto-configure-dataflow-endpoint.md

Lines changed: 5 additions & 0 deletions
@@ -28,6 +28,11 @@ Use the following table to choose the endpoint type to configure:
 | [Azure Data Explorer](howto-configure-adx-endpoint.md) | For uploading data to Azure Data Explorer databases. | No | Yes |
 | [Local storage](howto-configure-local-storage-endpoint.md) | For sending data to a locally available persistent volume, through which you can upload data via Azure Container Storage enabled by Azure Arc edge volumes. | No | Yes |
 
+> [!IMPORTANT]
+> Storage endpoints require a [schema for serialization](./concept-schema-registry.md). To use dataflow with Microsoft Fabric OneLake, Azure Data Lake Storage, Azure Data Explorer, or Local Storage, you must [specify a schema reference](./howto-create-dataflow.md#serialize-data-according-to-a-schema).
+>
+> To generate the schema from a sample data file, use the [Schema Gen Helper](https://azure-samples.github.io/explore-iot-operations/schema-gen-helper/).
+
 ## Dataflows must use local MQTT broker endpoint
 
 When you create a dataflow, you specify the source and destination endpoints. The dataflow moves data from the source endpoint to the destination endpoint. You can use the same endpoint for multiple dataflows, and you can use the same endpoint as both the source and destination in a dataflow.

articles/iot-operations/connect-to-cloud/howto-create-dataflow.md

Lines changed: 7 additions & 1 deletion
@@ -436,6 +436,9 @@ When using MQTT or Kafka as the source, you can specify a [schema](concept-schem
 
 If the source is an asset, the schema is automatically inferred from the asset definition.
 
+> [!TIP]
+> To generate the schema from a sample data file, use the [Schema Gen Helper](https://azure-samples.github.io/explore-iot-operations/schema-gen-helper/).
+
 To configure the schema used to deserialize the incoming messages from a source:
 
 # [Portal](#tab/portal)
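The tip above covers generating the schema; the steps that follow attach that schema to the dataflow source so incoming messages can be deserialized. A minimal sketch of the Kubernetes form, assuming the `serializationFormat`/`schemaRef` field names and the `aio-sr://` reference format used elsewhere in these articles (placeholder names, not content from this commit):

```yaml
sourceSettings:
  endpointRef: default                        # hypothetical MQTT broker endpoint
  serializationFormat: Json                   # incoming messages are JSON
  # Hypothetical reference to a schema in the schema registry, for example one
  # generated with the Schema Gen Helper and then uploaded.
  schemaRef: aio-sr://exampleNamespace/exampleSchema:1.0.0
  dataSources:
    - thermostats/+/telemetry/#               # example MQTT topic filter
```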
@@ -784,6 +787,9 @@ builtInTransformationSettings:
 
 If you want to serialize the data before sending it to the destination, you need to specify a schema and serialization format. Otherwise, the data is serialized in JSON with the types inferred. Storage endpoints like Microsoft Fabric or Azure Data Lake require a schema to ensure data consistency. Supported serialization formats are Parquet and Delta.
 
+> [!TIP]
+> To generate the schema from a sample data file, use the [Schema Gen Helper](https://azure-samples.github.io/explore-iot-operations/schema-gen-helper/).
+
 # [Portal](#tab/portal)
 
 For operations experience, you specify the schema and serialization format in the dataflow endpoint details. The endpoints that support serialization formats are Microsoft Fabric OneLake, Azure Data Lake Storage Gen 2, and Azure Data Explorer. For example, to serialize the data in Delta format, you need to upload a schema to the schema registry and reference it in the dataflow destination endpoint configuration.
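To make the above concrete: the serialization format and schema reference sit on the dataflow's built-in transformation, while the storage endpoint is referenced by the destination operation. A minimal sketch in the Kubernetes form, with placeholder names and an assumed `aio-sr://` schema reference (not content from this commit):

```yaml
operations:
  - operationType: BuiltInTransformation
    builtInTransformationSettings:
      serializationFormat: Delta              # Parquet is the other supported format
      # Hypothetical schema uploaded to the schema registry beforehand.
      schemaRef: aio-sr://exampleNamespace/exampleSchema:1.0.0
  - operationType: Destination
    destinationSettings:
      endpointRef: adls                       # hypothetical ADLS Gen2 dataflow endpoint
      dataDestination: telemetry              # hypothetical container to write to
```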
@@ -822,7 +828,7 @@ To configure a destination for the dataflow, specify the endpoint reference and
 To send data to a destination other than the local MQTT broker, create a dataflow endpoint. To learn how, see [Configure dataflow endpoints](howto-configure-dataflow-endpoint.md). If the destination isn't the local MQTT broker, it must be used as a source. To learn more about, see [Dataflows must use local MQTT broker endpoint](./howto-configure-dataflow-endpoint.md#dataflows-must-use-local-mqtt-broker-endpoint).
 
 > [!IMPORTANT]
-> Storage endpoints require a schema reference. If you've created storage destination endpoints for Microsoft Fabric OneLake, ADLS Gen 2, Azure Data Explorer and Local Storage, you must specify schema reference.
+> Storage endpoints require a [schema for serialization](./concept-schema-registry.md). To use dataflow with Microsoft Fabric OneLake, Azure Data Lake Storage, Azure Data Explorer, or Local Storage, you must [specify a schema reference](#serialize-data-according-to-a-schema).
 
 # [Portal](#tab/portal)

Five image files (111 KB, 133 KB, 116 KB, 60.2 KB, 99.8 KB); binary content not shown.
