articles/iot-operations/connect-to-cloud/concept-schema-registry.md (6 additions, 0 deletions)

@@ -81,6 +81,12 @@ Delta:
 }
 ```
 
+### Generate a schema
+
+To generate the schema from a sample data file, use the [Schema Gen Helper](https://azure-samples.github.io/explore-iot-operations/schema-gen-helper/).
+
+For a tutorial that uses the schema generator, see [Tutorial: Send data from an OPC UA server to Azure Data Lake Storage Gen 2](./tutorial-opc-ua-to-data-lake.md).
+
 ## How dataflows use message schemas
 
 Message schemas are used in all three phases of a dataflow: defining the source input, applying data transformations, and creating the destination output.
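
As a sketch of what the helper produces (the field names and values below are hypothetical, not taken from the tutorial), suppose you feed it a sample data file like this:

```json
{
  "assetId": "thermostat-01",
  "temperature": 22.5,
  "humidity": 41,
  "timestamp": "2024-06-03T12:00:00Z"
}
```

The helper can then emit a schema in the `Delta/1.0` format shown earlier in the article, ready to upload to the schema registry. Something along these lines, though the exact type mapping (for example, whether an integer becomes `long` or `integer`, or a timestamp stays a `string`) depends on the options you choose in the tool, so review the output before uploading:

```json
{
  "$schema": "Delta/1.0",
  "type": "object",
  "properties": {
    "type": "struct",
    "fields": [
      { "name": "assetId", "type": "string", "nullable": true, "metadata": {} },
      { "name": "temperature", "type": "double", "nullable": true, "metadata": {} },
      { "name": "humidity", "type": "long", "nullable": true, "metadata": {} },
      { "name": "timestamp", "type": "string", "nullable": true, "metadata": {} }
    ]
  }
}
```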
articles/iot-operations/connect-to-cloud/howto-configure-adlsv2-endpoint.md (2 additions, 1 deletion)

@@ -449,4 +449,5 @@ dataLakeStorageSettings:
 
 ## Next steps
 
-To learn more about dataflows, see [Create a dataflow](howto-create-dataflow.md).
+- To learn more about dataflows, see [Create a dataflow](howto-create-dataflow.md).
+- For a tutorial on using a dataflow to send data to Azure Data Lake Storage Gen2, see [Tutorial: Send data to Azure Data Lake Storage Gen2](./tutorial-opc-ua-to-data-lake.md).
articles/iot-operations/connect-to-cloud/howto-configure-dataflow-endpoint.md (5 additions, 0 deletions)

@@ -28,6 +28,11 @@ Use the following table to choose the endpoint type to configure:
 |[Azure Data Explorer](howto-configure-adx-endpoint.md)| For uploading data to Azure Data Explorer databases. | No | Yes |
 |[Local storage](howto-configure-local-storage-endpoint.md)| For sending data to a locally available persistent volume, through which you can upload data via Azure Container Storage enabled by Azure Arc edge volumes. | No | Yes |
 
+> [!IMPORTANT]
+> Storage endpoints require a [schema for serialization](./concept-schema-registry.md). To use a dataflow with Microsoft Fabric OneLake, Azure Data Lake Storage, Azure Data Explorer, or local storage, you must [specify a schema reference](./howto-create-dataflow.md#serialize-data-according-to-a-schema).
+>
+> To generate the schema from a sample data file, use the [Schema Gen Helper](https://azure-samples.github.io/explore-iot-operations/schema-gen-helper/).
+
 ## Dataflows must use local MQTT broker endpoint
 
 When you create a dataflow, you specify the source and destination endpoints. The dataflow moves data from the source endpoint to the destination endpoint. You can use the same endpoint for multiple dataflows, and you can use the same endpoint as both the source and destination in a dataflow.
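
For orientation on where the schema reference from the note above lands, here is a minimal sketch of a Dataflow resource. All names (`example-dataflow`, `adls`, the `aio-sr://` namespace, schema name, and version) are placeholders, and the exact `apiVersion` and property shapes can vary by release, so treat [Create a dataflow](howto-create-dataflow.md) as the authoritative reference:

```yaml
# Sketch only: a dataflow that serializes output with a registered schema
# before writing to a storage endpoint. All names are placeholders.
apiVersion: connectivity.iotoperations.azure.com/v1beta1
kind: Dataflow
metadata:
  name: example-dataflow
  namespace: azure-iot-operations
spec:
  profileRef: default
  operations:
    - operationType: Source
      sourceSettings:
        endpointRef: default                # local MQTT broker endpoint
        dataSources:
          - "thermostats/+/telemetry/#"
    - operationType: BuiltInTransformation
      builtInTransformationSettings:
        serializationFormat: Delta          # Parquet is the other storage format
        schemaRef: aio-sr://exampleNamespace/exampleSchema:1
    - operationType: Destination
      destinationSettings:
        endpointRef: adls                   # a storage endpoint such as ADLS Gen2
        dataDestination: telemetryTable
```

The `schemaRef` URI identifies a schema version in the schema registry, which is what the storage endpoint uses to lay out columns in formats like Delta and Parquet.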
articles/iot-operations/connect-to-cloud/howto-create-dataflow.md (7 additions, 1 deletion)

@@ -436,6 +436,9 @@ When using MQTT or Kafka as the source, you can specify a [schema](concept-schem
 
 If the source is an asset, the schema is automatically inferred from the asset definition.
 
+> [!TIP]
+> To generate the schema from a sample data file, use the [Schema Gen Helper](https://azure-samples.github.io/explore-iot-operations/schema-gen-helper/).
+
 To configure the schema used to deserialize the incoming messages from a source:
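
The configuration steps follow in the article. As a rough sketch only (assuming the schema settings live on the source operation, with placeholder names throughout), the relevant fragment of `spec.operations` might look like:

```yaml
# Sketch, not verbatim API: deserialize incoming MQTT or Kafka messages
# against a schema registered in the schema registry.
- operationType: Source
  sourceSettings:
    endpointRef: default                    # MQTT broker or Kafka endpoint
    dataSources:
      - "thermostats/+/telemetry/#"
    serializationFormat: Json               # format of the incoming payload
    schemaRef: aio-sr://exampleNamespace/exampleInputSchema:1
```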
@@ -786,4 +789,7 @@ If you want to serialize the data before sending it to the destination, you need to specify a schema and serialization format. Otherwise, the data is serialized in JSON with the types inferred. Storage endpoints like Microsoft Fabric or Azure Data Lake require a schema to ensure data consistency. Supported serialization formats are Parquet and Delta.
 
+> [!TIP]
+> To generate the schema from a sample data file, use the [Schema Gen Helper](https://azure-samples.github.io/explore-iot-operations/schema-gen-helper/).
+
 # [Portal](#tab/portal)
 
 In the operations experience, you specify the schema and serialization format in the dataflow endpoint details. The endpoints that support serialization formats are Microsoft Fabric OneLake, Azure Data Lake Storage Gen 2, and Azure Data Explorer. For example, to serialize the data in Delta format, you need to upload a schema to the schema registry and reference it in the dataflow destination endpoint configuration.
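
In Bicep or Kubernetes terms, the same choice is a sketch like the following (placeholder names; this assumes the serialization settings sit on the transformation stage that feeds the destination):

```yaml
# Sketch only: serialize transformed output as Delta using a registry schema,
# then write it to a storage endpoint. Names are placeholders.
- operationType: BuiltInTransformation
  builtInTransformationSettings:
    serializationFormat: Delta              # or Parquet
    schemaRef: aio-sr://exampleNamespace/exampleDeltaSchema:1
- operationType: Destination
  destinationSettings:
    endpointRef: fabric                     # OneLake, ADLS Gen2, or ADX endpoint
    dataDestination: sensorData
```

Without the schema and format, the output falls back to JSON with inferred types, which the storage endpoints listed above don't accept because they require a schema for data consistency.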
@@ -822,7 +828,7 @@ To configure a destination for the dataflow, specify the endpoint reference and
 To send data to a destination other than the local MQTT broker, create a dataflow endpoint. To learn how, see [Configure dataflow endpoints](howto-configure-dataflow-endpoint.md). If the destination isn't the local MQTT broker, the local MQTT broker must be used as the source. To learn more, see [Dataflows must use local MQTT broker endpoint](./howto-configure-dataflow-endpoint.md#dataflows-must-use-local-mqtt-broker-endpoint).
 
 > [!IMPORTANT]
-> Storage endpoints require a schema reference. If you've created storage destination endpoints for Microsoft Fabric OneLake, ADLS Gen 2, Azure Data Explorer and Local Storage, you must specify schema reference.
+> Storage endpoints require a [schema for serialization](./concept-schema-registry.md). To use a dataflow with Microsoft Fabric OneLake, Azure Data Lake Storage, Azure Data Explorer, or local storage, you must [specify a schema reference](#serialize-data-according-to-a-schema).