
Commit 7f18f8d

Jill Grant authored
Merge pull request #290612 from PatAltimore/patricka-dataflow-schema-release-aio-ga
Add operations experience serialization
2 parents: c23c6ea + df6e642

File tree

2 files changed: +3 −1 lines


articles/iot-operations/connect-to-cloud/howto-create-dataflow.md

Lines changed: 3 additions & 1 deletion
@@ -786,7 +786,9 @@ If you want to serialize the data before sending it to the destination, you need
 
 # [Portal](#tab/portal)
 
-Currently, specifying the output schema and serialization isn't supported in the operations experience.
+In the operations experience, you specify the schema and serialization format in the dataflow endpoint details. The endpoints that support serialization formats are Microsoft Fabric OneLake, Azure Data Lake Storage Gen2, and Azure Data Explorer. For example, to serialize the data in Delta format, upload a schema to the schema registry and reference it in the dataflow destination endpoint configuration.
+
+:::image type="content" source="media/howto-create-dataflow/destination-serialization.png" alt-text="Screenshot using the operations experience to set the dataflow destination endpoint serialization.":::
 
 # [Bicep](#tab/bicep)
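
For context, the added paragraph tells readers to reference a schema uploaded to the schema registry from the dataflow destination configuration. The following is a minimal Bicep sketch of what that reference could look like in a dataflow resource. It is not taken from this commit: the resource type, API version, and property names are assumptions based on the article's Bicep examples, and every name in it (the instance, dataflow, endpoint, topic, table, and `aio-sr://` schema reference) is hypothetical.

```bicep
// Minimal sketch only. Resource type, API version, and property names are
// assumptions drawn from the Azure IoT Operations dataflow docs; verify them
// against the Bicep tab of howto-create-dataflow.md before using.
param customLocationName string
param aioInstanceName string = 'aio-instance'     // hypothetical instance name

resource dataflow 'Microsoft.IoTOperations/instances/dataflowProfiles/dataflows@2024-11-01' = {
  // Three-part name: instance / dataflow profile / dataflow (all hypothetical).
  name: '${aioInstanceName}/default/example-delta-dataflow'
  extendedLocation: {
    name: customLocationName
    type: 'CustomLocation'
  }
  properties: {
    operations: [
      {
        operationType: 'Source'
        sourceSettings: {
          endpointRef: 'default'                    // assumed MQTT source endpoint
          dataSources: ['thermostats/+/telemetry']  // hypothetical topic filter
        }
      }
      {
        operationType: 'BuiltInTransformation'
        builtInTransformationSettings: {
          // Serialize to Delta using a schema previously uploaded to the
          // schema registry; the reference string is a hypothetical example.
          serializationFormat: 'Delta'
          schemaRef: 'aio-sr://exampleNamespace/exampleDeltaSchema:1.0.0'
        }
      }
      {
        operationType: 'Destination'
        destinationSettings: {
          endpointRef: 'fabric-onelake-endpoint'    // hypothetical OneLake endpoint
          dataDestination: 'telemetryTable'         // hypothetical table name
        }
      }
    ]
  }
}
```

The idea mirrors the portal text: the `schemaRef` points at a schema already uploaded to the schema registry, while the serialization format selects how data is written to the OneLake, Azure Data Lake Storage Gen2, or Azure Data Explorer endpoint.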

articles/iot-operations/connect-to-cloud/media/howto-create-dataflow/destination-serialization.png

43.2 KB (binary image file)
0 commit comments
