Commit 3409180

Merge pull request #290906 from PatAltimore/patricka-create-dataflow-fixes
Add serialization screenshot for storage endpoint
2 parents a32f77c + 0f03087 commit 3409180

File tree

2 files changed: +4 -0 lines

articles/iot-operations/connect-to-cloud/howto-create-dataflow.md

Lines changed: 4 additions & 0 deletions
@@ -846,6 +846,10 @@ To send data to a destination other than the local MQTT broker, create a dataflo
 
 :::image type="content" source="media/howto-create-dataflow/dataflow-destination.png" alt-text="Screenshot using operations experience to select Event Hubs destination endpoint.":::
 
+Storage endpoints require a [schema for serialization](./concept-schema-registry.md). If you choose a Microsoft Fabric OneLake, Azure Data Lake Storage, Azure Data Explorer, or Local Storage destination endpoint, you must [specify a schema reference](#serialize-data-according-to-a-schema). For example, to serialize the data to a Microsoft Fabric endpoint in Delta format, you need to upload a schema to the schema registry and reference it in the dataflow destination endpoint configuration.
+
+:::image type="content" source="media/howto-create-dataflow/serialization-schema.png" alt-text="Screenshot using operations experience to choose output schema and serialization format.":::
+
 1. Select **Proceed** to configure the destination.
 1. Enter the required settings for the destination, including the topic or table to send the data to. See [Configure data destination (topic, container, or table)](#configure-data-destination-topic-container-or-table) for more information.
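
Context for the added paragraph: the linked article configures dataflows as Kubernetes custom resources, so the schema reference chosen in the operations experience corresponds to settings in that resource. Below is a minimal, illustrative YAML sketch of a dataflow that serializes to Delta for a Microsoft Fabric destination by referencing a schema uploaded to the schema registry. The field names (builtInTransformationSettings, serializationFormat, schemaRef, destinationSettings), the aio-sr:// reference format, and the endpoint, topic, and table names are assumptions for illustration; verify them against howto-create-dataflow.md and concept-schema-registry.md.

apiVersion: connectivity.iotoperations.azure.com/v1   # API group/version is an assumption; may differ by release
kind: Dataflow
metadata:
  name: fabric-delta-dataflow            # hypothetical name
  namespace: azure-iot-operations
spec:
  profileRef: default
  operations:
    # Read from the local MQTT broker endpoint
    - operationType: Source
      sourceSettings:
        endpointRef: default
        dataSources:
          - thermostats/+/telemetry      # example topic filter
    # Serialize to Delta using a schema uploaded to the schema registry
    - operationType: BuiltInTransformation
      builtInTransformationSettings:
        serializationFormat: Delta
        schemaRef: aio-sr://exampleNamespace/exampleDeltaSchema:1.0.0   # hypothetical schema reference
    # Write to the Microsoft Fabric OneLake endpoint; storage destinations need the schema above
    - operationType: Destination
      destinationSettings:
        endpointRef: fabric-endpoint     # hypothetical dataflow endpoint name
        dataDestination: telemetryTable  # table in the lakehouse

In the operations experience, the same choice surfaces as the output schema and serialization format picker shown in the new serialization-schema.png screenshot.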

articles/iot-operations/connect-to-cloud/media/howto-create-dataflow/serialization-schema.png (new screenshot, 31.6 KB)
