
Commit 26847ff

committed
sample schemas
1 parent 824c89a commit 26847ff

File tree

1 file changed: +70 −2 lines changed

articles/iot-operations/connect-to-cloud/concept-schema-registry.md

Lines changed: 70 additions & 2 deletions
@@ -19,6 +19,74 @@ Edge services use message schemas to filter and transform messages as they're ro
*Schemas* are documents that describe data to enable processing and contextualization. *Message schemas* describe the format of a message and its contents.

## Message schema definitions

Schema registry expects the following required fields in a message schema:

| Required field | Definition |
| -------------- | ---------- |
| `$schema` | Either `http://json-schema.org/draft-07/schema#` or `Delta/1.0`. In dataflows, JSON schemas are used for source endpoints and Delta schemas are used for destination endpoints. |
| `type` | `Object` |
| `properties` | The message definition. |

Additionally, you can include optional fields in your message schema as long as they conform with the schema type. For example:

| Optional field | Definition |
| -------------- | ---------- |
| `name` | A name for the schema. |
| `description` | A description of the schema. |
| `required` | A list of properties required to be present in the messages. |
### Sample schemas
The following sample schemas provide examples for defining message schemas in each format.

JSON:
```json
{
  "$schema": "http://json-schema.org/draft-07/schema#",
  "name": "foobarbaz",
  "description": "A representation of an event",
  "type": "object",
  "required": [ "dtstart", "summary" ],
  "properties": {
    "summary": {
      "type": "string"
    },
    "location": {
      "type": "string"
    },
    "url": {
      "type": "string"
    },
    "duration": {
      "type": "string",
      "description": "Event duration"
    }
  }
}
```
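Before publishing, a payload can be checked against the JSON sample above. The following is a minimal, standard-library-only sketch of that check (in practice a full validator such as the `jsonschema` package would be used); the `event` payload is a made-up example:

```python
import json

# The JSON sample schema from above.
schema = json.loads("""
{
  "$schema": "http://json-schema.org/draft-07/schema#",
  "name": "foobarbaz",
  "description": "A representation of an event",
  "type": "object",
  "required": [ "dtstart", "summary" ],
  "properties": {
    "summary": { "type": "string" },
    "location": { "type": "string" },
    "url": { "type": "string" },
    "duration": { "type": "string", "description": "Event duration" }
  }
}
""")

# Only the two JSON Schema types this sample needs.
TYPE_MAP = {"string": str, "object": dict}

def check(message, schema):
    """Return a list of problems; an empty list means the message passes the
    required-field and property-type checks (a small subset of JSON Schema)."""
    problems = ["missing required field: " + f
                for f in schema.get("required", []) if f not in message]
    for name, spec in schema.get("properties", {}).items():
        if name in message and not isinstance(message[name], TYPE_MAP[spec["type"]]):
            problems.append(name + ": expected " + spec["type"])
    return problems

event = {"dtstart": "2025-06-01T09:00:00Z", "summary": "Maintenance window"}
print(check(event, schema))            # [] -> message passes
print(check({"summary": 42}, schema))  # missing field and wrong type reported
```

This is only a sketch of what schema validation does with the required fields; it doesn't implement the full draft-07 vocabulary.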
Delta:

```delta
{
  "$schema": "Delta/1.0",
  "type": "object",
  "properties": {
    "type": "struct",
    "fields": [
      { "name": "asset_id", "type": "string", "nullable": false, "metadata": {} },
      { "name": "asset_name", "type": "string", "nullable": false, "metadata": {} },
      { "name": "location", "type": "string", "nullable": false, "metadata": {} },
      { "name": "manufacturer", "type": "string", "nullable": false, "metadata": {} },
      { "name": "production_date", "type": "string", "nullable": false, "metadata": {} },
      { "name": "serial_number", "type": "string", "nullable": false, "metadata": {} },
      { "name": "temperature", "type": "double", "nullable": false, "metadata": {} }
    ]
  }
}
```
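The `fields` array in a Delta schema maps one-to-one onto table columns. As an illustrative sketch only (not an official tool, and covering just the two type names the sample uses), a SQL-style column list can be derived from it:

```python
# Illustrative sketch: derive SQL-style column declarations from a Delta
# schema's field list. SQL_TYPES maps only the type names from the sample.
SQL_TYPES = {"string": "STRING", "double": "DOUBLE"}

def columns(delta_schema):
    return [
        "{} {}{}".format(f["name"], SQL_TYPES[f["type"]],
                         "" if f["nullable"] else " NOT NULL")
        for f in delta_schema["properties"]["fields"]
    ]

# Shortened copy of the Delta sample above (two of its seven fields).
sample = {
    "$schema": "Delta/1.0",
    "type": "object",
    "properties": {
        "type": "struct",
        "fields": [
            {"name": "asset_id", "type": "string", "nullable": False, "metadata": {}},
            {"name": "temperature", "type": "double", "nullable": False, "metadata": {}},
        ],
    },
}

print(columns(sample))
# ['asset_id STRING NOT NULL', 'temperature DOUBLE NOT NULL']
```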
## How dataflows use message schemas

Message schemas are used in all three phases of a dataflow: defining the source input, applying data transformations, and creating the destination output.
@@ -29,7 +97,7 @@ Each dataflow source requires a message schema.
Asset sources have a predefined message schema that was created by the connector for OPC UA.

-MQTT sources require an uploaded message schema. Azure IoT Operations supports JSON schemas, and the filename is used as the schema name. In the operations experience, you can select an existing schema or upload one while defining an MQTT source:
+MQTT sources require an uploaded message schema. Currently, Azure IoT Operations supports JSON for input schemas. In the operations experience, you can select an existing schema or upload one while defining an MQTT source:

:::image type="content" source="./media/concept-schema-registry/upload-schema.png" alt-text="Screenshot that shows uploading a message schema in the operations experience portal.":::

@@ -39,7 +107,7 @@ The operations experience uses the input schema as a starting point for your dat
### Output schema

-Schemas are only used for dataflows that select Fabric or ADX as the destination endpoint.
+Schemas are only used for dataflows that select local storage, Fabric, Azure Data Lake, or Azure Data Explorer as the destination endpoint. Currently, Azure IoT Operations supports Delta Parquet for output schemas.

For these dataflows, the operations experience applies any transformations to the input schema then creates a new schema in Delta format. When the dataflow custom resource (CR) is created, it includes a `schemaRef` value that points to the generated schema stored in the schema registry.
