Commit 231aeb3: Validation scrub

1 parent 07a98e0 commit 231aeb3

14 files changed (+360, -363 lines)

articles/iot-operations/connect-to-cloud/concept-dataflow-conversions.md

Lines changed: 8 additions & 8 deletions

````diff
@@ -5,7 +5,7 @@ author: PatAltimore
 ms.author: patricka
 ms.subservice: azure-data-flows
 ms.topic: concept-article
-ms.date: 08/03/2024
+ms.date: 10/30/2024
 
 #CustomerIntent: As an operator, I want to understand how to use dataflow conversions to transform data.
 ms.service: azure-iot-operations
@@ -157,7 +157,7 @@ Arrays can be processed by using aggregation functions to compute a single value
 
 ```json
 {
-  "Measurements": [2.34, 12.3, 32.4]
+  "Measurements": [2.34, 12.3, 32.4]
 }
 ```
 
@@ -251,11 +251,11 @@ Example mapping that uses a missing value:
 
 ```json
 {
-  "Employment": {
-    "Position": "Analyst",
-    "BaseSalary": 75000,
-    "WorkingHours": "Regular"
-  }
+  "Employment": {
+    "Position": "Analyst",
+    "BaseSalary": 75000,
+    "WorkingHours": "Regular"
+  }
 }
 ```
 
@@ -287,7 +287,7 @@ expression: 'if($1 == (), $2, $1)'
 ```yaml
 - inputs:
   - BaseSalary # - - - - - - - - - - $1
-  - $context(position).BaseSalary # - $2
+  - $context(position).BaseSalary # - $2
 output: BaseSalary
 expression: if($1 == (), $2, $1)
 ```
````

articles/iot-operations/connect-to-cloud/concept-dataflow-enrich.md

Lines changed: 23 additions & 20 deletions

````diff
@@ -5,7 +5,7 @@ author: PatAltimore
 ms.author: patricka
 ms.subservice: azure-data-flows
 ms.topic: concept-article
-ms.date: 08/13/2024
+ms.date: 10/30/2024
 
 #CustomerIntent: As an operator, I want to understand how to create a dataflow to enrich data sent to endpoints.
 ms.service: azure-iot-operations
@@ -41,8 +41,8 @@ datasets: [
 {
   key: 'position',
   inputs: [
-    '$source.Position' // - $1
-    '$context.Position' // -$2
+    '$source.Position' // - $1
+    '$context.Position' // - $2
   ],
   expression: '$1 == $2'
 }
@@ -55,8 +55,8 @@ datasets: [
 datasets:
 - key: position
   inputs:
-  - $source.Position # - $1
-  - $context.Position # -$2
+  - $source.Position # - $1
+  - $context.Position # - $2
   expression: $1 == $2
 ```
 
@@ -70,17 +70,20 @@ When a new record is being processed, the mapper performs the following steps:
 # [Bicep](#tab/bicep)
 
 ```bicep
-inputs: [
-  '$context(position).WorkingHours' // - $1
-]
-output: 'WorkingHours'
-
-inputs: [
-  'BaseSalary' // - - - - - - - - - - - - $1
-  '$context(position).BaseSalary' // - - $2
-]
-output: 'BaseSalary'
-expression: 'if($1 == (), $2, $1)'
+{
+  inputs: [
+    '$context(position).WorkingHours' // - $1
+  ]
+  output: 'WorkingHours'
+}
+{
+  inputs: [
+    'BaseSalary' // - - - - - - - - - - - - $1
+    '$context(position).BaseSalary' // - - $2
+  ]
+  output: 'BaseSalary'
+  expression: 'if($1 == (), $2, $1)'
+}
 ```
 
 # [Kubernetes](#tab/kubernetes)
@@ -108,15 +111,15 @@ It's possible to use multiple datasets:
 ```bicep
 datasets: [
   {
-    key: 'position',
+    key: 'position'
     inputs: [
       '$source.Position' // - $1
       '$context.Position' // - $2
    ],
    expression: '$1 == $2'
-  },
+  }
   {
-    key: 'permissions',
+    key: 'permissions'
     inputs: [
       '$source.Position' // - $1
       '$context.Position' // - $2
@@ -173,7 +176,7 @@ The input references use the key of the dataset like `position` or `permission`.
 ```bicep
 datasets: [
   {
-    key: 'datasets.parag10.rule42 as position',
+    key: 'datasets.parag10.rule42 as position'
     inputs: [
       '$source.Position' // - $1
       '$context.Position' // - $2
````

articles/iot-operations/connect-to-cloud/concept-dataflow-mapping.md

Lines changed: 79 additions & 63 deletions

````diff
@@ -5,7 +5,7 @@ author: PatAltimore
 ms.author: patricka
 ms.subservice: azure-data-flows
 ms.topic: concept-article
-ms.date: 09/24/2024
+ms.date: 10/30/2024
 ai-usage: ai-assisted
 
 #CustomerIntent: As an operator, I want to understand how to use the dataflow mapping language to transform data.
@@ -66,22 +66,26 @@ The following mapping is an example:
 # [Bicep](#tab/bicep)
 
 ```bicep
-inputs: [
-  'BirthDate'
-]
-output: 'Employee.DateOfBirth'
-
-inputs: [
-  'Position' // - - - $1
-  'Office' // - - - - $2
-]
-output: 'Employment.Position'
-expression: '$1 + ", " + $2'
-
-inputs: [
-  '$context(position).BaseSalary'
-]
-output: 'Employment.BaseSalary'
+{
+  inputs: [
+    'BirthDate'
+  ]
+  output: 'Employee.DateOfBirth'
+}
+{
+  inputs: [
+    'Position' // - - - - $1
+    'Office' // - - - - $2
+  ]
+  output: 'Employment.Position'
+  expression: '$1 + ", " + $2'
+}
+{
+  inputs: [
+    '$context(position).BaseSalary'
+  ]
+  output: 'Employment.BaseSalary'
+}
 ```
 
 # [Kubernetes](#tab/kubernetes)
@@ -403,15 +407,18 @@ Mapping configuration that uses wildcards:
 # [Bicep](#tab/bicep)
 
 ```bicep
-inputs: [
-  'ColorProperties.*'
-]
-output: '*'
-
-inputs: [
-  'TextureProperties.*'
-]
-output: '*'
+{
+  inputs: [
+    'ColorProperties.*'
+  ]
+  output: '*'
+}
+{
+  inputs: [
+    'TextureProperties.*'
+  ]
+  output: '*'
+}
 ```
 
 # [Kubernetes](#tab/kubernetes)
@@ -622,19 +629,22 @@ When you use the previous example from multi-input wildcards, consider the follo
 # [Bicep](#tab/bicep)
 
 ```bicep
-inputs: [
-  '*.Max' // - $1
-  '*.Min' // - $2
-]
-output: 'ColorProperties.*.Avg'
-expression: '($1 + $2) / 2'
-
-inputs: [
-  '*.Max' // - $1
-  '*.Min' // - $2
-]
-output: 'ColorProperties.*.Diff'
-expression: 'abs($1 - $2)'
+{
+  inputs: [
+    '*.Max' // - $1
+    '*.Min' // - $2
+  ]
+  output: 'ColorProperties.*.Avg'
+  expression: '($1 + $2) / 2'
+}
+{
+  inputs: [
+    '*.Max' // - $1
+    '*.Min' // - $2
+  ]
+  output: 'ColorProperties.*.Diff'
+  expression: 'abs($1 - $2)'
+}
 ```
 
 # [Kubernetes](#tab/kubernetes)
@@ -683,19 +693,22 @@ Now, consider a scenario where a specific field needs a different calculation:
 # [Bicep](#tab/bicep)
 
 ```bicep
-inputs: [
-  '*.Max' // - $1
-  '*.Min' // - $2
-]
-output: 'ColorProperties.*'
-expression: '($1 + $2) / 2'
-
-inputs: [
-  'Opacity.Max' // - $1
-  'Opacity.Min' // - $2
-]
-output: 'ColorProperties.OpacityAdjusted'
-expression: '($1 + $2 + 1.32) / 2'
+{
+  inputs: [
+    '*.Max' // - $1
+    '*.Min' // - $2
+  ]
+  output: 'ColorProperties.*'
+  expression: '($1 + $2) / 2'
+}
+{
+  inputs: [
+    'Opacity.Max' // - $1
+    'Opacity.Min' // - $2
+  ]
+  output: 'ColorProperties.OpacityAdjusted'
+  expression: '($1 + $2 + 1.32) / 2'
+}
 ```
 
 # [Kubernetes](#tab/kubernetes)
@@ -726,17 +739,20 @@ Consider a special case for the same fields to help decide the right action:
 # [Bicep](#tab/bicep)
 
 ```bicep
-inputs: [
-  '*.Max' // - $1
-  '*.Min' // - $2
-]
-output: 'ColorProperties.*'
-expression: '($1 + $2) / 2'
-
-inputs: [
-  'Opacity.Max' // - $1
-  'Opacity.Min' // - $2
-]
+{
+  inputs: [
+    '*.Max' // - $1
+    '*.Min' // - $2
+  ]
+  output: 'ColorProperties.*'
+  expression: '($1 + $2) / 2'
+}
+{
+  inputs: [
+    'Opacity.Max' // - $1
+    'Opacity.Min' // - $2
+  ]
+}
 ```
 
 # [Kubernetes](#tab/kubernetes)
````

articles/iot-operations/connect-to-cloud/concept-schema-registry.md

Lines changed: 17 additions & 27 deletions

````diff
@@ -4,7 +4,7 @@ description: Learn how schema registry handles message schemas to work with Azur
 author: kgremban
 ms.author: kgremban
 ms.topic: conceptual
-ms.date: 09/23/2024
+ms.date: 10/30/2024
 
 #CustomerIntent: As an operator, I want to understand how I can use message schemas to filter and transform messages.
 ---
@@ -64,20 +64,20 @@ Delta:
 
 ```delta
 {
-  "$schema": "Delta/1.0",
-  "type": "object",
-  "properties": {
-    "type": "struct",
-    "fields": [
-      { "name": "asset_id", "type": "string", "nullable": false, "metadata": {} },
-      { "name": "asset_name", "type": "string", "nullable": false, "metadata": {} },
-      { "name": "location", "type": "string", "nullable": false, "metadata": {} },
-      { "name": "manufacturer", "type": "string", "nullable": false, "metadata": {} },
-      { "name": "production_date", "type": "string", "nullable": false, "metadata": {} },
-      { "name": "serial_number", "type": "string", "nullable": false, "metadata": {} },
-      { "name": "temperature", "type": "double", "nullable": false, "metadata": {} }
-    ]
-  }
+  "$schema": "Delta/1.0",
+  "type": "object",
+  "properties": {
+    "type": "struct",
+    "fields": [
+      { "name": "asset_id", "type": "string", "nullable": false, "metadata": {} },
+      { "name": "asset_name", "type": "string", "nullable": false, "metadata": {} },
+      { "name": "location", "type": "string", "nullable": false, "metadata": {} },
+      { "name": "manufacturer", "type": "string", "nullable": false, "metadata": {} },
+      { "name": "production_date", "type": "string", "nullable": false, "metadata": {} },
+      { "name": "serial_number", "type": "string", "nullable": false, "metadata": {} },
+      { "name": "temperature", "type": "double", "nullable": false, "metadata": {} }
+    ]
+  }
 }
 ```
 
@@ -221,21 +221,13 @@ var opcuaSchemaContent = '''
 '''
 ```
 
-Then, define schema resource along with pointers to the existing Azure IoT Operation instance, custom location, and schema registry resources that you have from deploying Azure IoT Operations.
+Then, in the same file, just underneath the schema, define the schema resource along with pointers to the existing schema registry resource that you have from deploying Azure IoT Operations.
 
 ```bicep
 // Replace placeholder values with your actual resource names
-param customLocationName string = '<CUSTOM_LOCATION_NAME>'
-param aioInstanceName string = '<AIO_INSTANCE_NAME>'
 param schemaRegistryName string = '<SCHEMA_REGISTRY_NAME>'
 
 // Pointers to existing resources from AIO deployment
-resource customLocation 'Microsoft.ExtendedLocation/customLocations@2021-08-31-preview' existing = {
-  name: customLocationName
-}
-resource aioInstance 'Microsoft.IoTOperations/instances@2024-08-15-preview' existing = {
-  name: aioInstanceName
-}
 resource schemaRegistry 'Microsoft.DeviceRegistry/schemaRegistries@2024-09-01-preview' existing = {
   name: schemaRegistryName
 }
@@ -268,9 +260,7 @@ resource opcuaSchemaVersion 'Microsoft.DeviceRegistry/schemaRegistries/schemas/s
 After you've defined the schema content and resources, you can deploy the Bicep template to create the schema in the schema registry.
 
 ```azurecli
-az stack group create --name <DEPLOYMENT_NAME> --resource-group
-<RESOURCE_GROUP> --template-file <FILE>.bicep --dm None --aou
-deleteResources --yes
+az deployment group create --resource-group <RESOURCE_GROUP> --template-file <FILE>.bicep
 ```
 
 ## Next steps
````
