Commit 2a1f171

Merge branch 'release-aio-m3' of https://github.com/MicrosoftDocs/azure-docs-pr into patricka-doe-release-aio-m3
2 parents 6bc6911 + b9b8fd3 commit 2a1f171

24 files changed, with 1200 additions and 426 deletions

articles/iot-operations/connect-to-cloud/concept-dataflow-conversions.md

Lines changed: 100 additions & 8 deletions
@@ -5,7 +5,7 @@ author: PatAltimore
ms.author: patricka
ms.subservice: azure-data-flows
ms.topic: concept-article
-ms.date: 08/03/2024
+ms.date: 10/30/2024

#CustomerIntent: As an operator, I want to understand how to use dataflow conversions to transform data.
ms.service: azure-iot-operations
@@ -19,6 +19,19 @@ You can use dataflow conversions to transform data in Azure IoT Operations. The

The dataflow conversion element is used to compute values for output fields:

# [Bicep](#tab/bicep)

```bicep
inputs: [
  '*.Max' // - $1
  '*.Min' // - $2
]
output: 'ColorProperties.*'
expression: '($1 + $2) / 2'
```

# [Kubernetes](#tab/kubernetes)

```yaml
- inputs:
  - *.Max # - $1
@@ -27,6 +40,8 @@ The dataflow conversion element is used to compute values for output fields:
  expression: ($1 + $2) / 2
```

---

There are several aspects to understand about conversions:

* **Reference to input fields:** How to reference values from input fields in the conversion formula.
@@ -38,6 +53,21 @@ There are several aspects to understand about conversions:

In conversions, formulas can operate on static values, such as the number *25*, or on parameters derived from input fields. A mapping defines the input fields that the formula can access. Each field is referenced according to its order in the input list:

# [Bicep](#tab/bicep)

```bicep
inputs: [
  '*.Max' // - $1
  '*.Min' // - $2
  '*.Mid.Avg' // - $3
  '*.Mid.Mean' // - $4
]
output: 'ColorProperties.*'
expression: '($1, $2, $3, $4)'
```

# [Kubernetes](#tab/kubernetes)

```yaml
- inputs:
  - *.Max # - $1
@@ -48,6 +78,8 @@ In conversions, formulas can operate on static values like a number such as *25*
  expression: ($1, $2, $3, $4)
```

---

In this example, the conversion results in an array containing the values of `[Max, Min, Mid.Avg, Mid.Mean]`. The comments (`# - $1`, `# - $2` in YAML; `// - $1`, `// - $2` in Bicep) are optional, but they help to clarify the connection between each input field and its role in the conversion formula.

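
As a rough sketch of the result, assume the incoming record carries a single color group named `Saturation` (a hypothetical name, with hypothetical values) and that the segment captured by the `*` wildcard is reused in the `ColorProperties.*` output path. The output record would then contain an array along these lines:

```json
{
  "ColorProperties": {
    "Saturation": [0.92, 0.11, 0.51, 0.49]
  }
}
```
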
## Data types
@@ -125,32 +157,75 @@ Arrays can be processed by using aggregation functions to compute a single value

```json
{
  "Measurements": [2.34, 12.3, 32.4]
}
```

With the mapping:

# [Bicep](#tab/bicep)

```bicep
inputs: [
  'Measurements' // - $1
]
output: 'Measurement'
expression: 'min($1)'
```

# [Kubernetes](#tab/kubernetes)

```yaml
- inputs:
  - Measurements # - $1
  output: Measurement
  expression: min($1)
```

---

This configuration selects the smallest value from the `Measurements` array for the output field.
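
For instance, with the `Measurements` payload shown earlier, `min($1)` evaluates to the smallest of `2.34`, `12.3`, and `32.4`, so the mapped output field would be:

```json
{
  "Measurement": 2.34
}
```
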

It's also possible to use functions that result in a new array:

# [Bicep](#tab/bicep)

```bicep
inputs: [
  'Measurements' // - $1
]
output: 'Measurements'
expression: 'take($1, 10)' // take at most 10 items
```

# [Kubernetes](#tab/kubernetes)

```yaml
- inputs:
  - Measurements # - $1
  output: Measurements
  expression: take($1, 10) # take at most 10 items
```

---

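
With the three-element `Measurements` sample shown earlier, `take($1, 10)` leaves the array unchanged because it already has fewer than 10 items; a longer array would be reduced to at most 10 items. An illustrative result:

```json
{
  "Measurements": [2.34, 12.3, 32.4]
}
```
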
Arrays can also be created from multiple single values:

# [Bicep](#tab/bicep)

```bicep
inputs: [
  'minimum' // - - $1
  'maximum' // - - $2
  'average' // - - $3
  'mean' // - - $4
]
output: 'stats'
expression: '($1, $2, $3, $4)'
```

# [Kubernetes](#tab/kubernetes)

```yaml
- inputs:
  - minimum # - - $1
@@ -161,6 +236,8 @@ Arrays can also be created from multiple single values:
  expression: ($1, $2, $3, $4)
```

---

This mapping creates an array that contains the minimum, maximum, average, and mean.
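
For example, with hypothetical input values of `minimum: 2`, `maximum: 10`, `average: 6`, and `mean: 5.5`, the mapped output field would look like this:

```json
{
  "stats": [2, 10, 6, 5.5]
}
```
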

### Missing value
@@ -174,11 +251,11 @@ Example mapping that uses a missing value:

```json
{
  "Employment": {
    "Position": "Analyst",
    "BaseSalary": 75000,
    "WorkingHours": "Regular"
  }
}
```

@@ -194,14 +271,29 @@ The input record contains the `BaseSalary` field, but possibly that's optional.

A mapping can check if the field is present in the input record. If the field is found, the output receives that existing value. Otherwise, the output receives the value from the context dataset. For example:

# [Bicep](#tab/bicep)

```bicep
inputs: [
  'BaseSalary' // - - - - - - - - - - - $1
  '$context(position).BaseSalary' // - $2
]
output: 'BaseSalary'
expression: 'if($1 == (), $2, $1)'
```

# [Kubernetes](#tab/kubernetes)

```yaml
- inputs:
  - BaseSalary # - - - - - - - - - - $1
  - $context(position).BaseSalary # - $2
  output: BaseSalary
  expression: if($1 == (), $2, $1)
```

---

The `conversion` uses the `if` function that has three parameters:

* The first parameter is a condition. In the example, it checks if the `BaseSalary` field of the input record (aliased as `$1`) is the missing value.

articles/iot-operations/connect-to-cloud/concept-dataflow-enrich.md

Lines changed: 104 additions & 3 deletions
@@ -5,7 +5,7 @@ author: PatAltimore
ms.author: patricka
ms.subservice: azure-data-flows
ms.topic: concept-article
-ms.date: 08/13/2024
+ms.date: 10/30/2024

#CustomerIntent: As an operator, I want to understand how to create a dataflow to enrich data sent to endpoints.
ms.service: azure-iot-operations
@@ -34,20 +34,60 @@ For example, consider the following dataset with a few records, represented as J

The mapper accesses the reference dataset stored in the Azure IoT Operations [distributed state store (DSS)](../create-edge-apps/concept-about-state-store-protocol.md) by using a key value based on a *condition* specified in the mapping configuration. Key names in the DSS correspond to a dataset in the dataflow configuration.

# [Bicep](#tab/bicep)

```bicep
datasets: [
  {
    key: 'position',
    inputs: [
      '$source.Position' // - $1
      '$context.Position' // - $2
    ],
    expression: '$1 == $2'
  }
]
```

# [Kubernetes](#tab/kubernetes)

```yaml
datasets:
- key: position
  inputs:
  - $source.Position # - $1
  - $context.Position # - $2
  expression: $1 == $2
```

---

When a new record is being processed, the mapper performs the following steps:

* **Data request:** The mapper sends a request to the DSS to retrieve the dataset stored under the key `Position`.
* **Record matching:** The mapper then queries this dataset to find the first record where the `Position` field in the dataset matches the `Position` field of the incoming record (see the illustrative record sketch after this list).
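
The reference dataset itself isn't part of this change, but conceptually each record in the dataset stored under the `position` key is a small JSON object that carries the fields referenced by the mappings. An illustrative record (hypothetical values):

```json
{
  "Position": "Analyst",
  "BaseSalary": 70000,
  "WorkingHours": "Regular"
}
```
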

# [Bicep](#tab/bicep)

```bicep
{
  inputs: [
    '$context(position).WorkingHours' // - $1
  ]
  output: 'WorkingHours'
}
{
  inputs: [
    'BaseSalary' // - - - - - - - - - - - - $1
    '$context(position).BaseSalary' // - - $2
  ]
  output: 'BaseSalary'
  expression: 'if($1 == (), $2, $1)'
}
```

# [Kubernetes](#tab/kubernetes)

```yaml
- inputs:
  - $context(position).WorkingHours # - $1
@@ -60,10 +100,37 @@ When a new record is being processed, the mapper performs the following steps:
  expression: if($1 == (), $2, $1)
```

---

In this example, the `WorkingHours` field is added to the output record, while the `BaseSalary` is used conditionally only when the incoming record doesn't contain the `BaseSalary` field (or the value is `null` if it's a nullable field). The request for the contextualization data doesn't happen with every incoming record: the mapper requests the dataset, then works from a cached version of it and receives notifications from the DSS when the dataset changes.
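
As an illustrative sketch only (the values are hypothetical, and any other fields depend on the rest of the mapping, which isn't shown here), the two mapped fields of an enriched output record might look like this:

```json
{
  "BaseSalary": 75000,
  "WorkingHours": "Regular"
}
```
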

It's possible to use multiple datasets:

# [Bicep](#tab/bicep)

```bicep
datasets: [
  {
    key: 'position'
    inputs: [
      '$source.Position' // - $1
      '$context.Position' // - $2
    ],
    expression: '$1 == $2'
  }
  {
    key: 'permissions'
    inputs: [
      '$source.Position' // - $1
      '$context.Position' // - $2
    ],
    expression: '$1 == $2'
  }
]
```

# [Kubernetes](#tab/kubernetes)

```yaml
datasets:
- key: position
@@ -79,16 +146,48 @@ datasets:
  expression: $1 == $2
```

---

Then use references to both datasets together:

# [Bicep](#tab/bicep)

```bicep
inputs: [
  '$context(position).WorkingHours' // - $1
  '$context(permissions).NightShift' // - $2
]
```

# [Kubernetes](#tab/kubernetes)

```yaml
- inputs:
  - $context(position).WorkingHours # - - $1
  - $context(permission).NightShift # - - $2
```

---

The input references use the dataset key, such as `position` or `permission`. If the key in the DSS is inconvenient to use, you can define an alias:

# [Bicep](#tab/bicep)

```bicep
datasets: [
  {
    key: 'datasets.parag10.rule42 as position'
    inputs: [
      '$source.Position' // - $1
      '$context.Position' // - $2
    ],
    expression: '$1 == $2'
  }
]
```

# [Kubernetes](#tab/kubernetes)

```yaml
datasets:
- key: datasets.parag10.rule42 as position
@@ -98,4 +197,6 @@ datasets:
  expression: $1 == $2
```

---

The configuration renames the dataset with the key `datasets.parag10.rule42` to `position`, so the mapping inputs can reference it as `$context(position)`, as shown in the earlier examples.
