Commit 3f1b715 ("Formatting", 1 parent: fa391eb)
1 file changed: +25 / -25 lines

articles/iot-operations/connect-to-cloud/concept-dataflow-language.md

Lines changed: 25 additions & 25 deletions
@@ -87,24 +87,24 @@ The example maps:
 
 Field references show how to specify paths in the input and output, using dot notation like `Employee.DateOfBirth` or accessing data from a contextual dataset via `$context(position)`.
 
-## Contextualization Dataset Selectors
+## Contextualization dataset selectors
 
-These selectors allow mappings to integrate additional data from external databases, referred to as *Contextualization Datasets*.
+These selectors allow mappings to integrate additional data from external databases, referred to as *contextualization datasets*.
 
-## Record Filtering
+## Record filtering
 
 Record filtering involves setting conditions to select which records should be processed or dropped.
 
 ## Dot-notation
 
-Dot-notation is widely used in computer science to reference fields, even recursively. In programming, field names typically consist of letters and numbers, so a standard dot-notation might look like this:
+Dot-notation is widely used in computer science to reference fields, even recursively. In programming, field names typically consist of letters and numbers. A standard dot-notation might look like this:
 
 ```yaml
 - inputs:
   - Person.Address.Street.Number
 ```
 
-However, in dataflow a path described by dot-notation might include strings and some special characters without needing escaping:
+However, in a dataflow a path described by dot-notation might include strings and some special characters without needing escaping:
 
 ```yaml
 - inputs:
@@ -127,7 +127,7 @@ While dataflow parses a path, it treats only two characters as special:
 
 Any other characters are treated as part of the field name. This flexibility is useful in formats like JSON, where field names can be arbitrary strings.
 
-Note, however, that the path definition must also adhere to the rules of YAML: once a character with special meaning is included in the path, proper quoting is required in the configuration. Consult the YAML documentation for precise rules. Here are some examples that demonstrate the need for careful formatting:
+Note that the path definition must also adhere to the rules of YAML. Once a character with special meaning is included in the path, proper quoting is required in the configuration. Consult the YAML documentation for precise rules. Here are some examples that demonstrate the need for careful formatting:
 
 ```yaml
 - inputs:
@@ -146,7 +146,7 @@ The primary function of escaping in a dot-notated path is to accommodate the use
 
 In the previous example, the path consists of three segments: `Payload`, `Tag.10`, and `Value`. The outer single quotes (`'`) are necessary because of YAML syntax rules, allowing the inclusion of double quotes within the string.
 
-**Escaping Rules in Dot-Notation:**
+### Escaping rules in dot notation
 
 * **Escape Each Segment Separately**: If multiple segments contain dots, those segments must be enclosed in quotes. Other segments can also be quoted, but it doesn't affect the path interpretation:
 
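A short sketch of the per-segment escaping rule above, reusing the `Payload`, `Tag.10`, and `Value` segments from the article's example (the quoting choices shown are illustrative, not the only valid form):

```yaml
- inputs:
  # Only the segment containing a literal dot needs quotes;
  # quoting the other segments as well does not change the path.
  - 'Payload."Tag.10".Value'
  - '"Payload"."Tag.10"."Value"'   # resolves to the same three segments
```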
@@ -171,7 +171,7 @@ This example defines two fields in the dataDestination: `Payload` and `He said:
 
 In this case, the path is split into the segments `Payload`, `He said: "No`, and `It's done"` (starting with a space).
 
-**Segmentation Algorithm**:
+### Segmentation algorithm
 
 * If the first character of a segment is a quote, the parser searches for the next quote. The string enclosed between these quotes is considered a single segment.
 * If the segment doesn't start with a quote, the parser identifies segments by searching for the next dot or the end of the path.
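The two parsing rules above can be illustrated with paths already used in this article; the expected segmentation is shown in comments (a sketch, assuming the parser behaves exactly as described):

```yaml
- inputs:
  - Person.Address.Street.Number   # -> Person | Address | Street | Number
  - 'Payload."Tag.10".Value'       # -> Payload | Tag.10 | Value (the quoted segment keeps its dot)
```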
@@ -197,9 +197,9 @@ Here's how the asterisk (`*`) operates in this context:
 
 This configuration demonstrates the most generic form of mapping, where every field in the input is directly mapped to a corresponding field in the output without modification.
 
-Another example illustrates how wildcards can be used to match sub-sections and move them together. This example effectively flattens nested structures within a JSON object:
+Another example illustrates how wildcards can be used to match sub-sections and move them together. This example effectively flattens nested structures within a JSON object.
 
-*Original JSON:*
+Original JSON:
 
 ```json
 {
@@ -218,7 +218,7 @@ Another example illustrates how wildcards can be used to match sub-sections and
 }
 ```
 
-*Mapping Configuration Using Wildcards:*
+Mapping configuration using wildcards:
 
 ```yaml
 - inputs:
@@ -230,7 +230,7 @@ Another example illustrates how wildcards can be used to match sub-sections and
   output: *
 ```
 
-*Resulting JSON:*
+Resulting JSON:
 
 ```json
 {
@@ -256,7 +256,7 @@ When placing a wildcard, the following rules must be followed:
 * **In the Middle:** `path1.*.path3` - In this configuration, the asterisk matches any segment between `path1` and `path3`.
 * **At the End:** `path1.path2.*` - The asterisk at the end matches any segment that follows after `path1.path2`.
 
-### Multi-Input Wildcards
+### Multi-input wildcards
 
 *Original JSON:*
 
@@ -277,7 +277,7 @@ When placing a wildcard, the following rules must be followed:
 }
 ```
 
-*Mapping Configuration Using Wildcards:*
+Mapping configuration using wildcards:
 
 ```yaml
 - inputs:
@@ -287,7 +287,7 @@ When placing a wildcard, the following rules must be followed:
   conversion: ($1 + $2) / 2
 ```
 
-*Resulting JSON:*
+Resulting JSON:
 
 ```json
 {
@@ -303,7 +303,7 @@ If multi-input wildcards, the asterisk (`*`) must consistently represent the sam
 
 Consider this detailed example:
 
-**Original JSON:**
+Original JSON:
 
 ```json
 {
@@ -334,7 +334,7 @@ Consider this detailed example:
 }
 ```
 
-**Initial Mapping Configuration Using Wildcards:**
+Initial mapping configuration using wildcards:
 
 ```yaml
 - inputs:
@@ -357,7 +357,7 @@ This initial mapping tries to build an array (For example, for `Opacity`: `[0.88
 
 Since `Avg` and `Mean` are nested within `Mid`, the asterisk in the initial mapping doesn't correctly capture these paths.
 
-**Corrected Mapping Configuration:**
+Corrected mapping configuration:
 
 ```yaml
 - inputs:
@@ -371,9 +371,9 @@ Since `Avg` and `Mean` are nested within `Mid`, the asterisk in the initial mapp
 
 This revised mapping accurately captures the necessary fields by correctly specifying the paths to include the nested `Mid` object, ensuring that the asterisks work effectively across different levels of the JSON structure.
 
-### Second Rule vs. Specialization
+### Second rule versus specialization
 
-Using the example from Multi-Input Wildcards, consider the following mappings that generate two derived values for each property:
+Using the previous example from multi-input wildcards, consider the following mappings that generate two derived values for each property:
 
 ```yaml
 - inputs:
@@ -410,7 +410,7 @@ This mapping is intended to create two separate calculations (`Avg` and `Diff`)
 }
 ```
 
-Here, the second mapping definition on the same inputs acts as a `Second Rule` for mapping.
+Here, the second mapping definition on the same inputs acts as a *second rule* for mapping.
 
 Now, consider a scenario where a specific field needs a different calculation:
 
@@ -450,7 +450,7 @@ Consider a special case for the same fields to help deciding the right action:
 
 An empty `output` field in the second definition implies not writing the fields in the output record (effectively removing `Opacity`). This setup is more of a `Specialization` than a `Second Rule`.
 
-**Resolution of Overlapping Mappings by dataflow:**
+Resolution of overlapping mappings by dataflow:
 
 * The evaluation progresses from the top rule in the mapping definition.
 * If a new mapping resolves to the same fields as a previous rule, the following applies:
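As a hedged sketch of these resolution rules, a later, more specific rule can specialize an earlier generic one (the `Opacity` field and the empty-`output` form are taken from the surrounding example; the exact spelling of an empty output may differ):

```yaml
- inputs:
  - '*'
  output: '*'   # generic rule: copy every field through
- inputs:
  - Opacity
  output:       # empty output: the later rule specializes the earlier one and drops Opacity
```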
@@ -543,7 +543,7 @@ Conversions use simple math formulas similar to those learned in middle school.
 
 For more complex calculations, functions like `sqrt` (which finds the square root of a number) are also available.
 
-**Available arithmetic, comparison, and boolean operators grouped by precedence:**
+### Available arithmetic, comparison, and boolean operators grouped by precedence
 
 | Operator | Description |
 |----------|-------------|
@@ -654,7 +654,7 @@ The mapper is designed to be flexible, converting internal types into output typ
 
 While the automatic conversions generally operate as one might expect, based on common implementation practices, there are instances where the right conversion cannot be determined automatically, resulting in an `unsupported` error. To address these situations, several conversion functions are available to explicitly define how data should be transformed. These functions provide more control over how data is converted and ensure that data integrity is maintained even when automatic methods fall short.
 
-<!--**[FIXME - actually we don't have functions, need a list of what they usually want]**-->
+<!-- We don't have functions, need a list of what they usually want -->
 
 ### Using conversion formula with types
 
@@ -778,7 +778,7 @@ There are numerous other functions available in different categories:
 * string manipulation (for example, `uppercase()`)
 * explicit conversion (for example, `ISO8601_datetime`)
 * aggregation (for example, `avg()`)
-* **[FIXME we actually don't have much methods, need a list about what to implement]**
+<!-- minimal methods, need a list about what to implement -->
 
 ## Enrich data from contextualization datasets
 
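Building on the `$context(position)` reference shown earlier, a minimal enrichment mapping might look like the following sketch (the dataset key `position` and the field names are assumptions for illustration, not taken from the article):

```yaml
- inputs:
  - $context(position).BaseSalary   # field read from the contextualization dataset (assumed name)
  output: Employment.BaseSalary
```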