
Commit 7ddf33a

Merge pull request #290882 from dominicbetts/aio-rename-data-flow
AIO: Change dataflow to data flow
2 parents 4432620 + 3ee2fd2

File tree: 43 files changed, +564 -527 lines changed


articles/iot-operations/connect-to-cloud/concept-dataflow-conversions.md

Lines changed: 8 additions & 8 deletions
@@ -1,23 +1,23 @@
 ---
-title: Convert data by using dataflow conversions
-description: Learn about dataflow conversions for transforming data in Azure IoT Operations.
+title: Convert data by using data flow conversions
+description: Learn about data flow conversions for transforming data in Azure IoT Operations.
 author: PatAltimore
 ms.author: patricka
 ms.subservice: azure-data-flows
 ms.topic: concept-article
 ms.date: 11/11/2024

-#CustomerIntent: As an operator, I want to understand how to use dataflow conversions to transform data.
+#CustomerIntent: As an operator, I want to understand how to use data flow conversions to transform data.
 ms.service: azure-iot-operations
 ---

-# Convert data by using dataflow conversions
+# Convert data by using data flow conversions

 [!INCLUDE [kubernetes-management-preview-note](../includes/kubernetes-management-preview-note.md)]

-You can use dataflow conversions to transform data in Azure IoT Operations. The *conversion* element in a dataflow is used to compute values for output fields. You can use input fields, available operations, data types, and type conversions in dataflow conversions.
+You can use data flow conversions to transform data in Azure IoT Operations. The *conversion* element in a data flow is used to compute values for output fields. You can use input fields, available operations, data types, and type conversions in data flow conversions.

-The dataflow conversion element is used to compute values for output fields:
+The data flow conversion element is used to compute values for output fields:

 # [Bicep](#tab/bicep)

@@ -271,7 +271,7 @@ The `conversion` uses the `if` function that has three parameters:

 ## Available functions

-Dataflows provide a set of built-in functions that can be used in conversion formulas. These functions can be used to perform common operations like arithmetic, comparison, and string manipulation. The available functions are:
+Data flows provide a set of built-in functions that can be used in conversion formulas. These functions can be used to perform common operations like arithmetic, comparison, and string manipulation. The available functions are:

 | Function | Description | Examples |
 |----------|-------------|---------|
@@ -286,7 +286,7 @@ Dataflows provide a set of built-in functions that can be used in conversion for

 ### Conversion functions

-Dataflows provide several built-in conversion functions for common unit conversions like temperature, pressure, length, weight, and volume. Here are some examples:
+Data flows provide several built-in conversion functions for common unit conversions like temperature, pressure, length, weight, and volume. Here are some examples:

 | Conversion | Formula | Function name |
 | --- | --- | --- |
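
The table rows with the actual conversion-function names fall outside this hunk. As an illustrative sketch of the arithmetic such unit conversions perform (Python, not the data flow formula syntax):

```python
def celsius_to_fahrenheit(celsius: float) -> float:
    """Standard temperature conversion: multiply by 9/5, then add 32."""
    return celsius * 9.0 / 5.0 + 32.0


def hectopascal_to_psi(hectopascal: float) -> float:
    """Standard pressure conversion: 1 psi is approximately 68.948 hPa."""
    return hectopascal / 68.948


print(celsius_to_fahrenheit(25.0))   # 77.0
print(hectopascal_to_psi(1013.25))   # ~14.7 (one standard atmosphere)
```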

articles/iot-operations/connect-to-cloud/concept-dataflow-enrich.md

Lines changed: 5 additions & 5 deletions
@@ -1,17 +1,17 @@
 ---
-title: Enrich data by using dataflows
-description: Use contextualization datasets to enrich data in Azure IoT Operations dataflows.
+title: Enrich data by using data flows
+description: Use contextualization datasets to enrich data in Azure IoT Operations data flows.
 author: PatAltimore
 ms.author: patricka
 ms.subservice: azure-data-flows
 ms.topic: concept-article
 ms.date: 11/13/2024

-#CustomerIntent: As an operator, I want to understand how to create a dataflow to enrich data sent to endpoints.
+#CustomerIntent: As an operator, I want to understand how to create a data flow to enrich data sent to endpoints.
 ms.service: azure-iot-operations
 ---

-# Enrich data by using dataflows
+# Enrich data by using data flows

 [!INCLUDE [kubernetes-management-preview-note](../includes/kubernetes-management-preview-note.md)]

@@ -34,7 +34,7 @@ For example, consider the following dataset with a few records, represented as J
 }
 ```

-The mapper accesses the reference dataset stored in the Azure IoT Operations [state store](../create-edge-apps/concept-about-state-store-protocol.md) by using a key value based on a *condition* specified in the mapping configuration. Key names in the state store correspond to a dataset in the dataflow configuration.
+The mapper accesses the reference dataset stored in the Azure IoT Operations [state store](../create-edge-apps/concept-about-state-store-protocol.md) by using a key value based on a *condition* specified in the mapping configuration. Key names in the state store correspond to a dataset in the data flow configuration.

 # [Bicep](#tab/bicep)
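
Conceptually, the enrichment described above is a key-based join between the incoming message and a record from the state store. A minimal sketch in Python, with hypothetical dataset, key, and field names; the real lookup is driven by the *condition* in the data flow configuration, not by code like this:

```python
# Hypothetical reference dataset, keyed the way records might be stored in the
# state store; key names correspond to a dataset in the data flow configuration.
asset_dataset = {
    "asset-1": {"location": "building-27", "manufacturer": "Contoso"},
}


def enrich(message: dict) -> dict:
    """Join the incoming message with its reference record by key."""
    reference = asset_dataset.get(message.get("assetId"), {})
    return {**message, **reference}


print(enrich({"assetId": "asset-1", "temperature": 22.5}))
# {'assetId': 'asset-1', 'temperature': 22.5, 'location': 'building-27', 'manufacturer': 'Contoso'}
```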

articles/iot-operations/connect-to-cloud/concept-dataflow-mapping.md

Lines changed: 11 additions & 11 deletions
@@ -1,22 +1,22 @@
 ---
-title: Map data by using dataflows
-description: Learn about the dataflow mapping language for transforming data in Azure IoT Operations.
+title: Map data by using data flows
+description: Learn about the data flow mapping language for transforming data in Azure IoT Operations.
 author: PatAltimore
 ms.author: patricka
 ms.subservice: azure-data-flows
 ms.topic: concept-article
 ms.date: 11/11/2024
 ai-usage: ai-assisted

-#CustomerIntent: As an operator, I want to understand how to use the dataflow mapping language to transform data.
+#CustomerIntent: As an operator, I want to understand how to use the data flow mapping language to transform data.
 ms.service: azure-iot-operations
 ---

-# Map data by using dataflows
+# Map data by using data flows

 [!INCLUDE [kubernetes-management-preview-note](../includes/kubernetes-management-preview-note.md)]

-Use the dataflow mapping language to transform data in Azure IoT Operations. The syntax is a simple, yet powerful, way to define mappings that transform data from one format to another. This article provides an overview of the dataflow mapping language and key concepts.
+Use the data flow mapping language to transform data in Azure IoT Operations. The syntax is a simple, yet powerful, way to define mappings that transform data from one format to another. This article provides an overview of the data flow mapping language and key concepts.

 Mapping allows you to transform data from one format to another. Consider the following input record:

@@ -127,7 +127,7 @@ When you use MQTT or Kafka as a source or destination, you can access various me
 * **Topic**: Works for both MQTT and Kafka. It contains the string where the message was published. Example: `$metadata.topic`.
 * **User property**: In MQTT, this refers to the free-form key/value pairs an MQTT message can carry. For example, if the MQTT message was published with a user property with key "priority" and value "high", then the `$metadata.user_property.priority` reference hold the value "high". User property keys can be arbitrary strings and may require escaping: `$metadata.user_property."weird key"` uses the key "weird key" (with a space).
 * **System property**: This term is used for every property that is not a user property. Currently, only a single system property is supported: `$metadata.system_property.content_type`, which reads the content type property of the MQTT message (if set).
-* **Header**: This is the Kafka equivalent of the MQTT user property. Kafka can use any binary value for a key, but dataflow supports only UTF-8 string keys. Example: `$metadata.header.priority`. This functionality is similar to user properties.
+* **Header**: This is the Kafka equivalent of the MQTT user property. Kafka can use any binary value for a key, but data flows support only UTF-8 string keys. Example: `$metadata.header.priority`. This functionality is similar to user properties.

 #### Mapping metadata properties

@@ -269,7 +269,7 @@ inputs: [

 ---

-In a dataflow, a path described by dot notation might include strings and some special characters without needing escaping:
+In a data flow, a path described by dot notation might include strings and some special characters without needing escaping:

 # [Bicep](#tab/bicep)

@@ -309,7 +309,7 @@ inputs: [

 The previous example, among other special characters, contains dots within the field name. Without escaping, the field name would serve as a separator in the dot notation itself.

-While a dataflow parses a path, it treats only two characters as special:
+While a data flow parses a path, it treats only two characters as special:

 * Dots (`.`) act as field separators.
 * Single quotation marks, when placed at the beginning or the end of a segment, start an escaped section where dots aren't treated as field separators.
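
Those two rules are enough to sketch the path-splitting behavior. The following Python snippet is illustrative only; the actual parser is part of the data flow runtime, and the field names in the sample path are made up:

```python
def split_path(path: str) -> list[str]:
    """Split a dot-notation path into segments.

    Dots separate segments; a single quotation mark at the start of a segment
    opens an escaped section in which dots are kept literally until the
    closing quotation mark. (Simplified: every quote toggles the section.)
    """
    segments, current, escaped = [], "", False
    for char in path:
        if char == "'":
            escaped = not escaped      # enter or leave an escaped section
        elif char == "." and not escaped:
            segments.append(current)   # an unescaped dot ends the segment
            current = ""
        else:
            current += char
    segments.append(current)
    return segments


# The inner dots are not treated as separators inside the quoted segments.
print(split_path("'Payload.data'.tags.'telemetry.rate'"))
# ['Payload.data', 'tags', 'telemetry.rate']
```
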
@@ -850,13 +850,13 @@ Consider a special case for the same fields to help decide the right action:

 An empty `output` field in the second definition implies not writing the fields in the output record (effectively removing `Opacity`). This setup is more of a `Specialization` than a `Second Rule`.

-Resolution of overlapping mappings by dataflows:
+Resolution of overlapping mappings by data flows:

 * The evaluation progresses from the top rule in the mapping definition.
 * If a new mapping resolves to the same fields as a previous rule, the following conditions apply:
 * A `Rank` is calculated for each resolved input based on the number of segments the wildcard captures. For instance, if the `Captured Segments` are `Properties.Opacity`, the `Rank` is 2. If it's only `Opacity`, the `Rank` is 1. A mapping without wildcards has a `Rank` of 0.
-* If the `Rank` of the latter rule is equal to or higher than the previous rule, a dataflow treats it as a `Second Rule`.
-* Otherwise, the dataflow treats the configuration as a `Specialization`.
+* If the `Rank` of the latter rule is equal to or higher than the previous rule, a data flow treats it as a `Second Rule`.
+* Otherwise, the data flow treats the configuration as a `Specialization`.

 For example, the mapping that directs `Opacity.Max` and `Opacity.Min` to an empty output has a `Rank` of 0. Because the second rule has a lower `Rank` than the previous one, it's considered a specialization and overrides the previous rule, which would calculate a value for `Opacity`.
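
As a tiny worked sketch of those rules (Python, illustrative only):

```python
def rank(captured_segments: str | None) -> int:
    """Rank = number of segments the wildcard captured; 0 without a wildcard."""
    return 0 if not captured_segments else len(captured_segments.split("."))


def classify(previous_rank: int, latter_rank: int) -> str:
    """Equal or higher rank is a second rule; lower rank specializes the earlier rule."""
    return "Second Rule" if latter_rank >= previous_rank else "Specialization"


print(rank("Properties.Opacity"))                 # 2
print(rank("Opacity"))                            # 1
print(rank(None))                                 # 0 (no wildcard)
print(classify(previous_rank=1, latter_rank=0))   # Specialization
```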

articles/iot-operations/connect-to-cloud/concept-schema-registry.md

Lines changed: 8 additions & 8 deletions
@@ -1,6 +1,6 @@
 ---
 title: Understand message schemas
-description: Learn how schema registry handles message schemas to work with Azure IoT Operations components including dataflows.
+description: Learn how schema registry handles message schemas to work with Azure IoT Operations components including data flows.
 author: kgremban
 ms.author: kgremban
 ms.topic: conceptual
@@ -25,7 +25,7 @@ Schema registry expects the following required fields in a message schema:

 | Required field | Definition |
 | -------------- | ---------- |
-| `$schema` | Either `http://json-schema.org/draft-07/schema#` or `Delta/1.0`. In dataflows, JSON schemas are used for source endpoints and Delta schemas are used for destination endpoints. |
+| `$schema` | Either `http://json-schema.org/draft-07/schema#` or `Delta/1.0`. In data flows, JSON schemas are used for source endpoints and Delta schemas are used for destination endpoints. |
 | `type` | `Object` |
 | `properties` | The message definition. |
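
As a rough sketch, a minimal message schema carrying only those required fields could look like the following (built here as a Python dictionary and printed as JSON; the entries under `properties` are illustrative, and the type keyword is written lowercase as draft-07 JSON Schema spells it):

```python
import json

# A minimal sketch of a message schema containing only the required fields
# from the table above; the property names and types are illustrative.
schema = {
    "$schema": "http://json-schema.org/draft-07/schema#",
    "type": "object",
    "properties": {
        "temperature": {"type": "number"},
        "assetId": {"type": "string"},
    },
}

print(json.dumps(schema, indent=2))
```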

@@ -87,13 +87,13 @@ To generate the schema from a sample data file, use the [Schema Gen Helper](http

 For a tutorial that uses the schema generator, see [Tutorial: Send data from an OPC UA server to Azure Data Lake Storage Gen 2](./tutorial-opc-ua-to-data-lake.md).

-## How dataflows use message schemas
+## How data flows use message schemas

-Message schemas are used in all three phases of a dataflow: defining the source input, applying data transformations, and creating the destination output.
+Message schemas are used in all three phases of a data flow: defining the source input, applying data transformations, and creating the destination output.

 ### Input schema

-Each dataflow source can optionally specify a message schema. Currently, dataflows doesn't perform runtime validation on source message schemas.
+Each data flow source can optionally specify a message schema. Currently, data flows doesn't perform runtime validation on source message schemas.

 Asset sources have a predefined message schema that was created by the connector for OPC UA.

@@ -107,7 +107,7 @@ The operations experience uses the input schema as a starting point for your dat

 ### Output schema

-Output schemas are associated with dataflow destinations.
+Output schemas are associated with data flow destinations.

 In the operations experience portal, you can configure output schemas for the following destination endpoints that support Parquet output:

@@ -120,7 +120,7 @@ Note: The Delta schema format is used for both Parquet and Delta output.

 If you use Bicep or Kubernetes, you can configure output schemas using JSON output for MQTT and Kafka destination endpoints. MQTT- and Kafka-based destinations don't support Delta format.

-For these dataflows, the operations experience applies any transformations to the input schema then creates a new schema in Delta format. When the dataflow custom resource (CR) is created, it includes a `schemaRef` value that points to the generated schema stored in the schema registry.
+For these data flows, the operations experience applies any transformations to the input schema then creates a new schema in Delta format. When the data flow custom resource (CR) is created, it includes a `schemaRef` value that points to the generated schema stored in the schema registry.

 To upload an output schema, see [Upload schema](#upload-schema).

@@ -309,4 +309,4 @@ az deployment group create --resource-group <RESOURCE_GROUP> --template-file <FI

 ## Next steps

-- [Create a dataflow](howto-create-dataflow.md)
+- [Create a data flow](howto-create-dataflow.md)
