
Commit 31ea7bb

Merge pull request #284463 from PatAltimore/patricka-dss-reference
Add DSS reference to dataflows
2 parents 8204af8 + e7ff254

2 files changed: +9 −5 lines changed


articles/iot-operations/connect-to-cloud/concept-dataflow-enrich.md

Lines changed: 4 additions & 2 deletions
@@ -5,7 +5,7 @@ author: PatAltimore
 ms.author: patricka
 ms.subservice: azure-data-flows
 ms.topic: concept-article
-ms.date: 08/03/2024
+ms.date: 08/13/2024
 
 #CustomerIntent: As an operator, I want to understand how to create a dataflow to enrich data sent to endpoints.
 ---

@@ -31,7 +31,9 @@ For example, consider the following dataset with a few records, represented as J
 }
 ```
 
-The mapper accesses this dataset through the *distributed state store* (DSS) using a key value based on a *condition* specified in the mapping configuration.
+The mapper accesses the reference dataset stored in Azure IoT Operations's [distributed state store (DSS)](../create-edge-apps/concept-about-state-store-protocol.md) using a key value based on a *condition* specified in the mapping configuration.
+
+Key names in the distributed state store correspond to a dataset in the dataflow configuration.
 
 ```yaml
 datasets:
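To make the key-to-dataset correspondence described above concrete, the following is a hedged sketch; the key name `assetDataset` and the record fields are illustrative assumptions, not content from the changed files:

```yaml
# Illustrative sketch only: the DSS key and the dataflow's dataset key
# must match. Key name and record fields are hypothetical.
#
# Stored in the DSS under key "assetDataset" (JSON records), e.g.:
#   { "asset": "thermostat-14", "location": "building-2" }
datasets:
  - key: assetDataset   # same key name used when the records were written to the DSS
```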

articles/iot-operations/connect-to-cloud/howto-create-dataflow.md

Lines changed: 5 additions & 3 deletions
@@ -5,7 +5,7 @@ author: PatAltimore
 ms.author: patricka
 ms.subservice: azure-data-flows
 ms.topic: how-to
-ms.date: 08/03/2024
+ms.date: 08/13/2024
 
 #CustomerIntent: As an operator, I want to understand how to create a dataflow to connect data sources.
 ---

@@ -16,7 +16,7 @@ ms.date: 08/03/2024
 
 A dataflow is the path that data takes from the source to the destination with optional transformations. You can configure the dataflow using the Azure IoT Operations portal or by creating a *Dataflow* custom resource. Before creating a dataflow, you must [configure dataflow endpoints for the data sources and destinations](howto-configure-dataflow-endpoint.md).
 
-The following is an example of a dataflow configuration with an MQTT source endpoint, transformations, and a Kafka destination endpoint:
+The following example is a dataflow configuration with an MQTT source endpoint, transformations, and a Kafka destination endpoint:
 
 ```yaml
 apiVersion: connectivity.iotoperations.azure.com/v1beta1

@@ -128,13 +128,15 @@ spec:
 
 ### Enrich: Add reference data
 
-To enrich the data, you can use a reference dataset in Azure IoT Operations's distributed state store (DSS). The dataset is used to add extra data to the source data based on a condition. The condition is specified as a field in the source data that matches a field in the dataset.
+To enrich the data, you can use the reference dataset stored in Azure IoT Operations's [distributed state store (DSS)](../create-edge-apps/concept-about-state-store-protocol.md). The dataset is used to add extra data to the source data based on a condition. The condition is specified as a field in the source data that matches a field in the dataset.
 
 | Name | Description |
 |------------------------------------------------|-------------------------------------------|
 | builtInTransformationSettings.datasets.key | Dataset used for enrichment (key in DSS) |
 | builtInTransformationSettings.datasets.expression | Condition for the enrichment operation |
 
+Key names in the distributed state store correspond to a dataset in the dataflow configuration.
+
 For example, you could use the `deviceId` field in the source data to match the `asset` field in the dataset:
 
 ```yaml
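Based on the two fields the table above names (`builtInTransformationSettings.datasets.key` and `builtInTransformationSettings.datasets.expression`), a minimal enrichment entry might look like the following sketch. The dataset key and the exact expression syntax are assumptions for illustration, not taken from the shipped schema:

```yaml
# Hypothetical sketch: matches the source deviceId against the dataset's
# asset field, as in the example sentence above. The key name and the
# expression syntax are illustrative.
builtInTransformationSettings:
  datasets:
    - key: assetDataset   # DSS key holding the reference dataset
      expression: $source.deviceId == $context(assetDataset).asset
```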
