Data flows allow you to connect various data sources and perform data operations, simplifying the setup of data paths to move, transform, and enrich data. The data flow component is part of Azure IoT Operations, deployed as an Arc extension. The configuration for a data flow is done via Kubernetes Custom Resource Definitions (CRDs).
You can write configurations for various use cases, such as:
- Transform data and send it back to MQTT
- Transform data and send it to the cloud
- Send data to the cloud or edge without transformation
Data flows are not limited to the region where the Azure IoT Operations instance is deployed. You can use data flows to send data to any cloud endpoint in any region.

## Key features

### Data processing and routing
Data flows enable the ingestion, processing, and routing of messages to specified sinks. You can specify:
- **Sources:** Where messages are ingested from
- **Destinations:** Where messages are sent to
- **Transformations (optional):** Configuration for data processing operations
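As a minimal sketch of how these three parts fit together in a single resource, consider the following fragment. Note that the `apiVersion`, field names, and endpoint names here are illustrative placeholders, not the actual CRD schema:

```yaml
# Illustrative sketch only - field names are hypothetical, not the real schema.
apiVersion: example.iotoperations.azure.com/v1
kind: Dataflow
metadata:
  name: telemetry-to-cloud
spec:
  source:
    endpoint: mqtt-broker          # where messages are ingested from
    topic: sensors/+/telemetry
  transformations:                 # optional data processing operations
    - operation: scale
      property: temperature
      range: [0, 100]
  destination:
    endpoint: event-hubs           # where messages are sent to
    topic: factory-telemetry
```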
### Transformation capabilities
Transformations can be applied to data during the processing stage to perform various operations. These operations can include:
- **Standardize values:** Scale property values to a user-defined range
- **Contextualize data:** Add reference data to messages for enrichment and driving insights
### Configuration and deployment
The configuration is specified using Kubernetes CRDs. Based on this configuration, the data flow operator creates data flow instances, ensuring high availability and reliability.
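Because the configuration is an ordinary Kubernetes resource, it can be deployed and inspected with standard tooling. A sketch, assuming the specification is saved locally as `dataflow.yaml` and `kubectl` points at the Arc-enabled cluster (the resource kind and namespace shown are illustrative):

```shell
# Apply the configuration; the data flow operator reconciles it into instances.
kubectl apply -f dataflow.yaml

# Check the resulting resources (resource kind and namespace are illustrative).
kubectl get dataflows -n azure-iot-operations
```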
## Benefits
- **Scalable configuration:** Use Kubernetes CRDs for scalable and manageable configurations
By using data flows, you can efficiently manage your data paths, ensuring data is accurately sent, transformed, and enriched to meet your operational needs.
## Related content
- [Quickstart: Send asset telemetry to the cloud using a data flow](../get-started-end-to-end-sample/quickstart-upload-telemetry-to-cloud.md)
- [Create a data flow](howto-create-dataflow.md)
- [Create a data flow endpoint](howto-configure-dataflow-endpoint.md)