articles/iot-operations/connect-to-cloud/howto-create-dataflow.md: 11 additions, 11 deletions
@@ -536,7 +536,7 @@ sourceSettings:
---
- If the instance count in the [data flow profile](howto-configure-dataflow-profile.md) is greater than one, shared subscription is automatically enabled for all data flows that use a message broker source. In this case, the `$shared` prefix is added and the shared subscription group name automatically generated. For example, if you have a data flow profile with an instance count of 3, and your data flow uses a message broker endpoint as source configured with topics `topic1` and `topic2`, they are automatically converted to shared subscriptions as `$shared/<GENERATED_GROUP_NAME>/topic1` and `$shared/<GENERATED_GROUP_NAME>/topic2`.
+ If the instance count in the [data flow profile](howto-configure-dataflow-profile.md) is greater than one, shared subscription is automatically enabled for all data flows that use a message broker source. In this case, the `$shared` prefix is added and the shared subscription group name automatically generated. For example, if you have a data flow profile with an instance count of 3, and your data flow uses a message broker endpoint as source configured with topics `topic1` and `topic2`, they're automatically converted to shared subscriptions as `$shared/<GENERATED_GROUP_NAME>/topic1` and `$shared/<GENERATED_GROUP_NAME>/topic2`.
You can explicitly create a topic named `$shared/mygroup/topic` in your configuration. However, adding the `$shared` topic explicitly isn't recommended since the `$shared` prefix is automatically added when needed. Data flows can make optimizations with the group name if it isn't set. For example, `$share` isn't set and data flows only has to operate over the topic name.
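Reviewer note, for illustration only (not part of this change): a minimal sketch of the conversion described above, assuming the Kubernetes (preview) source settings format used elsewhere in this article and an endpoint named `default`:

```yaml
# Sketch only: endpoint name is assumed; field names follow the article's Kubernetes (preview) examples.
sourceSettings:
  endpointRef: default
  dataSources:
    - topic1
    - topic2
# With a data flow profile instance count of 3, the effective subscriptions become:
#   $shared/<GENERATED_GROUP_NAME>/topic1
#   $shared/<GENERATED_GROUP_NAME>/topic2
# (<GENERATED_GROUP_NAME> is auto-generated; don't add the $shared prefix yourself.)
```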
@@ -545,7 +545,7 @@ You can explicitly create a topic named `$shared/mygroup/topic` in your configur
#### Kafka topics
- When the source is a Kafka (Event Hubs included) endpoint, specify the individual Kafka topics to subscribe to for incoming messages. Wildcards are not supported, so you must specify each topic statically.
+ When the source is a Kafka (Event Hubs included) endpoint, specify the individual Kafka topics to subscribe to for incoming messages. Wildcards aren't supported, so you must specify each topic statically.
> [!NOTE]
> When using Event Hubs via the Kafka endpoint, each individual event hub within the namespace is the Kafka topic. For example, if you have an Event Hubs namespace with two event hubs, `thermostats` and `humidifiers`, you can specify each event hub as a Kafka topic.
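Reviewer note, for illustration only (not part of this change): a hedged sketch of a source that lists each event hub as a Kafka topic, in the Kubernetes (preview) format; the endpoint name `eventhubs-source` is an assumption:

```yaml
sourceSettings:
  endpointRef: eventhubs-source   # assumed name of a Kafka/Event Hubs endpoint
  dataSources:
    # Each event hub in the namespace is addressed as its own Kafka topic; wildcards aren't supported.
    - thermostats
    - humidifiers
```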
@@ -631,7 +631,7 @@ In operations experience data flow **Source details**, select **Message broker**
# [Bicep](#tab/bicep)
- Once you have used the [schema registry to store the schema](concept-schema-registry.md), you can reference it in the data flow configuration.
+ Once you use the [schema registry to store the schema](concept-schema-registry.md), you can reference it in the data flow configuration.
```bicep
sourceSettings: {
@@ -642,7 +642,7 @@ sourceSettings: {
# [Kubernetes (preview)](#tab/kubernetes)
- Once you have used the [schema registry to store the schema](concept-schema-registry.md), you can reference it in the data flow configuration.
+ Once you use the [schema registry to store the schema](concept-schema-registry.md), you can reference it in the data flow configuration.
```yaml
sourceSettings:
@@ -656,20 +656,20 @@ To learn more, see [Understand message schemas](concept-schema-registry.md).
## Request disk persistence (preview)
- Request disk persistence allows data flows to maintain state across restarts. When you enable this feature, the graph can recover processing state if connected broker restarts. This feature is useful for stateful processing scenarios where losing intermediate data would be problematic. When you enable request disk persistence, the broker persists the MQTT data, like messages in the subscriber queue, to disk. This approach ensures that your data flow's data source doesn't experience data loss during power outages or broker restarts. The broker maintains optimal performance because persistence is configured per data flow, so only the data flows that need persistence use this feature.
+ Request disk persistence lets data flows keep state across restarts. When you enable this feature, the graph recovers processing state if the connected broker restarts. This feature is useful for stateful processing scenarios where losing intermediate data is a problem. When you enable request disk persistence, the broker persists the MQTT data, like messages in the subscriber queue, to disk. This approach makes sure your data flow's data source doesn't lose data during power outages or broker restarts. The broker maintains optimal performance because persistence is configured per data flow, so only the data flows that need persistence use this feature.
- The data flow graph makes this persistence request during subscription using an MQTTv5 user property. This feature only works when:
+ The data flow graph requests this persistence during subscription by using an MQTTv5 user property. This feature works only when:
- - The data flow uses the MQTT broker or asset as source
+ - The data flow uses the MQTT broker or asset as the source
- The MQTT broker has persistence enabled with dynamic persistence mode set to `Enabled` for the data type, like subscriber queues
- This configuration allows MQTT clients like data flows to request disk persistence for their subscriptions using MQTTv5 user properties. For detailed MQTT broker persistence configuration, see [Configure MQTT broker persistence](../manage-mqtt-broker/howto-broker-persistence.md).
+ This configuration lets MQTT clients like data flows request disk persistence for their subscriptions by using MQTTv5 user properties. For details about MQTT broker persistence configuration, see [Configure MQTT broker persistence](../manage-mqtt-broker/howto-broker-persistence.md).
- The setting accepts `Enabled` or `Disabled`, with `Disabled` as the default.
+ The setting accepts `Enabled` or `Disabled`. `Disabled` is the default.
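Reviewer note, purely hypothetical and not confirmed by this change: one way the setting could surface in the Kubernetes (preview) source settings. The property name `requestDiskPersistence` and the endpoint name are assumptions for illustration only.

```yaml
sourceSettings:
  endpointRef: default                 # assumed MQTT broker endpoint name
  requestDiskPersistence: Enabled      # hypothetical property name; accepts Enabled or Disabled (default Disabled)
  dataSources:
    - thermostats/+/telemetry          # example topic; persistence applies to this subscription
```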
# [Operations experience](#tab/portal)
- When creating or editing a data flow, select **Edit**, then check **Yes** next to **Request data persistence**.
+ When you create or edit a data flow, select **Edit**, and then select **Yes** next to **Request data persistence**.
# [Azure CLI](#tab/cli)
@@ -1522,7 +1522,7 @@ For MQTT endpoints, you can use dynamic topic variables in the `dataDestination`
For example, `processed/factory/${inputTopic.2}` routes messages from `factory/1/data` to `processed/factory/1`. Topic segments are 1-indexed, and leading/trailing slashes are ignored.
- If a topic variable cannot be resolved (for example, `${inputTopic.5}` when the input topic only has 3 segments), the message is dropped and a warning is logged. Wildcard characters (`#` and `+`) are not allowed in destination topics.
+ If a topic variable cannot be resolved (for example, `${inputTopic.5}` when the input topic only has three segments), the message is dropped and a warning is logged. Wildcard characters (`#` and `+`) aren't allowed in destination topics.
> [!NOTE]
> The characters `$`, `{`, and `}` are valid in MQTT topic names, so a topic like `factory/$inputTopic.2` is acceptable but incorrect if you intended to use the dynamic topic variable.
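Reviewer note, for illustration only (not part of this change): a minimal destination sketch using the dynamic topic variable from the context line above, in the Kubernetes (preview) format; the endpoint name is assumed:

```yaml
destinationSettings:
  endpointRef: default                                 # assumed MQTT broker endpoint name
  dataDestination: processed/factory/${inputTopic.2}   # second segment of the input topic (1-indexed)
```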
articles/iot-operations/connect-to-cloud/howto-dataflow-graph-wasm.md: 5 additions, 5 deletions
@@ -210,11 +210,11 @@ For detailed instructions, see [Assign Azure roles using the Azure portal](/azur
## Example 1: Basic deployment with one WASM module
- This example converts temperature data from Fahrenheit to Celsius using a WASM module. The [temperature module source code](https://github.com/Azure-Samples/explore-iot-operations/tree/wasm/samples/wasm/operators/temperature) is available on GitHub. Use the precompiled version `graph-simple:1.0.0` that you pushed to your container registry.
+ This example converts temperature data from Fahrenheit to Celsius by using a WASM module. The [temperature module source code](https://github.com/Azure-Samples/explore-iot-operations/tree/wasm/samples/wasm/operators/temperature) is available on GitHub. Use the precompiled version `graph-simple:1.0.0` that you pushed to your container registry.
### How it works
- The [graph definition](https://github.com/Azure-Samples/explore-iot-operations/blob/wasm/samples/wasm/graph-simple.yaml) creates a simple three-stage pipeline:
+ The [graph definition](https://github.com/Azure-Samples/explore-iot-operations/blob/wasm/samples/wasm/graph-simple.yaml) creates a simple, three-stage pipeline:
1. **Source**: Receives temperature data from MQTT
2. **Map**: Processes data with the temperature WASM module
@@ -229,7 +229,7 @@ operations:
module: "temperature:1.0.0"
```
- The [temperature module](https://github.com/Azure-Samples/explore-iot-operations/blob/wasm/samples/wasm/operators/temperature/src/lib.rs) converts Fahrenheit to Celsius using the standard formula `(F - 32) × 5/9 = C`:
+ The [temperature module](https://github.com/Azure-Samples/explore-iot-operations/blob/wasm/samples/wasm/operators/temperature/src/lib.rs) converts Fahrenheit to Celsius by using the standard formula `(F - 32) × 5/9 = C`:
```rust
if measurement.unit == MeasurementTemperatureUnit::Fahrenheit {
@@ -257,10 +257,10 @@ This configuration defines three nodes that implement the temperature conversion
The data flow graph resource "wraps" the graph definition artifact and connects its abstract source/sink operations to concrete endpoints:
- The graph definition's `source` operation connects to the data flow's source node (MQTT topic)
- - The graph definition's `sink` operation connects to the data flow's destination node (MQTT topic)
+ - The graph definition's `sink` operation connects to the data flow's destination node (MQTT topic)
- The graph definition's processing operations run within the graph processing node
- This separation allows the same graph definition to be deployed with different endpoints across environments while keeping the processing logic unchanged.
+ This separation lets you deploy the same graph definition with different endpoints across environments while keeping the processing logic unchanged.
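Reviewer note, a hedged sketch of that separation (not part of this change): the graph definition keeps source and sink abstract, and the data flow graph resource binds them to concrete MQTT topics at deployment time. Only `operations:` and `module:` appear in the diff above; the other field and operation names here are illustrative assumptions, not the exact schema.

```yaml
# Hypothetical graph definition sketch, loosely modeled on graph-simple.yaml.
operations:
  - name: source                      # bound by the data flow to its source node (MQTT topic)
  - name: convert-temperature
    module: "temperature:1.0.0"       # processing logic packaged as a WASM module
  - name: sink                        # bound by the data flow to its destination node (MQTT topic)
# The same artifact can be reused across environments; only the data flow's endpoints change.
```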
articles/iot-operations/connect-to-cloud/howto-develop-wasm-modules.md: 12 additions, 12 deletions
@@ -18,7 +18,7 @@ ai-usage: ai-assisted
>
> See the [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/) for legal terms that apply to Azure features that are in beta, preview, or not yet released into general availability.
- This article shows you how to develop custom WebAssembly (WASM) modules and graph definitions for Azure IoT Operations data flow graphs. You can create modules in Rust or Python to implement custom processing logic. You can also define graph configurations that specify how your modules connect into complete processing workflows.
+ This article shows you how to develop custom WebAssembly (WASM) modules and graph definitions for Azure IoT Operations data flow graphs. Create modules in Rust or Python to implement custom processing logic. Define graph configurations that specify how your modules connect into complete processing workflows.
## Overview
@@ -42,9 +42,9 @@ Data flow graphs build on the [Timely dataflow](https://docs.rs/timely/latest/ti
### Why timely dataflow?
- Traditional stream processing systems face challenges with several issues. Out-of-order data can cause events to arrive later than expected. Partial results make it difficult to know when computations are complete. Coordination issues arise when synchronizing distributed processing.
+ Traditional stream processing systems have several challenges. Out-of-order data means events can arrive later than expected. Partial results make it hard to know when computations finish. Coordination issues happen when synchronizing distributed processing.

- Timely dataflow solves problems through:
+ Timely dataflow solves these problems through:
#### Timestamps and progress tracking
@@ -89,10 +89,10 @@ Operators are the fundamental processing units based on [Timely dataflow operato
Modules are the implementation of operator logic as WASM code. A single module can implement multiple operator types. For example, a temperature module might provide:
- - A map operator for unit conversion
- - A filter operator for threshold checking
- - A branch operator for routing decisions
- - An accumulate operator for statistical aggregation
+ - A map operator for unit conversion.
+ - A filter operator for threshold checking.
+ - A branch operator for routing decisions.
+ - An accumulate operator for statistical aggregation.
### The relationship
@@ -206,7 +206,7 @@ source ~/.bashrc
# [Python](#tab/python)
- Python development uses componentize-py with WebAssembly Interface Types (WIT) for code generation. No other environment configuration is required beyond installing the prerequisites.
+ Python development uses componentize-py with WebAssembly Interface Types (WIT) for code generation. You don't need any other environment configuration beyond installing the prerequisites.

- Python WASM development doesn't use a traditional SDK. Instead, you work with generated bindings from WebAssembly Interface Types (WIT). These bindings provide:
+ Python WASM development doesn't use a traditional SDK. Instead, you use generated bindings from WebAssembly Interface Types (WIT). These bindings give you:
Typed interfaces for operators:
```python
@@ -560,7 +560,7 @@ The Python examples demonstrate working implementations that show the complete s
## Graph definitions and WASM integration
- Graph definitions are central to WASM development as they define how your modules connect into processing workflows. Understanding the relationship between graph definitions and data flow graphs is essential for effective development.
+ Graph definitions are central to WASM development because they define how your modules connect to processing workflows. Understanding the relationship between graph definitions and data flow graphs helps you develop effectively.
### Graph definition structure
@@ -613,7 +613,7 @@ For working examples, see:
### How graph definitions become data flows
- The relationship between graph definitions and Azure IoT Operations data flow graphs works as follows:
+ Here's how graph definitions and Azure IoT Operations data flow graphs relate:
- **Graph definition artifact**: Your YAML file defines the internal processing logic with source/sink operations as abstract endpoints
- **WASM modules**: Referenced modules implement the actual processing operators