
Commit 8812a6b

Dynamic destination topic
1 parent a7caa9b commit 8812a6b

File tree

2 files changed: +75 -2 lines changed


articles/iot-operations/connect-to-cloud/howto-create-dataflow.md

Lines changed: 74 additions & 1 deletion
@@ -1291,7 +1291,7 @@ Similar to data sources, data destination is a concept that is used to keep the
 
 | Endpoint type | Data destination meaning | Description |
 | - | - | - |
-| MQTT (or Event Grid) | Topic | The MQTT topic where the data is sent. Only static topics are supported, no wildcards. |
+| MQTT (or Event Grid) | Topic | The MQTT topic where the data is sent. Supports both static topics and dynamic topic translation using variables like `${inputTopic}` and `${inputTopic.index}`. For more information, see [Dynamic destination topics](#dynamic-destination-topics). |
 | Kafka (or Event Hubs) | Topic | The Kafka topic where the data is sent. Only static topics are supported, no wildcards. If the endpoint is an Event Hubs namespace, the data destination is the individual event hub within the namespace. |
 | Azure Data Lake Storage | Container | The container in the storage account. Not the table. |
 | Microsoft Fabric OneLake | Table or Folder | Corresponds to the configured [path type for the endpoint](howto-configure-fabric-endpoint.md#onelake-path-type). |
@@ -1337,6 +1337,18 @@ Or, if you have custom event hub endpoint, the configuration would look like:
   }
 }
 ```
+
+For MQTT endpoints, you can also use dynamic topic variables. For example, to route messages from `factory/1/data` to `processed/factory/1`:
+
+```json
+{
+  "destinationSettings": {
+    "endpointRef": "default",
+    "dataDestination": "processed/factory/${inputTopic.2}"
+  }
+}
+```
+
 # [Bicep](#tab/bicep)
 
 The syntax is the same for all data flow endpoints:
@@ -1375,6 +1387,15 @@ destinationSettings: {
 }
 ```
 
+For MQTT endpoints, you can also use dynamic topic variables:
+
+```bicep
+destinationSettings: {
+  endpointRef: 'default'
+  dataDestination: 'processed/factory/${inputTopic.2}'
+}
+```
+
 # [Kubernetes (preview)](#tab/kubernetes)
 
 The syntax is the same for all data flow endpoints:
@@ -1409,8 +1430,30 @@ destinationSettings:
   dataDestination: my-container
 ```
 
+For MQTT endpoints, you can also use dynamic topic variables:
+
+```yaml
+destinationSettings:
+  endpointRef: default
+  dataDestination: processed/factory/${inputTopic.2}
+```
+
 ---
 
+### Dynamic destination topics
+
+For MQTT endpoints, you can use dynamic topic variables in the `dataDestination` field to route messages based on the source topic structure. The following variables are available:
+
+- `${inputTopic}` - The full original input topic
+- `${inputTopic.index}` - A segment of the input topic (index starts at 1)
+
+For example, `processed/factory/${inputTopic.2}` routes messages from `factory/1/data` to `processed/factory/1`. Topic segments are 1-indexed, and leading/trailing slashes are ignored.
+
+If a topic variable cannot be resolved (for example, `${inputTopic.5}` when the input topic only has 3 segments), the message is dropped and a warning is logged. Wildcard characters (`#` and `+`) are not allowed in destination topics.
+
+> [!NOTE]
+> The characters `$`, `{`, and `}` are valid in MQTT topic names, so a topic like `factory/$inputTopic.2` (missing the braces) is accepted as a literal topic name rather than interpreted as a dynamic topic variable.
+
 ## Example
 
 The following example is a data flow configuration that uses the MQTT endpoint for the source and destination. The source filters the data from the MQTT topic `azure-iot-operations/data/thermostat`. The transformation converts the temperature to Fahrenheit and filters the data where the temperature multiplied by the humidity is less than 100000. The destination sends the data to the MQTT topic `factory`.
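The resolution rules added in the hunk above (segments are 1-indexed, leading/trailing slashes are trimmed, unresolvable variables drop the message) can be sketched as follows. This is an illustrative helper under those stated rules, not the actual Azure IoT Operations implementation; the function name `resolve_destination` is invented for this sketch.

```python
import re

def resolve_destination(template: str, input_topic: str):
    """Expand ${inputTopic} and ${inputTopic.N} in a destination template.

    Returns None when a variable cannot be resolved, mirroring the documented
    drop-and-warn behavior. Sketch only, not AIO source code.
    """
    # Leading/trailing slashes are ignored; segments are addressed 1-indexed.
    segments = input_topic.strip("/").split("/")

    def substitute(match):
        index = match.group(1)
        if index is None:
            return input_topic  # ${inputTopic}: the full original topic
        i = int(index)
        if not 1 <= i <= len(segments):
            raise LookupError(f"no segment {i} in {input_topic!r}")
        return segments[i - 1]

    try:
        return re.sub(r"\$\{inputTopic(?:\.(\d+))?\}", substitute, template)
    except LookupError:
        return None  # unresolved variable: message would be dropped

print(resolve_destination("processed/factory/${inputTopic.2}", "factory/1/data"))
# processed/factory/1
print(resolve_destination("processed/${inputTopic.5}", "factory/1/data"))
# None
```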
@@ -1504,6 +1547,35 @@ Here's an example command to create or update a data flow using the default data
 az iot ops dataflow apply --resource-group myResourceGroup --instance myAioInstance --profile default --name data-flow --config-file ~/data-flow.json
 ```
 
+Here's another example using dynamic topic translation to route messages from different thermostats to device-specific topics:
+
+```json
+{
+  "mode": "Enabled",
+  "operations": [
+    {
+      "operationType": "Source",
+      "sourceSettings": {
+        "dataSources": [
+          "thermostats/+/sensor/temperature"
+        ],
+        "endpointRef": "default",
+        "serializationFormat": "Json"
+      }
+    },
+    {
+      "destinationSettings": {
+        "dataDestination": "processed/device/${inputTopic.2}/temperature",
+        "endpointRef": "default"
+      },
+      "operationType": "Destination"
+    }
+  ]
+}
+```
+
+This configuration processes messages from `thermostats/device1/sensor/temperature` and sends them to `processed/device/device1/temperature`.
+
 # [Bicep](#tab/bicep)
 
 ```bicep
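The thermostat example above pairs an MQTT `+` wildcard source with a dynamic destination: any topic matching `thermostats/+/sensor/temperature` is routed to a destination built from segment 2 of the input topic. A minimal sketch of that pairing, assuming standard single-level `+` wildcard semantics (helper names are invented for illustration):

```python
def matches_filter(topic_filter, topic):
    """Minimal MQTT topic-filter match supporting only the `+` wildcard."""
    fparts = topic_filter.split("/")
    tparts = topic.split("/")
    return len(fparts) == len(tparts) and all(
        f == "+" or f == t for f, t in zip(fparts, tparts)
    )

def destination_for(topic):
    # ${inputTopic.2} -> segment 2 (1-indexed) is the device ID
    segments = topic.strip("/").split("/")
    return f"processed/device/{segments[1]}/temperature"

for t in ["thermostats/device1/sensor/temperature",
          "thermostats/device2/sensor/temperature"]:
    if matches_filter("thermostats/+/sensor/temperature", t):
        print(t, "->", destination_for(t))
# thermostats/device1/sensor/temperature -> processed/device/device1/temperature
# thermostats/device2/sensor/temperature -> processed/device/device2/temperature
```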
@@ -1714,6 +1786,7 @@ To ensure the data flow is working as expected, verify the following:
 - When using Event Hubs as the source, each event hub in the namespace is a separate Kafka topic and must be specified as the data source.
 - Transformation, if used, is configured with proper syntax, including proper [escaping of special characters](./concept-dataflow-mapping.md#escaping).
 - When using storage type endpoints as destination, a [schema is specified](#serialize-data-according-to-a-schema).
+- When using dynamic destination topics for MQTT endpoints, ensure that topic variables reference valid segments.
 
 ## Next steps
 
articles/iot-operations/connect-to-cloud/overview-dataflow.md

Lines changed: 1 addition & 1 deletion
@@ -32,7 +32,7 @@ Here are the key features of data flows.
 Data flows enable the ingestion, processing, and routing of the messages to specified sinks. You can specify:
 
 - **Sources**: Where messages are ingested from
-- **Destinations**: Where messages are drained to
+- **Destinations**: Where messages are drained to, including support for dynamic topic routing based on the input topic for MQTT endpoints
 - **Transformations (optional)**: Configuration for data processing operations
 
 ### Transformation capabilities
