
Commit d8d496e

Merge pull request #288959 from MicrosoftDocs/release-aio-ga
AIO [GA] - staging PR
2 parents e2afdf1 + 734c2d1 commit d8d496e

File tree

260 files changed: +7817 -2874 lines changed


articles/iot-operations/.openpublishing.redirection.iot-operations.json

Lines changed: 22 additions & 2 deletions
@@ -182,8 +182,8 @@
     },
     {
       "source_path_from_root": "/articles/iot-operations/manage-mqtt-connectivity/overview-iot-mq.md",
-      "redirect_url": "/azure/iot-operations/manage-mqtt-broker/overview-iot-mq",
-      "redirect_document_id": true
+      "redirect_url": "/azure/iot-operations/manage-mqtt-broker/overview-broker",
+      "redirect_document_id": false
     },
     {
       "source_path_from_root": "/articles/iot-operations/manage-mqtt-connectivity/howto-test-connection.md",
@@ -524,6 +524,26 @@
       "source_path_from_root": "/articles/iot-operations/get-started-end-to-end-sample/quickstart-upload-telemetry-to-cloud.md",
       "redirect_url": "/azure/iot-operations/get-started-end-to-end-sample/quickstart-configure",
       "redirect_document_id": false
+    },
+    {
+      "source_path_from_root": "/articles/iot-operations/manage-mqtt-broker/overview-iot-mq.md",
+      "redirect_url": "/azure/iot-operations/manage-mqtt-broker/overview-broker",
+      "redirect_document_id": true
+    },
+    {
+      "source_path_from_root": "/articles/iot-operations/reference/glossary.md",
+      "redirect_url": "/azure/iot/iot-glossary?toc=/azure/iot-operations/toc.json&bc=/azure/iot-operations/breadcrumb/toc.json",
+      "redirect_document_id": false
+    },
+    {
+      "source_path_from_root": "/articles/iot-operations/reference/observability-metrics-mq.md",
+      "redirect_url": "/azure/iot-operations/reference/observability-metrics-mqtt-broker",
+      "redirect_document_id": false
+    },
+    {
+      "source_path_from_root": "/articles/iot-operations/view-analyze-telemetry/tutorial-real-time-dashboard-fabric.md",
+      "redirect_url": "/azure/iot-operations/end-to-end-tutorials/tutorial-add-assets",
+      "redirect_document_id": false
     }
   ]
 }

articles/iot-operations/configure-observability-monitoring/howto-clean-up-observability-resources.md

Lines changed: 1 addition & 1 deletion
@@ -4,7 +4,7 @@ description: How to clean up shared and data collection observability resources
 author: kgremban
 ms.author: kgremban
 ms.topic: how-to
-ms.date: 02/27/2024
+ms.date: 10/22/2024
 
 # CustomerIntent: As an IT admin or operator, I want to be able to clean up and remove
 # observability resources installed on my cluster, without removing the cluster.

articles/iot-operations/configure-observability-monitoring/howto-configure-observability.md

Lines changed: 3 additions & 5 deletions
@@ -13,18 +13,16 @@ ms.date: 10/22/2024
 
 # Deploy observability resources and set up logs
 
-[!INCLUDE [public-preview-note](../includes/public-preview-note.md)]
-
 Observability provides visibility into every layer of your Azure IoT Operations configuration. It gives you insight into the actual behavior of issues, which increases the effectiveness of site reliability engineering. Azure IoT Operations offers observability through custom curated Grafana dashboards that are hosted in Azure. These dashboards are powered by Azure Monitor managed service for Prometheus and by Container Insights. This guide shows you how to set up Azure Managed Prometheus and Grafana and enable monitoring for your Azure Arc cluster.
 
 Complete the steps in this article *before* deploying Azure IoT Operations to your cluster.
 
 ## Prerequisites
 
 * An Arc-enabled Kubernetes cluster.
-* Azure CLI installed on your development machine. For instructions, see [How to install the Azure CLI](/cli/azure/install-azure-cli).
-* Helm installed on your development machine. For instructions, see [Install Helm](https://helm.sh/docs/intro/install/).
-* Kubectl installed on your development machine. For instructions, see [Install Kubernetes tools](https://kubernetes.io/docs/tasks/tools/).
+* Azure CLI installed on your cluster machine. For instructions, see [How to install the Azure CLI](/cli/azure/install-azure-cli).
+* Helm installed on your cluster machine. For instructions, see [Install Helm](https://helm.sh/docs/intro/install/).
+* Kubectl installed on your cluster machine. For instructions, see [Install Kubernetes tools](https://kubernetes.io/docs/tasks/tools/).
 
 ## Create resources in Azure

articles/iot-operations/connect-to-cloud/concept-dataflow-conversions.md

Lines changed: 25 additions & 57 deletions
@@ -5,15 +5,15 @@ author: PatAltimore
 ms.author: patricka
 ms.subservice: azure-data-flows
 ms.topic: concept-article
-ms.date: 10/30/2024
+ms.date: 11/11/2024
 
 #CustomerIntent: As an operator, I want to understand how to use dataflow conversions to transform data.
 ms.service: azure-iot-operations
 ---
 
 # Convert data by using dataflow conversions
 
-[!INCLUDE [public-preview-note](../includes/public-preview-note.md)]
+[!INCLUDE [kubernetes-management-preview-note](../includes/kubernetes-management-preview-note.md)]
 
 You can use dataflow conversions to transform data in Azure IoT Operations. The *conversion* element in a dataflow is used to compute values for output fields. You can use input fields, available operations, data types, and type conversions in dataflow conversions.
 
@@ -30,7 +30,7 @@ output: 'ColorProperties.*'
 expression: '($1 + $2) / 2'
 ```
 
-# [Kubernetes](#tab/kubernetes)
+# [Kubernetes (preview)](#tab/kubernetes)
 
 ```yaml
 - inputs:
@@ -66,7 +66,7 @@ output: 'ColorProperties.*'
 expression: '($1, $2, $3, $4)'
 ```
 
-# [Kubernetes](#tab/kubernetes)
+# [Kubernetes (preview)](#tab/kubernetes)
 
 ```yaml
 - inputs:
@@ -84,7 +84,7 @@ In this example, the conversion results in an array containing the values of `[M
 
 ## Data types
 
-Different serialization formats support various data types. For instance, JSON offers a few primitive types: string, number, Boolean, and null. Also included are arrays of these primitive types. In contrast, other serialization formats like Avro have a more complex type system, including integers with multiple bit field lengths and timestamps with different resolutions. Examples are milliseconds and microseconds.
+Different serialization formats support various data types. For instance, JSON offers a few primitive types: string, number, Boolean, and null. It also includes arrays of these primitive types.
 
 When the mapper reads an input property, it converts it into an internal type. This conversion is necessary for holding the data in memory until it's written out into an output field. The conversion to an internal type happens regardless of whether the input and output serialization formats are the same.
 
@@ -105,11 +105,7 @@ The internal representation utilizes the following data types:
 
 ### Input record fields
 
-When an input record field is read, its underlying type is converted into one of these internal type variants. The internal representation is versatile enough to handle most input types with minimal or no conversion. However, some input types require conversion or are unsupported. Some examples:
-
-* **Avro** `UUID` **type**: It's converted to a `string` because there's no specific `UUID` type in the internal representation.
-* **Avro** `decimal` **type**: It isn't supported by the mapper, so fields of this type can't be included in mappings.
-* **Avro** `duration` **type**: Conversion can vary. If the `months` field is set, it's unsupported. If only `days` and `milliseconds` are set, it's converted to the internal `duration` representation.
+When an input record field is read, its underlying type is converted into one of these internal type variants. The internal representation is versatile enough to handle most input types with minimal or no conversion.
 
 For some formats, surrogate types are used. For example, JSON doesn't have a `datetime` type and instead stores `datetime` values as strings formatted according to ISO8601. When the mapper reads such a field, the internal representation remains a string.
 
126122
* Converted to `0`/`1` if the output field is numerical.
127123
* Converted to `true`/`false` if the output field is string.
128124

129-
### Explicit type conversions
130-
131-
Although the automatic conversions operate as you might expect based on common implementation practices, there are instances where the right conversion can't be determined automatically and results in an *unsupported* error. To address these situations, several conversion functions are available to explicitly define how data should be transformed. These functions provide more control over how data is converted and help maintain data integrity even when automatic methods fall short.
132-
133125
### Use a conversion formula with types
134126

135127
In mappings, an optional formula can specify how data from the input is processed before being written to the output field. If no formula is specified, the mapper copies the input field to the output by using the internal type and conversion rules.
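
As context for the section this hunk edits: the formula in a map entry is optional. A minimal sketch of both shapes, in the article's own Bicep fragment style (the temperature field names are illustrative, not from this commit):

```bicep
// No expression: the mapper copies the input field to the output
// field by using the internal type and conversion rules.
inputs: [
  'Temperature'
]
output: 'Temperature'
```

```bicep
// With an expression: the formula runs on the input ($1 refers to
// the first input) before the result is written to the output field.
inputs: [
  'TemperatureF' // - $1
]
output: 'TemperatureC'
expression: '($1 - 32) * 5 / 9'
```

The operators used here (`-`, `*`, `/`) are the same ones documented later in this file's diff.
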
@@ -173,7 +165,7 @@ output: 'Measurement'
 expression: 'min($1)'
 ```
 
-# [Kubernetes](#tab/kubernetes)
+# [Kubernetes (preview)](#tab/kubernetes)
 
 ```yaml
 - inputs:
@@ -186,29 +178,6 @@ expression: 'min($1)'
 
 This configuration selects the smallest value from the `Measurements` array for the output field.
 
-It's also possible to use functions that result in a new array:
-
-# [Bicep](#tab/bicep)
-
-```bicep
-inputs: [
-  'Measurements' // - $1
-]
-output: 'Measurements'
-expression: 'take($1, 10)' // taking at max 10 items
-```
-
-# [Kubernetes](#tab/kubernetes)
-
-```yaml
-- inputs:
-  - Measurements # - $1
-  output: Measurements
-  expression: take($1, 10) # taking at max 10 items
-```
-
----
-
 Arrays can also be created from multiple single values:
 
 # [Bicep](#tab/bicep)
@@ -224,7 +193,7 @@ output: 'stats'
 expression: '($1, $2, $3, $4)'
 ```
 
-# [Kubernetes](#tab/kubernetes)
+# [Kubernetes (preview)](#tab/kubernetes)
 
 ```yaml
 - inputs:
@@ -282,7 +251,7 @@ output: 'BaseSalary'
 expression: 'if($1 == (), $2, $1)'
 ```
 
-# [Kubernetes](#tab/kubernetes)
+# [Kubernetes (preview)](#tab/kubernetes)
 
 ```yaml
 - inputs:
@@ -302,17 +271,22 @@ The `conversion` uses the `if` function that has three parameters:
 
 ## Available functions
 
-Functions can be used in the conversion formula to perform various operations:
+Dataflows provide a set of built-in functions that can be used in conversion formulas to perform common operations like arithmetic, comparison, and string manipulation. The available functions are:
 
-* `min` to select a single item from an array
-* `if` to select between values
-* String manipulation (for example, `uppercase()`)
-* Explicit conversion (for example, `ISO8601_datetime`)
-* Aggregation (for example, `avg()`)
+| Function | Description | Examples |
+|----------|-------------|----------|
+| `min` | Return the minimum value from an array. | `min(2, 3, 1)` returns `1`; `min($1)` returns the minimum value from the array `$1` |
+| `max` | Return the maximum value from an array. | `max(2, 3, 1)` returns `3`; `max($1)` returns the maximum value from the array `$1` |
+| `if` | Return one of two values based on a condition. | `if($1 > 10, 'High', 'Low')` returns `'High'` if `$1` is greater than `10`, otherwise `'Low'` |
+| `len` | Return the character length of a string or the number of elements in a tuple. | `len("Azure")` returns `5`; `len(1, 2, 3)` returns `3`; `len($1)` returns the number of elements in the array `$1` |
+| `floor` | Return the largest integer less than or equal to a number. | `floor(2.9)` returns `2` |
+| `round` | Return the nearest integer to a number, rounding half-way cases away from 0.0. | `round(2.5)` returns `3` |
+| `ceil` | Return the smallest integer greater than or equal to a number. | `ceil(2.1)` returns `3` |
+| `scale` | Scale a value from one range to another. | `scale($1, 0, 10, 0, 100)` scales the input value from the range 0 to 10 to the range 0 to 100 |
 
-## Available operations
+### Conversion functions
 
-Dataflows offer a wide range of out-of-the-box conversion functions that allow users to easily perform unit conversions without the need for complex calculations. These predefined functions cover common conversions such as temperature, pressure, length, weight, and volume. The following list shows the available conversion functions, along with their corresponding formulas and function names:
+Dataflows provide several built-in conversion functions for common unit conversions like temperature, pressure, length, weight, and volume. Here are some examples:
 
 | Conversion | Formula | Function name |
 | --- | --- | --- |
@@ -323,7 +297,7 @@ Dataflows offer a wide range of out-of-the-box conversion functions that allow u
 | Lbs to kg | Kg = lbs * 0.453592 | `lbToKg` |
 | Gallons to liters | Liters = gallons * 3.78541 | `galToL` |
 
-In addition to these unidirectional conversions, we also support the reverse calculations:
+Reverse conversions are also supported:
 
 | Conversion | Formula | Function name |
 | --- | --- | --- |
@@ -334,15 +308,9 @@ In addition to these unidirectional conversions, we also support the reverse cal
 | Kg to lbs | Lbs = kg / 0.453592 | `kgToLb` |
 | Liters to gallons | Gallons = liters / 3.78541 | `lToGal` |
 
-These functions are designed to simplify the conversion process. They allow users to input values in one unit and receive the corresponding value in another unit effortlessly.
-
-We also provide a scaling function to scale the range of value to the user-defined range. For the example `scale($1,0,10,0,100)`, the input value is scaled from the range 0 to 10 to the range 0 to 100.
-
-Moreover, users have the flexibility to define their own conversion functions by using simple mathematical formulas. Our system supports basic operators such as addition (`+`), subtraction (`-`), multiplication (`*`), and division (`/`). These operators follow standard rules of precedence. For example, multiplication and division are performed before addition and subtraction. Precedence can be adjusted by using parentheses to ensure the correct order of operations. This capability empowers users to customize their unit conversions to meet specific needs or preferences, enhancing the overall utility and versatility of the system.
-
-For more complex calculations, functions like `sqrt` (which finds the square root of a number) are also available.
+Additionally, you can define your own conversion functions using basic mathematical formulas. The system supports operators like addition (`+`), subtraction (`-`), multiplication (`*`), and division (`/`). These operators follow standard rules of precedence, which can be adjusted using parentheses to ensure the correct order of operations. This allows you to customize unit conversions to meet specific needs.
 
-### Available arithmetic, comparison, and Boolean operators grouped by precedence
+## Available operators by precedence
 
 | Operator | Description |
 |----------|-------------|
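
Pulling this file's new tables together: built-in functions, unit-conversion functions, and hand-written formulas all go in the same `expression` slot of a map entry. The following is an illustrative sketch in the article's Bicep fragment style; the field names are invented, and each expression uses only functions and operators that appear in the hunks above.

```bicep
// Built-in function: rescale a 0-10 sensor reading onto a 0-100 range.
inputs: [
  'Sensor.Reading' // - $1
]
output: 'ReadingPercent'
expression: 'scale($1, 0, 10, 0, 100)'
```

```bicep
// Unit-conversion function from the table above: pounds to kilograms.
inputs: [
  'WeightLbs' // - $1
]
output: 'WeightKg'
expression: 'lbToKg($1)'
```

```bicep
// Custom formula: operators follow standard precedence, adjusted with
// parentheses, so this computes the mean of the two inputs.
inputs: [
  'Min' // - $1
  'Max' // - $2
]
output: 'Mid'
expression: '($1 + $2) / 2'
```

Note that `scale($1, 0, 10, 0, 100)` and `lbToKg($1)` come straight from the tables added in this commit, and `($1 + $2) / 2` is the same custom formula shown at the top of this file's diff.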

articles/iot-operations/connect-to-cloud/concept-dataflow-enrich.md

Lines changed: 13 additions & 11 deletions
@@ -5,18 +5,20 @@ author: PatAltimore
 ms.author: patricka
 ms.subservice: azure-data-flows
 ms.topic: concept-article
-ms.date: 10/30/2024
+ms.date: 11/13/2024
 
 #CustomerIntent: As an operator, I want to understand how to create a dataflow to enrich data sent to endpoints.
 ms.service: azure-iot-operations
 ---
 
 # Enrich data by using dataflows
 
-[!INCLUDE [public-preview-note](../includes/public-preview-note.md)]
+[!INCLUDE [kubernetes-management-preview-note](../includes/kubernetes-management-preview-note.md)]
 
 You can enrich data by using the *contextualization datasets* function. When incoming records are processed, you can query these datasets based on conditions that relate to the fields of the incoming record. This capability allows for dynamic interactions. Data from these datasets can be used to supplement information in the output fields and participate in complex calculations during the mapping process.
 
+To load sample data into the state store, use the [state store CLI](https://github.com/Azure-Samples/explore-iot-operations/tree/main/tools/state-store-cli).
+
 For example, consider the following dataset with a few records, represented as JSON records:
 
 ```json
3234
}
3335
```
3436

35-
The mapper accesses the reference dataset stored in the Azure IoT Operations [distributed state store (DSS)](../create-edge-apps/concept-about-state-store-protocol.md) by using a key value based on a *condition* specified in the mapping configuration. Key names in the DSS correspond to a dataset in the dataflow configuration.
37+
The mapper accesses the reference dataset stored in the Azure IoT Operations [state store](../create-edge-apps/concept-about-state-store-protocol.md) by using a key value based on a *condition* specified in the mapping configuration. Key names in the state store correspond to a dataset in the dataflow configuration.
3638

3739
# [Bicep](#tab/bicep)
3840

@@ -49,7 +51,7 @@ datasets: [
4951
]
5052
```
5153

52-
# [Kubernetes](#tab/kubernetes)
54+
# [Kubernetes (preview)](#tab/kubernetes)
5355

5456
```yaml
5557
datasets:
@@ -64,7 +66,7 @@ datasets:
6466
6567
When a new record is being processed, the mapper performs the following steps:
6668
67-
* **Data request:** The mapper sends a request to the DSS to retrieve the dataset stored under the key `Position`.
69+
* **Data request:** The mapper sends a request to the state store to retrieve the dataset stored under the key `Position`.
6870
* **Record matching:** The mapper then queries this dataset to find the first record where the `Position` field in the dataset matches the `Position` field of the incoming record.
6971

7072
# [Bicep](#tab/bicep)
@@ -86,7 +88,7 @@ When a new record is being processed, the mapper performs the following steps:
 }
 ```
 
-# [Kubernetes](#tab/kubernetes)
+# [Kubernetes (preview)](#tab/kubernetes)
 
 ```yaml
 - inputs:
@@ -102,7 +104,7 @@ When a new record is being processed, the mapper performs the following steps:
 
 ---
 
-In this example, the `WorkingHours` field is added to the output record, while the `BaseSalary` is used conditionally only when the incoming record doesn't contain the `BaseSalary` field (or the value is `null` if it's a nullable field). The request for the contextualization data doesn't happen with every incoming record. The mapper requests the dataset and then it receives notifications from DSS about the changes, while it uses a cached version of the dataset.
+In this example, the `WorkingHours` field is added to the output record, while the `BaseSalary` is used conditionally only when the incoming record doesn't contain the `BaseSalary` field (or the value is `null` if it's a nullable field). The request for the contextualization data doesn't happen with every incoming record. The mapper requests the dataset once and then receives notifications from the state store about changes, while using a cached version of the dataset.
 
 It's possible to use multiple datasets:
 
@@ -129,7 +131,7 @@ datasets: [
 ]
 ```
 
-# [Kubernetes](#tab/kubernetes)
+# [Kubernetes (preview)](#tab/kubernetes)
 
 ```yaml
 datasets:
@@ -159,7 +161,7 @@ inputs: [
 ]
 ```
 
-# [Kubernetes](#tab/kubernetes)
+# [Kubernetes (preview)](#tab/kubernetes)
 
 ```yaml
 - inputs:
@@ -169,7 +171,7 @@ inputs: [
 
 ---
 
-The input references use the key of the dataset like `position` or `permission`. If the key in DSS is inconvenient to use, you can define an alias:
+The input references use the key of the dataset, like `position` or `permission`. If the key in the state store is inconvenient to use, you can define an alias:
 
 # [Bicep](#tab/bicep)
 
@@ -186,7 +188,7 @@ datasets: [
 ]
 ```
 
-# [Kubernetes](#tab/kubernetes)
+# [Kubernetes (preview)](#tab/kubernetes)
 
 ```yaml
 datasets:
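
Taken together, the hunks in this file cover the enrichment flow: a dataset is registered under a state store key (optionally with an expression condition), and map stages then reference that dataset alongside fields of the incoming record. As a rough sketch only (the `$context(dataset).field` input syntax and the field names are assumptions for illustration, not lines from this commit), an enrichment map entry might look like:

```bicep
inputs: [
  'BaseSalary'                    // - $1, from the incoming record
  '$context(position).BaseSalary' // - $2, assumed dataset-reference syntax
]
output: 'BaseSalary'
expression: 'if($1 == (), $2, $1)'
```

The `if($1 == (), $2, $1)` expression mirrors the one in the concept-dataflow-conversions diff above: when the record's own `BaseSalary` is empty, the value from the contextualization dataset is used instead.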
