
Commit 4d36813

Add notice about using Kubernetes to manage resources
1 parent c5869ce commit 4d36813

20 files changed: +66 additions, 0 deletions

articles/iot-operations/connect-to-cloud/concept-dataflow-conversions.md

Lines changed: 2 additions & 0 deletions
@@ -13,6 +13,8 @@ ms.service: azure-iot-operations
 
 # Convert data by using dataflow conversions
 
+[!INCLUDE [kubernetes-management-preview-note](../includes/kubernetes-management-preview-note.md)]
+
 You can use dataflow conversions to transform data in Azure IoT Operations. The *conversion* element in a dataflow is used to compute values for output fields. You can use input fields, available operations, data types, and type conversions in dataflow conversions.
 
 The dataflow conversion element is used to compute values for output fields:
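To make the conversion element above concrete, here is a minimal sketch of a single map entry inside a dataflow's built-in transformation. The field names and the exact YAML nesting are assumptions for illustration and aren't taken from this commit:

```yaml
# Hypothetical fragment of a dataflow transformation (illustrative only).
builtInTransformationSettings:
  map:
    - inputs:
        - temperatureF               # referenced as $1 in the expression
      expression: '($1 - 32) / 1.8'  # convert Fahrenheit to Celsius
      output: temperatureC           # computed output field
```

In this sketch, the conversion computes the output field `temperatureC` from the input field `temperatureF`.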

articles/iot-operations/connect-to-cloud/concept-dataflow-enrich.md

Lines changed: 2 additions & 0 deletions
@@ -13,6 +13,8 @@ ms.service: azure-iot-operations
 
 # Enrich data by using dataflows
 
+[!INCLUDE [kubernetes-management-preview-note](../includes/kubernetes-management-preview-note.md)]
+
 You can enrich data by using the *contextualization datasets* function. When incoming records are processed, you can query these datasets based on conditions that relate to the fields of the incoming record. This capability allows for dynamic interactions. Data from these datasets can be used to supplement information in the output fields and participate in complex calculations during the mapping process.
 
 For example, consider the following dataset with a few records, represented as JSON records:

articles/iot-operations/connect-to-cloud/concept-dataflow-mapping.md

Lines changed: 2 additions & 0 deletions
@@ -14,6 +14,8 @@ ms.service: azure-iot-operations
 
 # Map data by using dataflows
 
+[!INCLUDE [kubernetes-management-preview-note](../includes/kubernetes-management-preview-note.md)]
+
 Use the dataflow mapping language to transform data in Azure IoT Operations. The syntax is a simple, yet powerful, way to define mappings that transform data from one format to another. This article provides an overview of the dataflow mapping language and key concepts.
 
 Mapping allows you to transform data from one format to another. Consider the following input record:

articles/iot-operations/connect-to-cloud/howto-configure-adlsv2-endpoint.md

Lines changed: 2 additions & 0 deletions
@@ -14,6 +14,8 @@ ai-usage: ai-assisted
 
 # Configure dataflow endpoints for Azure Data Lake Storage Gen2
 
+[!INCLUDE [kubernetes-management-preview-note](../includes/kubernetes-management-preview-note.md)]
+
 To send data to Azure Data Lake Storage Gen2 in Azure IoT Operations, you can configure a dataflow endpoint. This configuration allows you to specify the destination endpoint, authentication method, table, and other settings.
 
 ## Prerequisites
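For context, a minimal sketch of a DataflowEndpoint resource for Azure Data Lake Storage Gen2, assuming resources are managed through Kubernetes as the new note covers. The apiVersion, names, and placeholder host below are assumptions, not content from this commit:

```yaml
apiVersion: connectivity.iotoperations.azure.com/v1  # assumed API group/version
kind: DataflowEndpoint
metadata:
  name: adls-gen2
  namespace: azure-iot-operations
spec:
  endpointType: DataLakeStorage
  dataLakeStorageSettings:
    host: https://<storage-account>.blob.core.windows.net
    authentication:
      method: SystemAssignedManagedIdentity
      systemAssignedManagedIdentitySettings: {}
```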

articles/iot-operations/connect-to-cloud/howto-configure-adx-endpoint.md

Lines changed: 2 additions & 0 deletions
@@ -14,6 +14,8 @@ ai-usage: ai-assisted
 
 # Configure dataflow endpoints for Azure Data Explorer
 
+[!INCLUDE [kubernetes-management-preview-note](../includes/kubernetes-management-preview-note.md)]
+
 To send data to Azure Data Explorer in Azure IoT Operations, you can configure a dataflow endpoint. This configuration allows you to specify the destination endpoint, authentication method, table, and other settings.
 
 ## Prerequisites
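Similarly, a sketch of what an Azure Data Explorer endpoint resource might look like when managed as a Kubernetes resource; the apiVersion, cluster URI, and database name are placeholders and assumptions:

```yaml
apiVersion: connectivity.iotoperations.azure.com/v1  # assumed API group/version
kind: DataflowEndpoint
metadata:
  name: adx-endpoint
  namespace: azure-iot-operations
spec:
  endpointType: DataExplorer
  dataExplorerSettings:
    host: https://<cluster>.<region>.kusto.windows.net
    database: <database-name>
    authentication:
      method: SystemAssignedManagedIdentity
      systemAssignedManagedIdentitySettings: {}
```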

articles/iot-operations/connect-to-cloud/howto-configure-dataflow-endpoint.md

Lines changed: 2 additions & 0 deletions
@@ -13,6 +13,8 @@ ms.date: 11/01/2024
 
 # Configure dataflow endpoints
 
+[!INCLUDE [kubernetes-management-preview-note](../includes/kubernetes-management-preview-note.md)]
+
 To get started with dataflows, first create dataflow endpoints. A dataflow endpoint is the connection point for the dataflow. You can use an endpoint as a source or destination for the dataflow. Some endpoint types can be used as both sources and destinations, while others are for destinations only. A dataflow needs at least one source endpoint and one destination endpoint.
 
 Use the following table to choose the endpoint type to configure:
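The table itself falls outside the diff context shown here. As a rough sketch of how a dataflow then references one source endpoint and one destination endpoint when managed through Kubernetes, something like the following could apply; the apiVersion, endpoint names, and topic filter are illustrative assumptions:

```yaml
apiVersion: connectivity.iotoperations.azure.com/v1  # assumed API group/version
kind: Dataflow
metadata:
  name: example-dataflow
  namespace: azure-iot-operations
spec:
  profileRef: default                 # dataflow profile that runs this dataflow
  operations:
    - operationType: Source
      sourceSettings:
        endpointRef: default          # e.g., the local MQTT broker endpoint
        dataSources:
          - thermostats/+/telemetry   # hypothetical MQTT topic filter
    - operationType: Destination
      destinationSettings:
        endpointRef: adls-gen2        # a destination-only endpoint
        dataDestination: telemetry    # hypothetical table/container name
```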

articles/iot-operations/connect-to-cloud/howto-configure-dataflow-profile.md

Lines changed: 2 additions & 0 deletions
@@ -13,6 +13,8 @@ ms.date: 10/30/2024
 
 # Configure dataflow profile
 
+[!INCLUDE [kubernetes-management-preview-note](../includes/kubernetes-management-preview-note.md)]
+
 Dataflow profiles can be used to group dataflows together so that they share the same configuration. You can create multiple dataflow profiles to manage sets of different dataflow configurations.
 
 The most important setting is the instance count, which determines the number of instances that run the dataflows. For example, you might have a dataflow profile with a single instance for development and testing, and another profile with multiple instances for production. Or, you might use a dataflow profile with low instance count for low-throughput dataflows and a profile with high instance count for high-throughput dataflows. Similarly, you can create a dataflow profile with different diagnostic settings for debugging purposes.
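A minimal sketch of a dataflow profile that sets the instance count, assuming Kubernetes-based management; the apiVersion and the diagnostics fields are assumptions for illustration:

```yaml
apiVersion: connectivity.iotoperations.azure.com/v1  # assumed API group/version
kind: DataflowProfile
metadata:
  name: high-throughput
  namespace: azure-iot-operations
spec:
  instanceCount: 3        # scale out for high-throughput dataflows
  diagnostics:
    logs:
      level: info         # raise to debug when troubleshooting
```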

articles/iot-operations/connect-to-cloud/howto-configure-fabric-endpoint.md

Lines changed: 2 additions & 0 deletions
@@ -14,6 +14,8 @@ ai-usage: ai-assisted
 
 # Configure dataflow endpoints for Microsoft Fabric OneLake
 
+[!INCLUDE [kubernetes-management-preview-note](../includes/kubernetes-management-preview-note.md)]
+
 To send data to Microsoft Fabric OneLake in Azure IoT Operations, you can configure a dataflow endpoint. This configuration allows you to specify the destination endpoint, authentication method, table, and other settings.
 
 ## Prerequisites
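For illustration, a sketch of a Microsoft Fabric OneLake endpoint resource under Kubernetes-based management; the apiVersion, path type, and workspace and lakehouse names are assumptions:

```yaml
apiVersion: connectivity.iotoperations.azure.com/v1  # assumed API group/version
kind: DataflowEndpoint
metadata:
  name: fabric-onelake
  namespace: azure-iot-operations
spec:
  endpointType: FabricOneLake
  fabricOneLakeSettings:
    host: https://onelake.dfs.fabric.microsoft.com
    oneLakePathType: Tables            # assumed options: Tables or Files
    names:
      workspaceName: <workspace-name>
      lakehouseName: <lakehouse-name>
    authentication:
      method: SystemAssignedManagedIdentity
      systemAssignedManagedIdentitySettings: {}
```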

articles/iot-operations/connect-to-cloud/howto-configure-kafka-endpoint.md

Lines changed: 2 additions & 0 deletions
@@ -14,6 +14,8 @@ ai-usage: ai-assisted
 
 # Configure Azure Event Hubs and Kafka dataflow endpoints
 
+[!INCLUDE [kubernetes-management-preview-note](../includes/kubernetes-management-preview-note.md)]
+
 To set up bi-directional communication between Azure IoT Operations and Apache Kafka brokers, you can configure a dataflow endpoint. This configuration allows you to specify the endpoint, Transport Layer Security (TLS), authentication, and other settings.
 
 ## Prerequisites
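As a sketch, an Event Hubs Kafka endpoint resource under Kubernetes-based management might look like the following; the apiVersion, host placeholder, and TLS and authentication fields are assumptions:

```yaml
apiVersion: connectivity.iotoperations.azure.com/v1  # assumed API group/version
kind: DataflowEndpoint
metadata:
  name: event-hubs
  namespace: azure-iot-operations
spec:
  endpointType: Kafka
  kafkaSettings:
    host: <eventhubs-namespace>.servicebus.windows.net:9093
    tls:
      mode: Enabled
    authentication:
      method: SystemAssignedManagedIdentity
      systemAssignedManagedIdentitySettings: {}
```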

articles/iot-operations/connect-to-cloud/howto-configure-local-storage-endpoint.md

Lines changed: 2 additions & 0 deletions
@@ -14,6 +14,8 @@ ai-usage: ai-assisted
 
 # Configure dataflow endpoints for local storage
 
+[!INCLUDE [kubernetes-management-preview-note](../includes/kubernetes-management-preview-note.md)]
+
 To send data to local storage in Azure IoT Operations, you can configure a dataflow endpoint. This configuration allows you to specify the endpoint, authentication, table, and other settings.
 
 ## Prerequisites
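Finally, a sketch of a local storage endpoint that writes to a persistent volume claim, assuming Kubernetes-based management; the apiVersion and the persistentVolumeClaimRef field name are assumptions:

```yaml
apiVersion: connectivity.iotoperations.azure.com/v1  # assumed API group/version
kind: DataflowEndpoint
metadata:
  name: local-storage
  namespace: azure-iot-operations
spec:
  endpointType: LocalStorage
  localStorageSettings:
    persistentVolumeClaimRef: <pvc-name>  # PVC that backs the local storage
```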
