Commit 8a2388b

Merge pull request #290569 from jlian/df-profile-prereq

Remove dataflow profile prereq

2 parents: 04335a4 + 353b6aa

8 files changed: 1 addition, 8 deletions

articles/iot-operations/connect-to-cloud/howto-configure-adlsv2-endpoint.md

Lines changed: 0 additions & 1 deletion
@@ -21,7 +21,6 @@ To send data to Azure Data Lake Storage Gen2 in Azure IoT Operations, you can co
 ## Prerequisites
 
 - An instance of [Azure IoT Operations](../deploy-iot-ops/howto-deploy-iot-operations.md)
-- A [configured dataflow profile](howto-configure-dataflow-profile.md)
 - An [Azure Data Lake Storage Gen2 account](../../storage/blobs/create-data-lake-storage-account.md)
 - A pre-created storage container in the storage account

articles/iot-operations/connect-to-cloud/howto-configure-adx-endpoint.md

Lines changed: 0 additions & 1 deletion
@@ -21,7 +21,6 @@ To send data to Azure Data Explorer in Azure IoT Operations, you can configure a
 ## Prerequisites
 
 - An instance of [Azure IoT Operations](../deploy-iot-ops/howto-deploy-iot-operations.md)
-- A [configured dataflow profile](howto-configure-dataflow-profile.md)
 - An **Azure Data Explorer cluster**. Follow the **Full cluster** steps in the [Quickstart: Create an Azure Data Explorer cluster and database](/azure/data-explorer/create-cluster-and-database). The *free cluster* option doesn't work for this scenario.

articles/iot-operations/connect-to-cloud/howto-configure-fabric-endpoint.md

Lines changed: 0 additions & 1 deletion
@@ -21,7 +21,6 @@ To send data to Microsoft Fabric OneLake in Azure IoT Operations, you can config
 ## Prerequisites
 
 - An instance of [Azure IoT Operations](../deploy-iot-ops/howto-deploy-iot-operations.md)
-- A [configured dataflow profile](howto-configure-dataflow-profile.md)
 - **Microsoft Fabric OneLake**. See the following steps to create a workspace and lakehouse.
 - [Create a workspace](/fabric/get-started/create-workspaces). The default *my workspace* isn't supported.
 - [Create a lakehouse](/fabric/onelake/create-lakehouse-onelake).

articles/iot-operations/connect-to-cloud/howto-configure-fabric-real-time-intelligence.md

Lines changed: 0 additions & 1 deletion
@@ -19,7 +19,6 @@ To send data to Microsoft Fabric Real-Time Intelligence from Azure IoT Operation
 ## Prerequisites
 
 - An [Azure IoT Operations](../deploy-iot-ops/howto-deploy-iot-operations.md) instance
-- A [configured dataflow profile](howto-configure-dataflow-profile.md)
 - [Create a Fabric workspace](/fabric/get-started/create-workspaces). The default *my workspace* isn't supported.
 - [Create an Event Stream](/fabric/real-time-intelligence/event-streams/create-manage-an-eventstream#create-an-eventstream)
 - [Add a Custom Endpoint as a source](/fabric/real-time-intelligence/event-streams/add-source-custom-app#add-custom-endpoint-data-as-a-source)

articles/iot-operations/connect-to-cloud/howto-configure-kafka-endpoint.md

Lines changed: 0 additions & 1 deletion
@@ -21,7 +21,6 @@ To set up bi-directional communication between Azure IoT Operations and Apache K
 ## Prerequisites
 
 - An instance of [Azure IoT Operations](../deploy-iot-ops/howto-deploy-iot-operations.md)
-- A [configured dataflow profile](howto-configure-dataflow-profile.md)
 
 ## Azure Event Hubs

articles/iot-operations/connect-to-cloud/howto-configure-local-storage-endpoint.md

Lines changed: 0 additions & 1 deletion
@@ -21,7 +21,6 @@ To send data to local storage in Azure IoT Operations, you can configure a dataf
 ## Prerequisites
 
 - An instance of [Azure IoT Operations](../deploy-iot-ops/howto-deploy-iot-operations.md)
-- A [configured dataflow profile](howto-configure-dataflow-profile.md)
 - A [PersistentVolumeClaim (PVC)](https://kubernetes.io/docs/concepts/storage/persistent-volumes/)
 
 ## Create a local storage dataflow endpoint

articles/iot-operations/connect-to-cloud/howto-configure-mqtt-endpoint.md

Lines changed: 0 additions & 1 deletion
@@ -21,7 +21,6 @@ MQTT dataflow endpoints are used for MQTT sources and destinations. You can conf
 ## Prerequisites
 
 - An instance of [Azure IoT Operations](../deploy-iot-ops/howto-deploy-iot-operations.md)
-- A [configured dataflow profile](howto-configure-dataflow-profile.md)
 
 ## Azure IoT Operations local MQTT broker

articles/iot-operations/connect-to-cloud/howto-create-dataflow.md

Lines changed: 1 addition & 1 deletion
@@ -54,7 +54,7 @@ You can deploy dataflows as soon as you have an instance of [Azure IoT Operation
 
 ### Dataflow profile
 
-The dataflow profile specifies the number of instances for the dataflows under it to use. If you don't need multiple groups of dataflows with different scaling settings, you can use the default dataflow profile. To learn how to configure a dataflow profile, see [Configure dataflow profiles](howto-configure-dataflow-profile.md).
+If you don't need different scaling settings for your dataflows, use the [default dataflow profile](./howto-configure-dataflow-profile.md#default-dataflow-profile) provided by Azure IoT Operations. To learn how to configure a dataflow profile, see [Configure dataflow profiles](howto-configure-dataflow-profile.md).
 
 ### Dataflow endpoints

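The dataflow profile this commit drops from the prerequisites is a Kubernetes custom resource whose main job is setting the instance count for the dataflows under it. As a rough sketch of what the default profile the docs now rely on looks like (the `apiVersion`, namespace, and field names here are assumptions drawn from the Azure IoT Operations docs, not part of this commit):

```yaml
# Hypothetical sketch of a dataflow profile resource; apiVersion and
# field names are assumptions, not taken from this commit's diff.
apiVersion: connectivity.iotoperations.azure.com/v1beta1
kind: DataflowProfile
metadata:
  name: default                  # the built-in profile the updated docs point to
  namespace: azure-iot-operations
spec:
  instanceCount: 1               # number of instances for dataflows under this profile
```

Because an instance ships with a default profile like this, each endpoint article no longer needs "A configured dataflow profile" as a prerequisite, which is what motivates the deletions above.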