
Commit cae3d35

Merge pull request #297046 from asergaz/ACSA-changes
Adding more clarity on Endpoint type table
2 parents 3641eb4 + aedd28e commit cae3d35

2 files changed: +2 −2 lines changed

articles/iot-operations/connect-to-cloud/howto-configure-dataflow-endpoint.md

Lines changed: 1 addition & 1 deletion
@@ -26,7 +26,7 @@ Use the following table to choose the endpoint type to configure:
| [Data Lake](howto-configure-adlsv2-endpoint.md) | For uploading data to Azure Data Lake Gen2 storage accounts. | No | Yes |
| [Microsoft Fabric OneLake](howto-configure-fabric-endpoint.md) | For uploading data to Microsoft Fabric OneLake lakehouses. | No | Yes |
| [Azure Data Explorer](howto-configure-adx-endpoint.md) | For uploading data to Azure Data Explorer databases. | No | Yes |
- | [Local storage](howto-configure-local-storage-endpoint.md) | For sending data to a locally available persistent volume, through which you can upload data via Azure Container Storage enabled by Azure Arc edge volumes. | No | Yes |
+ | [Local storage](howto-configure-local-storage-endpoint.md) | For sending data to a locally available persistent volume, optionally configurable with Azure Container Storage enabled by Azure Arc. | No | Yes |

> [!IMPORTANT]
> Storage endpoints require a [schema for serialization](./concept-schema-registry.md). To use data flow with Microsoft Fabric OneLake, Azure Data Lake Storage, Azure Data Explorer, or Local Storage, you must [specify a schema reference](./howto-create-dataflow.md#serialize-data-according-to-a-schema).
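As context for the reworded table row, the sketch below shows one way a local storage data flow endpoint might be created programmatically: applying it as a Kubernetes custom resource with the official `kubernetes` Python client. This is a rough illustration only; the API group, version, kind, plural, and spec field names (`connectivity.iotoperations.azure.com`, `DataflowEndpoint`, `localStorageSettings.persistentVolumeClaimRef`) are assumptions rather than the documented contract, and the linked how-to article remains the authoritative reference.

```python
# Hypothetical sketch: create a local-storage data flow endpoint as a
# Kubernetes custom resource. The group/version, kind, plural, and spec
# fields are assumptions -- verify them against the linked how-to article.
from kubernetes import client, config


def create_local_storage_endpoint(namespace: str = "azure-iot-operations") -> None:
    config.load_kube_config()  # use load_incluster_config() when running in-cluster
    api = client.CustomObjectsApi()

    endpoint = {
        "apiVersion": "connectivity.iotoperations.azure.com/v1",  # assumed group/version
        "kind": "DataflowEndpoint",                               # assumed kind
        "metadata": {"name": "local-storage-endpoint", "namespace": namespace},
        "spec": {
            "endpointType": "LocalStorage",                       # assumed enum value
            "localStorageSettings": {
                # Persistent volume claim backing the endpoint; this could be
                # an Azure Container Storage enabled by Azure Arc edge volume.
                "persistentVolumeClaimRef": "aio-local-storage-pvc",  # placeholder name
            },
        },
    }

    api.create_namespaced_custom_object(
        group="connectivity.iotoperations.azure.com",  # assumed
        version="v1",                                  # assumed
        namespace=namespace,
        plural="dataflowendpoints",                    # assumed plural
        body=endpoint,
    )


if __name__ == "__main__":
    create_local_storage_endpoint()
```

In practice you would more likely create the endpoint through the operations experience UI, Bicep, or the Azure CLI; the sketch only illustrates that a local storage endpoint ultimately points at a persistent volume available in the cluster, whether or not that volume is provided by Azure Container Storage enabled by Azure Arc.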

articles/iot/iot-overview-message-processing.md

Lines changed: 1 addition & 1 deletion
@@ -45,7 +45,7 @@ To route messages from your assets to various endpoints, Azure IoT Operations us
| [Data Lake](../iot-operations/connect-to-cloud/howto-configure-adlsv2-endpoint.md) | For uploading data to Azure Data Lake Gen2 storage accounts. |
| [Microsoft Fabric OneLake](../iot-operations/connect-to-cloud/howto-configure-fabric-endpoint.md) | For uploading data to Microsoft Fabric OneLake lakehouses. |
| [Azure Data Explorer](../iot-operations/connect-to-cloud/howto-configure-adx-endpoint.md) | For uploading data to Azure Data Explorer databases. |
- | [Local storage](../iot-operations/connect-to-cloud/howto-configure-local-storage-endpoint.md) | For sending data to a locally available persistent volume, through which you can upload data via Azure Container Storage enabled by Azure Arc edge volumes. |
+ | [Local storage](../iot-operations/connect-to-cloud/howto-configure-local-storage-endpoint.md) | For sending data to a locally available persistent volume, optionally configurable with Azure Container Storage enabled by Azure Arc. |

The operations experience web UI provides a no-code environment for building and running your data flows.
5151
