
Commit 0a23a61

Merge pull request #14089 from nschonni/patch-2
typo: Double word "uses"
2 parents 4448cee + 418f18c commit 0a23a61

File tree

1 file changed: +2 −2 lines changed


articles/log-analytics/log-analytics-create-pipeline-datacollector-api.md

Lines changed: 2 additions & 2 deletions
@@ -38,7 +38,7 @@ We are using a classic ETL-type logic to design our pipeline. The architecture w
 
 This article will not cover how to create data or [upload it to an Azure Blob Storage account](../storage/blobs/storage-upload-process-images.md). Rather, we pick the flow up as soon as a new file is uploaded to the blob. From here:
 
-1. A process will detect that new data has been uploaded. Our example uses uses an [Azure Logic App](../logic-apps/logic-apps-overview.md), which has available a trigger to detect new data being uploaded to a blob.
+1. A process will detect that new data has been uploaded. Our example uses an [Azure Logic App](../logic-apps/logic-apps-overview.md), which has available a trigger to detect new data being uploaded to a blob.
 
 2. A processor reads this new data and converts it to JSON, the format required by Log Analytics. In this example, we use an [Azure Function](../azure-functions/functions-overview.md) as a lightweight, cost-efficient way of executing our processing code. The function is kicked off by the same Logic App that we used to detect a the new data.
 
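The processing step this hunk describes (read the newly uploaded data, convert it to the JSON that Log Analytics ingests) can be sketched with only the Python standard library. This is a minimal illustration of the kind of code such a Function might run, not the article's actual implementation; the function name and the CSV sample are made up for the example:

```python
import csv
import io
import json


def blob_csv_to_json(csv_text: str) -> str:
    """Convert CSV text read from the blob into the JSON array of
    records that the Log Analytics Data Collector API expects.
    (Illustrative helper; assumes the first CSV row is a header.)"""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    return json.dumps(rows)


# Hypothetical blob contents with two data points:
payload = blob_csv_to_json("computer,status\nweb01,up\nweb02,down")
```

In an Azure Function, the same conversion would run inside the function's entry point, with the Logic App passing in a reference to the new blob.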

@@ -169,4 +169,4 @@ This article presented a working prototype, the logic behind which can be applie
 
 
 ## Next steps
-Learn more about the [Data Collector API](log-analytics-data-collector-api.md) to write data to Log Analytics from any REST API client.
+Learn more about the [Data Collector API](log-analytics-data-collector-api.md) to write data to Log Analytics from any REST API client.
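The Data Collector API that the hunk above links to authenticates each POST with an HMAC-SHA256 signature over a canonical request string, sent as a `SharedKey {workspaceId}:{signature}` Authorization header. A minimal sketch of building that header (the function name and parameters are illustrative; the workspace ID and key below are fakes, not real credentials):

```python
import base64
import hashlib
import hmac


def build_signature(workspace_id: str, shared_key_b64: str,
                    date_rfc1123: str, content_length: int) -> str:
    """Build the Authorization header value for a Data Collector API POST.
    The string-to-sign layout is: method, content length, content type,
    the x-ms-date header, and the fixed /api/logs resource path."""
    string_to_sign = (
        f"POST\n{content_length}\napplication/json\n"
        f"x-ms-date:{date_rfc1123}\n/api/logs"
    )
    key = base64.b64decode(shared_key_b64)  # the workspace key is base64-encoded
    digest = hmac.new(key, string_to_sign.encode("utf-8"), hashlib.sha256).digest()
    return f"SharedKey {workspace_id}:{base64.b64encode(digest).decode()}"


# Example with a fake workspace ID and key:
auth = build_signature(
    "my-workspace-id",
    base64.b64encode(b"fake-shared-key").decode(),
    "Mon, 01 Jan 2024 00:00:00 GMT",
    128,
)
```

The resulting `auth` string goes in the request's `Authorization` header, alongside the matching `x-ms-date` header and the JSON payload.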
