 > * When uploading a file with the Azure Data Lake SDK, the initial file creation event has a size of 0, which is ignored by Azure Data Explorer during data ingestion. To ensure proper ingestion, set the `Close` parameter to `true`. This parameter causes the upload method to trigger a *FlushAndClose* event, indicating that the final update has been made and the file stream is closed.
-> * To reduce traffic coming from Event Grid and optimize the ingestion of events into Azure Data Explorer, we recommend [filtering](ingest-data-event-grid-manual.md#create-an-event-grid-subscription) the *data.api* key to exclude *CreateFile* events. This ensure that file creation events with size 0 are filtered out, preventing ingestion errors of empty file. For more information about flushing, see [Azure Data Lake flush method](/dotnet/api/azure.storage.files.datalake.datalakefileclient.flush).
+> * To reduce traffic coming from Event Grid and optimize the ingestion of events into Azure Data Explorer, we recommend [filtering](ingest-data-event-grid-manual.md#create-an-event-grid-subscription) the *data.api* key to exclude *CreateFile* events. This ensures that file creation events with size 0 are filtered out, preventing ingestion errors for empty files. For more information about flushing, see [Azure Data Lake flush method](/dotnet/api/azure.storage.files.datalake.datalakefileclient.flush).
+> * Using the "OpenWrite" API to write to a blob is not recommended, as it triggers a notification for an empty blob and causes an empty-blob error. Additionally, flush the stream only once to prevent duplicate notifications and multiple ingestions of the same blob.
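The note above points to the .NET `Flush` API, but the same create/append/flush-with-close pattern applies in the Python Data Lake SDK. The following is a minimal sketch, assuming a hypothetical connection string, filesystem name, and file path:

```python
from azure.storage.filedatalake import DataLakeFileClient

# Hypothetical connection string, filesystem, and path for illustration.
file_client = DataLakeFileClient.from_connection_string(
    "<connection-string>",
    file_system_name="my-filesystem",
    file_path="incoming/sample.csv",
)

data = b"col1,col2\n1,2\n"
file_client.create_file()  # Raises a CreateFile event with size 0, which should be filtered out.
file_client.append_data(data, offset=0, length=len(data))
# Flush exactly once with close=True, so Event Grid receives a single
# FlushAndClose event and Azure Data Explorer ingests the completed file.
file_client.flush_data(len(data), close=True)
```

Creating, appending, and flushing exactly once keeps the event stream to one *CreateFile* event (filtered out) and one *FlushAndClose* event, avoiding the duplicate-notification issue called out for `OpenWrite`.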
data-explorer/ingest-data-event-grid-overview.md (2 additions & 1 deletion)
@@ -3,7 +3,7 @@ title: Ingest from storage using Event Grid subscription - Azure Data Explorer
 description: This article describes Ingest from storage using Event Grid subscription in Azure Data Explorer.
 ms.reviewer: leshalev
 ms.topic: how-to
-ms.date: 06/03/2024
+ms.date: 06/10/2025
 ms.custom: devx-track-azurepowershell
 ---
 # Event Grid data connection
@@ -121,6 +121,7 @@ You can create a blob from a local file, set ingestion properties to the blob metadata
 > * Using Azure Data Lake Gen2 storage SDK requires using `CreateFile` for uploading files and `Flush` at the end with the close parameter set to `true`. For a detailed example of Data Lake Gen2 SDK correct usage, see [Use the Event Grid data connection](create-event-grid-connection.md?tabs=azure-data-lake#use-the-event-grid-data-connection).
 > * Triggering ingestion following a `CopyBlob` operation is not supported for storage accounts that have the hierarchical namespace feature enabled on them.
 > * When the event hub endpoint doesn't acknowledge receipt of an event, Azure Event Grid activates a retry mechanism. If this retry delivery fails, Event Grid can deliver the undelivered events to a storage account using a process of *dead-lettering*. For more information, see [Event Grid message delivery and retry](/azure/event-grid/delivery-and-retry#retry-schedule-and-duration).
+> * Using the "OpenWrite" API to write to a blob is not recommended, as it triggers a notification for an empty blob and causes an empty-blob error. Additionally, flush the stream only once to prevent duplicate notifications and multiple ingestions of the same blob.
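The *data.api* filtering recommendation can also be applied when creating the Event Grid subscription programmatically. The following sketch uses the azure-mgmt-eventgrid Python package; all resource IDs and the subscription name are hypothetical placeholders, and model names may differ slightly across package versions:

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.eventgrid import EventGridManagementClient
from azure.mgmt.eventgrid.models import (
    EventHubEventSubscriptionDestination,
    EventSubscription,
    EventSubscriptionFilter,
    StringNotInAdvancedFilter,
)

# Hypothetical IDs for illustration.
SUB_ID = "<subscription-id>"
STORAGE_SCOPE = (
    f"/subscriptions/{SUB_ID}/resourceGroups/<resource-group>"
    "/providers/Microsoft.Storage/storageAccounts/<storage-account>"
)
EVENT_HUB_ID = (
    f"/subscriptions/{SUB_ID}/resourceGroups/<resource-group>"
    "/providers/Microsoft.EventHub/namespaces/<namespace>/eventhubs/<event-hub>"
)

client = EventGridManagementClient(DefaultAzureCredential(), SUB_ID)
subscription = EventSubscription(
    destination=EventHubEventSubscriptionDestination(resource_id=EVENT_HUB_ID),
    filter=EventSubscriptionFilter(
        included_event_types=["Microsoft.Storage.BlobCreated"],
        # Exclude the zero-size CreateFile events so that only completed
        # writes (for example, FlushAndClose) trigger ingestion.
        advanced_filters=[
            StringNotInAdvancedFilter(key="data.api", values=["CreateFile"])
        ],
    ),
)
client.event_subscriptions.begin_create_or_update(
    STORAGE_SCOPE, "adx-ingestion-subscription", subscription
).result()
```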
description: Learn how to use the `.show queued ingestion operations` command to view a log of the queued ingestion operations that are currently running or completed.
@@ -45,6 +45,7 @@ The command returns a table with the latest update information for each ID.
 |LastUpdatedOn |`datetime`|Date/time, in UTC, when the status was updated.|
 |State |`string`|The state of the operation.|
 |Discovered |`long`|Count of the blobs that were listed from storage and queued for ingestion.|
+|DiscoveredSize |`long`|The total data size in bytes of all blobs that were listed from storage and queued for ingestion.|
 |Pending |`long`|Count of the blobs to be ingested.|
 |Canceled |`long`|Count of the blobs that were canceled due to a call to the [.cancel queued ingestion operation](cancel-queued-ingestion-operation-command.md) command.|
 |Ingested |`long`|Count of the blobs that have been ingested.|
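The new `DiscoveredSize` column can be read like any other management command result. Here is a minimal sketch with the azure-kusto-data Python client; the cluster URI is a placeholder, and the operation ID is taken from the example output below:

```python
from azure.kusto.data import KustoClient, KustoConnectionStringBuilder

# Hypothetical cluster URI; authenticate via the Azure CLI login.
kcsb = KustoConnectionStringBuilder.with_az_cli_authentication(
    "https://<cluster>.<region>.kusto.windows.net"
)
client = KustoClient(kcsb)
response = client.execute_mgmt(
    "MyDatabase",
    '.show queued ingestion operations "00001111;11112222;00001111-aaaa-2222-bbbb-3333cccc4444"',
)
for row in response.primary_results[0]:
    # DiscoveredSize is the total size in bytes of all queued blobs.
    print(row["State"], row["Discovered"], row["DiscoveredSize"])
```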
@@ -78,9 +79,9 @@ The following example shows the queued ingestion operations for a specific operation ID
 
 **Output**
 
-|IngestionOperationId|Started On |Last Updated On |State |Discovered |InProgress|Ingested |Failed|Canceled |SampleFailedReasons|Database|Table|
+|IngestionOperationId|Started On |Last Updated On |State |Discovered |DiscoveredSize |InProgress|Ingested |Failed|Canceled |SampleFailedReasons|Database|Table|
 |--|--|--|--|--|--|--|--|--|--|--|--|
-|00001111;11112222;00001111-aaaa-2222-bbbb-3333cccc4444 |2025-01-10 14:57:41.0000000 |2025-01-10 14:57:41.0000000|InProgress | 10387 |9391 |995 |1 |0 | Stream with ID '*****.csv' has a malformed CSV format*|MyDatabase|MyTable|
+|00001111;11112222;00001111-aaaa-2222-bbbb-3333cccc4444 |2025-01-10 14:57:41.0000000 |2025-01-10 14:57:41.0000000|InProgress | 10387 | 100547 |9391 |995 |1 |0 | Stream with ID '*****.csv' has a malformed CSV format*|MyDatabase|MyTable|
 
 ### Multiple operation IDs
@@ -92,10 +93,10 @@ The following example shows the queued ingestion operations for multiple operation IDs
 
 **Output**
 
-|IngestionOperationId|Started On |Last Updated On |State |Discovered |InProgress|Ingested |Failed|Canceled |SampleFailedReasons|Database|Table|
+|IngestionOperationId|Started On |Last Updated On |State |Discovered |DiscoveredSize |InProgress|Ingested |Failed|Canceled |SampleFailedReasons|Database|Table|
 |--|--|--|--|--|--|--|--|--|--|--|--|
-|00001111;11112222;00001111-aaaa-2222-bbbb-3333cccc4444 |2025-01-10 14:57:41.0000000 |2025-01-10 15:15:04.0000000|InProgress | 10387 |9391 |995 |1 |0 | Stream with ID '*****.csv' has a malformed CSV format*|MyDatabase|MyTable|
-|11112222;22223333;11110000-bbbb-2222-cccc-3333dddd4444 |2025-01-10 15:12:23.0000000 |2025-01-10 15:15:16.0000000|InProgress | 25635 |25489 |145 |1 |0 | Unknown error occurred: Exception of type 'System.Exception' was thrown|MyDatabase|MyOtherTable|
+|00001111;11112222;00001111-aaaa-2222-bbbb-3333cccc4444 |2025-01-10 14:57:41.0000000 |2025-01-10 15:15:04.0000000|InProgress | 10387 | 100547 |9391 |995 |1 |0 | Stream with ID '*****.csv' has a malformed CSV format*|MyDatabase|MyTable|
+|11112222;22223333;11110000-bbbb-2222-cccc-3333dddd4444 |2025-01-10 15:12:23.0000000 |2025-01-10 15:15:16.0000000|InProgress | 25635 | 3545613 |25489 |145 |1 |0 | Unknown error occurred: Exception of type 'System.Exception' was thrown|MyDatabase|MyOtherTable|