articles/stream-analytics/capture-event-hub-data-delta-lake.md (2 additions & 2 deletions)
@@ -6,7 +6,7 @@ ms.author: xujiang1
 ms.service: stream-analytics
 ms.topic: how-to
 ms.custom: mvc, event-tier1-build-2022
-ms.date: 12/18/2022
+ms.date: 2/17/2023
 ---
 # Capture data from Event Hubs in Delta Lake format (preview)
@@ -40,7 +40,7 @@ Use the following steps to configure a Stream Analytics job to capture data in A
 1. On the **Azure Data Lake Storage Gen2** configuration page, follow these steps:
    1. Select the subscription, storage account name and container from the drop-down menu.
    1. Once the subscription is selected, the authentication method and storage account key should be automatically filled in.
-   1. For **Delta table path**, it's used to specify the location and name of your Delta Lake table stored in Azure Data Lake Storage Gen2. You can choose to use one or more path segments to define the path to the delta table and the delta table name. To learn more, see to [Write to Delta Lake table](./write-to-delta-lake.md).
+   1. For **Delta table path**, it's used to specify the location and name of your Delta Lake table stored in Azure Data Lake Storage Gen2. You can choose to use one or more path segments to define the path to the delta table and the delta table name. To learn more, see to [Write to Delta Lake table (Public Preview)](./write-to-delta-lake.md).
 1. Select **Connect**.

 :::image type="content" source="./media/capture-event-hub-data-delta-lake/blob-configuration.png" alt-text="First screenshot showing the Blob window where you edit a blob's connection configuration." lightbox="./media/capture-event-hub-data-delta-lake/blob-configuration.png" :::
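A note on the **Delta table path** setting referenced in this hunk: the value is one or more `/`-separated segments, where any leading segments become folders inside the selected container and the final segment is the Delta table name. As a purely illustrative sketch (these segment names are hypothetical, not taken from the linked article):

    factory-telemetry/2023/capture-table

Here `factory-telemetry/2023` would be the folder path created in the container, and `capture-table` the name of the Delta Lake table the capture writes to.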
articles/stream-analytics/job-diagram-with-metrics.md (4 additions & 4 deletions)
@@ -7,7 +7,7 @@ ms.author: xujiang1
 ms.service: stream-analytics
 ms.custom: ignite-2022
 ms.topic: conceptual
-ms.date: 12/8/2022
+ms.date: 2/17/2023
 ---

 # Stream Analytics job diagram (preview) in Azure portal
@@ -86,9 +86,9 @@ The processor diagram in physical job diagram visualizes the processor topology
 | --- | --- |
 |**Input** or **Output**| This processor is used for reading input or writing output data streams. |
 |**ReferenceData**| This processor is used for fetching the reference data. |
-|**Computing**| This processor is used for processing the stream data according to the query logic, for example, aggregating, filtering, grouping with window, etc.. To learn more about the stream data computation query functions, see [Azure Stream Analytics Query Language Reference](/stream-analytics-query/stream-analytics-query-language-reference). |
+|**Computing**| This processor is used for processing the stream data according to the query logic, for example, aggregating, filtering, grouping with window, etc. To learn more about the stream data computation query functions, see [Azure Stream Analytics Query Language Reference](/stream-analytics-query/stream-analytics-query-language-reference). |
 |**MarshallerUpstream** and **MarshallerDownstream**| When there's stream data interaction among streaming nodes, there will be two marshaller processors: 1). **MarshallerUpstream** for sending the data in the upstream streaming node and 2). **MarshallerDownstream** for receiving the data in the downstream streaming node. |
-|**Merger**| This processor is to receive the crossing-partition stream data, which were outputted from several upstream streaming nodes. The best practice to optimize job performance is to update query to remove the merger processor to make the job become parallel since the merger processor is the bottleneck of the job. The job diagram simulator feature within VSCode ASA extension can help you simulating your query locally when you optimizing your job query. To learn more, see [Optimize query using job diagram simulator (preview)](./optimize-query-using-job-diagram-simulator.md). |
+|**Merger**| This processor is to receive the crossing-partition stream data, which were outputted from several upstream streaming nodes. The best practice to optimize job performance is to update query to remove the merger processor to make the job become parallel since the merger processor is the bottleneck of the job. The job diagram simulator feature within Visual Studio Code ASA extension can help you simulating your query locally when you optimizing your job query. To learn more, see [Optimize query using job diagram simulator (preview)](./optimize-query-using-job-diagram-simulator.md). |
 |
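To make the **Computing** and **Merger** rows in the hunk above concrete: a Computing processor runs query logic such as the windowed aggregation sketched below, and keeping the aggregation partition-aligned (the `PARTITION BY` clause) is the kind of query change that can eliminate a Merger processor and keep the job parallel. The input and output names are hypothetical placeholders, not from the article:

    -- Minimal Stream Analytics query sketch: a tumbling-window aggregation.
    -- PARTITION BY keeps all processing within each input partition, which
    -- avoids cross-partition merging (the Merger processor).
    SELECT
        DeviceId,
        PartitionId,
        AVG(Temperature) AS AvgTemperature,
        System.Timestamp() AS WindowEnd
    INTO [my-output]
    FROM [my-input] TIMESTAMP BY EventTime PARTITION BY PartitionId
    GROUP BY DeviceId, PartitionId, TumblingWindow(minute, 5)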
@@ -97,7 +97,7 @@ The processor diagram in physical job diagram visualizes the processor topology

 * **Adapter type**: it shows the type of the input or output adapter. Stream Analytics supports various input sources and output destinations. Each input source or output destination has a dedicated adapter type. It's only available in input processor and output processor. For example, "InputBlob" represents the ADLS Gen2 input where the input processor receives the data from; "OutputDocumentDb" represents the Cosmos DB output where the output processor outputs the data to.

-To learn more details of the input and output types, see [Azure Stream Analytics inputs overview](./stream-analytics-define-inputs.md), and [Azure Stream Analytics outputs overview](./stream-analytics-define-outputs.md)
+To learn more details of the input and output types, see [Azure Stream Analytics inputs overview](./stream-analytics-define-inputs.md), and [Azure Stream Analytics outputs overview](./stream-analytics-define-outputs.md).

 * **Partition IDs**: it shows which partition IDs' data are being processed by this processor. It's only available in input processor and output processor.
 * **Serializer type**: it shows the type of the serialization. Stream Analytics supports several [serialization types](./stream-analytics-define-inputs.md). It's only available in input processor and output processor.