
Commit 1dbc901

Merge pull request #227588 from xujxu/nocode-pbi-output-update
Nocode pbi output update
2 parents: 81ddb60 + d702ba4

33 files changed: +143 −24 lines

articles/event-hubs/TOC.yml

Lines changed: 3 additions & 1 deletion

@@ -72,7 +72,7 @@
   - name: Capture Event Hubs data in Parquet format
     href: ../stream-analytics/event-hubs-parquet-capture-tutorial.md?toc=/azure/event-hubs/TOC.json
     maintainContext: true
-  - name: Build real time dashboards with Power BI
+  - name: Build real-time dashboards with Power BI
     href: ../stream-analytics/no-code-power-bi-tutorial.md
     maintainContext: true
   - name: Process Apache Kafka for Event Hubs events using Stream analytics
@@ -210,6 +210,8 @@

   - name: Process data
     items:
+    - name: Build real-time dashboard with Power BI dataset
+      href: ../stream-analytics/no-code-build-power-bi-dashboard.md?toc=/azure/event-hubs/toc.json
     - name: Capture Event Hubs data in Delta Lake format
       href: ../stream-analytics/capture-event-hub-data-delta-lake.md?toc=/azure/event-hubs/toc.json
     - name: Capture Event Hubs data in Parquet format

articles/stream-analytics/TOC.yml

Lines changed: 3 additions & 1 deletion

@@ -37,7 +37,7 @@
     href: event-hubs-parquet-capture-tutorial.md
   - name: Write output data to delta table in ADLS Gen2
     href: write-to-delta-table-adls-gen2.md
-  - name: Build near-real time dashboards with no code editor
+  - name: Build real-time dashboards with no code editor
     href: no-code-power-bi-tutorial.md
   - name: Visualize fraudulent calls in Power BI
     href: stream-analytics-real-time-fraud-detection.md
@@ -235,6 +235,8 @@
     href: move-cluster.md
   - name: Build with no code editor
     items:
+    - name: Build real-time dashboard with Power BI dataset
+      href: no-code-build-power-bi-dashboard.md
    - name: Capture Event Hubs data in Delta Lake format
      href: capture-event-hub-data-delta-lake.md
    - name: Capture Event Hubs data in Parquet format

articles/stream-analytics/capture-event-hub-data-delta-lake.md

Lines changed: 2 additions & 2 deletions

@@ -6,7 +6,7 @@ ms.author: xujiang1
 ms.service: stream-analytics
 ms.topic: how-to
 ms.custom: mvc, event-tier1-build-2022
-ms.date: 12/18/2022
+ms.date: 2/17/2023
 ---
 # Capture data from Event Hubs in Delta Lake format (preview)

@@ -40,7 +40,7 @@ Use the following steps to configure a Stream Analytics job to capture data in A
 1. On the **Azure Data Lake Storage Gen2** configuration page, follow these steps:
    1. Select the subscription, storage account name and container from the drop-down menu.
    1. Once the subscription is selected, the authentication method and storage account key should be automatically filled in.
-   1. For **Delta table path**, it's used to specify the location and name of your Delta Lake table stored in Azure Data Lake Storage Gen2. You can choose to use one or more path segments to define the path to the delta table and the delta table name. To learn more, see to [Write to Delta Lake table](./write-to-delta-lake.md).
+   1. For **Delta table path**, it's used to specify the location and name of your Delta Lake table stored in Azure Data Lake Storage Gen2. You can choose to use one or more path segments to define the path to the delta table and the delta table name. To learn more, see to [Write to Delta Lake table (Public Preview)](./write-to-delta-lake.md).
 1. Select **Connect**.

 :::image type="content" source="./media/capture-event-hub-data-delta-lake/blob-configuration.png" alt-text="First screenshot showing the Blob window where you edit a blob's connection configuration." lightbox="./media/capture-event-hub-data-delta-lake/blob-configuration.png" :::
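An illustrative reading of the **Delta table path** setting described in the step above (the folder and table names here are hypothetical, not from the article): a path such as `warehouse/sales/orders` would store a Delta table named `orders` under the `warehouse/sales` folder segments in the selected Azure Data Lake Storage Gen2 container, while a single segment such as `orders` would place the table at the container root.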

articles/stream-analytics/job-diagram-with-metrics.md

Lines changed: 4 additions & 4 deletions

@@ -7,7 +7,7 @@ ms.author: xujiang1
 ms.service: stream-analytics
 ms.custom: ignite-2022
 ms.topic: conceptual
-ms.date: 12/8/2022
+ms.date: 2/17/2023
 ---

 # Stream Analytics job diagram (preview) in Azure portal
@@ -86,9 +86,9 @@ The processor diagram in physical job diagram visualizes the processor topology
 | --- | --- |
 | **Input** or **Output** | This processor is used for reading input or writing output data streams. |
 | **ReferenceData** | This processor is used for fetching the reference data. |
-| **Computing** | This processor is used for processing the stream data according to the query logic, for example, aggregating, filtering, grouping with window, etc.. To learn more about the stream data computation query functions, see [Azure Stream Analytics Query Language Reference](/stream-analytics-query/stream-analytics-query-language-reference). |
+| **Computing** | This processor is used for processing the stream data according to the query logic, for example, aggregating, filtering, grouping with window, etc. To learn more about the stream data computation query functions, see [Azure Stream Analytics Query Language Reference](/stream-analytics-query/stream-analytics-query-language-reference). |
 | **MarshallerUpstream** and **MarshallerDownstream** | When there's stream data interaction among streaming nodes, there will be two marshaller processors: 1). **MarshallerUpstream** for sending the data in the upstream streaming node and 2). **MarshallerDownstream** for receiving the data in the downstream streaming node. |
-| **Merger** | This processor is to receive the crossing-partition stream data, which were outputted from several upstream streaming nodes. The best practice to optimize job performance is to update query to remove the merger processor to make the job become parallel since the merger processor is the bottleneck of the job. The job diagram simulator feature within VSCode ASA extension can help you simulating your query locally when you optimizing your job query. To learn more, see [Optimize query using job diagram simulator (preview)](./optimize-query-using-job-diagram-simulator.md). |
+| **Merger** | This processor is to receive the crossing-partition stream data, which were outputted from several upstream streaming nodes. The best practice to optimize job performance is to update query to remove the merger processor to make the job become parallel since the merger processor is the bottleneck of the job. The job diagram simulator feature within Visual Studio Code ASA extension can help you simulating your query locally when you optimizing your job query. To learn more, see [Optimize query using job diagram simulator (preview)](./optimize-query-using-job-diagram-simulator.md). |

@@ -97,7 +97,7 @@ The processor diagram in physical job diagram visualizes the processor topology

 * **Adapter type**: it shows the type of the input or output adapter. Stream Analytics supports various input sources and output destinations. Each input source or output destination has a dedicated adapter type. It's only available in input processor and output processor. For example, "InputBlob" represents the ADLS Gen2 input where the input processor receives the data from; "OutputDocumentDb" represents the Cosmos DB output where the output processor outputs the data to.

-   To learn more details of the input and output types, see [Azure Stream Analytics inputs overview](./stream-analytics-define-inputs.md), and [Azure Stream Analytics outputs overview](./stream-analytics-define-outputs.md)
+   To learn more details of the input and output types, see [Azure Stream Analytics inputs overview](./stream-analytics-define-inputs.md), and [Azure Stream Analytics outputs overview](./stream-analytics-define-outputs.md).

 * **Partition IDs**: it shows which partition IDs' data are being processed by this processor. It's only available in input processor and output processor.
 * **Serializer type**: it shows the type of the serialization. Stream Analytics supports several [serialization types](./stream-analytics-define-inputs.md). It's only available in input processor and output processor.
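To make the Merger guidance in the table above concrete, here is a minimal sketch in Stream Analytics query language. The input, output, and column names are hypothetical and the queries are not taken from the article; they assume an Event Hubs input with multiple partitions and its `PartitionId` system column:

```sql
-- Two alternative versions of the same aggregation (deploy one, not both).

-- Version 1 (non-parallel): rows from every input partition must be funneled
-- through a single Merger processor before they can be grouped by DeviceId.
SELECT DeviceId, COUNT(*) AS EventCount
INTO [powerbi-output]
FROM [eventhub-input] TIMESTAMP BY EventEnqueuedUtcTime
GROUP BY DeviceId, TumblingWindow(minute, 5)

-- Version 2 (parallel): grouping by PartitionId as well lets each streaming
-- node aggregate its own partitions independently, removing the Merger.
SELECT PartitionId, DeviceId, COUNT(*) AS EventCount
INTO [powerbi-output]
FROM [eventhub-input] TIMESTAMP BY EventEnqueuedUtcTime
GROUP BY PartitionId, DeviceId, TumblingWindow(minute, 5)
```

The second form is the kind of rewrite the linked job diagram simulator is meant to help you verify locally before deploying.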
[Binary image files also changed in this commit; the diff viewer shows only their sizes: 315 Bytes, 38 Bytes, 199 Bytes, 234 Bytes, −1.75 KB, 39.7 KB.]
