
Commit e29796a

update some docs for job diagram

1 parent 6492a9e commit e29796a

8 files changed: +5 −3 lines changed

articles/stream-analytics/job-diagram-with-metrics.md

Lines changed: 2 additions & 2 deletions
@@ -7,7 +7,7 @@ ms.author: xujiang1
 ms.service: stream-analytics
 ms.custom: ignite-2022
 ms.topic: conceptual
-ms.date: 12/8/2022
+ms.date: 2/17/2023
 ---

 # Stream Analytics job diagram (preview) in Azure portal
@@ -97,7 +97,7 @@ The processor diagram in physical job diagram visualizes the processor topology

 * **Adapter type**: it shows the type of the input or output adapter. Stream Analytics supports various input sources and output destinations. Each input source or output destination has a dedicated adapter type. It's only available in input processor and output processor. For example, "InputBlob" represents the ADLS Gen2 input where the input processor receives the data from; "OutputDocumentDb" represents the Cosmos DB output where the output processor outputs the data to.

-To learn more details of the input and output types, see [Azure Stream Analytics inputs overview](./stream-analytics-define-inputs.md), and [Azure Stream Analytics outputs overview](./stream-analytics-define-outputs.md)
+To learn more details of the input and output types, see [Azure Stream Analytics inputs overview](./stream-analytics-define-inputs.md), and [Azure Stream Analytics outputs overview](./stream-analytics-define-outputs.md).

 * **Partition IDs**: it shows which partition IDs' data are being processed by this processor. It's only available in input processor and output processor.
 * **Serializer type**: it shows the type of the serialization. Stream Analytics supports several [serialization types](./stream-analytics-define-inputs.md). It's only available in input processor and output processor.
5 image files changed (315 Bytes, 38 Bytes, 199 Bytes, 234 Bytes, −1.75 KB); previews not loaded.

articles/stream-analytics/no-code-stream-processing.md

Lines changed: 2 additions & 0 deletions
@@ -13,6 +13,8 @@ ms.date: 2/17/2023

 You can process your real-time data streams in Azure Event Hubs by using Azure Stream Analytics. The no-code editor allows you to develop a Stream Analytics job without writing a single line of code. In minutes, you can develop and run a job that tackles many scenarios, including:

+- [Build real-time dashboard with Power BI dataset](./no-code-build-power-bi-dashboard.md)
+- [Capture data from Event Hubs in Delta Lake format (preview)](./capture-event-hub-data-delta-lake.md)
 - [Filtering and ingesting to Azure Synapse SQL](./filter-ingest-synapse-sql.md)
 - [Capturing your Event Hubs data in Parquet format in Azure Data Lake Storage Gen2](./capture-event-hub-data-parquet.md)
 - [Materializing data in Azure Cosmos DB](./no-code-materialize-cosmos-db.md)

articles/stream-analytics/stream-analytics-job-physical-diagram-with-metrics.md

Lines changed: 1 addition & 1 deletion
@@ -7,7 +7,7 @@ ms.author: xujiang1
 ms.service: stream-analytics
 ms.custom: ignite-2022
 ms.topic: how-to
-ms.date: 10/12/2022
+ms.date: 2/17/2023
 ---

 # Debug using the physical job diagram (preview) in Azure portal
