Commit 6492a9e

committed
added more features for no-code
1 parent 2f1e49a commit 6492a9e

File tree

4 files changed (+12, -4 lines)

articles/stream-analytics/capture-event-hub-data-delta-lake.md

Lines changed: 2 additions & 2 deletions

@@ -6,7 +6,7 @@ ms.author: xujiang1
 ms.service: stream-analytics
 ms.topic: how-to
 ms.custom: mvc, event-tier1-build-2022
-ms.date: 12/18/2022
+ms.date: 2/17/2023
 ---

 # Capture data from Event Hubs in Delta Lake format (preview)
@@ -40,7 +40,7 @@ Use the following steps to configure a Stream Analytics job to capture data in A
 1. On the **Azure Data Lake Storage Gen2** configuration page, follow these steps:
 1. Select the subscription, storage account name and container from the drop-down menu.
 1. Once the subscription is selected, the authentication method and storage account key should be automatically filled in.
-1. For **Delta table path**, it's used to specify the location and name of your Delta Lake table stored in Azure Data Lake Storage Gen2. You can choose to use one or more path segments to define the path to the delta table and the delta table name. To learn more, see to [Write to Delta Lake table](./write-to-delta-lake.md).
+1. Use **Delta table path** to specify the location and name of your Delta Lake table stored in Azure Data Lake Storage Gen2. You can use one or more path segments to define the path to the delta table and the delta table name. To learn more, see [Write to Delta Lake table (Public Preview)](./write-to-delta-lake.md).
 1. Select **Connect**.

 :::image type="content" source="./media/capture-event-hub-data-delta-lake/blob-configuration.png" alt-text="First screenshot showing the Blob window where you edit a blob's connection configuration." lightbox="./media/capture-event-hub-data-delta-lake/blob-configuration.png" :::
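The **Delta table path** field above is just a slash-separated list of path segments where the final segment is the Delta table name. A minimal sketch of how such a path resolves (the helper name is hypothetical, for illustration only; it is not part of the Stream Analytics API):

```python
def build_delta_table_path(*segments: str) -> str:
    """Join one or more path segments into a Delta table path,
    as entered in the 'Delta table path' field.
    Hypothetical helper for illustration only."""
    cleaned = [s.strip("/") for s in segments if s.strip("/")]
    if not cleaned:
        raise ValueError("at least one non-empty path segment is required")
    # The last segment is the Delta table name; earlier ones are folders.
    return "/".join(cleaned)

# Example: two folder segments plus the table name.
path = build_delta_table_path("europe", "2023", "sensor-data")
print(path)  # europe/2023/sensor-data
```

The resulting path is resolved relative to the container selected in the configuration page.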
2 image files changed (42.5 KB and 54.3 KB)

articles/stream-analytics/no-code-stream-processing.md

Lines changed: 10 additions & 2 deletions

@@ -6,7 +6,7 @@ ms.author: xujiang1
 ms.service: stream-analytics
 ms.topic: how-to
 ms.custom: mvc, event-tier1-build-2022, ignite-2022
-ms.date: 10/12/2022
+ms.date: 2/17/2023
 ---

 # No-code stream processing through Azure Stream Analytics
@@ -229,10 +229,18 @@ Under the **Outputs** section on the ribbon, select **ADLS Gen2** as the output

 When you're connecting to Azure Data Lake Storage Gen2, if you select **Managed Identity** as the authentication mode, then the Storage Blob Data Contributor role will be granted to the managed identity for the Stream Analytics job. To learn more about managed identities for Azure Data Lake Storage Gen2, see [Use managed identities to authenticate your Azure Stream Analytics job to Azure Blob Storage](blob-output-managed-identity.md).

-Managed identities eliminate the limitations of user-based authentication methods. These limitations include the need to reauthenticate because of password changes or user token expirations that occur every 90 days.
+Managed identities eliminate the limitations of user-based authentication methods. These limitations include the need to reauthenticate because of password changes or user token expirations that occur every 90 days.

 ![Screenshot that shows selecting managed identity as the authentication method for Azure Data Lake Storage Gen2](./media/no-code-stream-processing/msi-adls-nocode.png)

+**Exactly once delivery (preview)** is supported for ADLS Gen2 output in the no-code editor. You can enable it in the **Write mode** section of the ADLS Gen2 configuration. For more information about this feature, see [Exactly once delivery (preview) in Azure Data Lake Gen2](./blob-storage-azure-data-lake-gen2-output.md#exactly-once-delivery-public-preview).
+
+:::image type="content" source="./media/no-code-stream-processing/exactly-once-delivery-adls-gen2.png" alt-text="Screenshot that shows the exactly once configuration in ADLS Gen2 output." lightbox="./media/no-code-stream-processing/exactly-once-delivery-adls-gen2.png" :::
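Exactly-once delivery can be pictured as making output commits idempotent: each block of results is committed under a unique key, so a replay after a failure doesn't produce duplicates. A hedged, in-memory sketch under that assumption (class and key names are illustrative, not the actual Stream Analytics implementation):

```python
class ExactlyOnceWriter:
    """Illustrative sketch: deduplicate replayed writes by a
    (partition, epoch) commit key, so retries are idempotent."""

    def __init__(self):
        self.committed = {}          # (partition, epoch) -> rows

    def write(self, partition: int, epoch: int, rows: list) -> bool:
        key = (partition, epoch)
        if key in self.committed:    # replay after a failure: skip
            return False
        self.committed[key] = rows   # an atomic commit in a real system
        return True

writer = ExactlyOnceWriter()
assert writer.write(0, 1, ["a", "b"]) is True
assert writer.write(0, 1, ["a", "b"]) is False   # replayed write is ignored
total = sum(len(r) for r in writer.committed.values())
print(total)  # -> 2
```

Despite the retry, exactly two rows reach the output, which is the guarantee the **Write mode** setting enables.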
+
+**Write to Delta Lake table (preview)** is supported for ADLS Gen2 output in the no-code editor. You can access this option in the **Serialization** section of the ADLS Gen2 configuration. For more information about this feature, see [Write to Delta Lake table (Public Preview)](./write-to-delta-lake.md).
+
+:::image type="content" source="./media/no-code-stream-processing/delta-lake-format-output-in-adls-gen2.png" alt-text="Screenshot that shows the delta lake configuration in ADLS Gen2 output." lightbox="./media/no-code-stream-processing/delta-lake-format-output-in-adls-gen2.png" :::
+
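When Delta Lake is selected as the serialization format, the job writes Parquet data files plus a `_delta_log` transaction log inside the table folder. A heavily simplified stdlib sketch of the log's shape (illustration only; a real Delta writer also records `protocol`, `metaData`, and schema actions, and the data files are actual Parquet):

```python
import json
import os
import tempfile

def write_delta_commit(table_dir: str, version: int,
                       data_file: str, size: int) -> str:
    """Write one simplified Delta Lake commit entry to _delta_log.
    Illustration only, not a complete Delta writer."""
    log_dir = os.path.join(table_dir, "_delta_log")
    os.makedirs(log_dir, exist_ok=True)
    # Commit files are named by zero-padded 20-digit version number.
    commit = os.path.join(log_dir, f"{version:020d}.json")
    with open(commit, "w") as f:
        f.write(json.dumps({"add": {"path": data_file,
                                    "size": size,
                                    "dataChange": True}}) + "\n")
    return commit

with tempfile.TemporaryDirectory() as d:
    path = write_delta_commit(d, 0, "part-00000.parquet", 1024)
    print(os.path.basename(path))  # 00000000000000000000.json
```

Readers reconstruct the table state by replaying these versioned commits in order, which is what makes the format transactional.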
 ### Azure Synapse Analytics

 Azure Stream Analytics jobs can send output to a dedicated SQL pool table in Azure Synapse Analytics and can process throughput rates up to 200 MB per second. Stream Analytics supports the most demanding real-time analytics and hot-path data processing needs for workloads like reporting and dashboarding.
