
Commit e8567bb

Acrolynx
1 parent d3259c1 commit e8567bb

File tree

3 files changed: +4 −4 lines changed


articles/stream-analytics/capture-event-hub-data-parquet.md

Lines changed: 2 additions & 2 deletions
@@ -43,7 +43,7 @@ Use the following steps to configure a Stream Analytics job to capture data in A
     :::image type="content" source="./media/capture-event-hub-data-parquet/edit-fields.png" alt-text="Screenshot showing sample data under Data Preview." lightbox="./media/capture-event-hub-data-parquet/edit-fields.png" :::

 1. Select the **Azure Data Lake Storage Gen2** tile to edit the configuration.
 1. On the **Azure Data Lake Storage Gen2** configuration page, follow these steps:
-    1. Select the subscription, storage account name and container from the drop-down menu.
+    1. Select the subscription, storage account name, and container from the drop-down menu.
     1. Once the subscription is selected, the authentication method and storage account key should be automatically filled in.
 1. Select **Parquet** for **Serialization** format.

@@ -72,7 +72,7 @@ Use the following steps to configure a Stream Analytics job to capture data in A
 1. On the Event Hubs instance page for your event hub, select **Generate data**, select **Stocks data** for dataset, and then select **Send** to send some sample data to the event hub.
 1. Verify that the Parquet files are generated in the Azure Data Lake Storage container.

-    :::image type="content" source="./media/capture-event-hub-data-parquet/verify-captured-data.png" alt-text="Screenshot showing the generated Parquet files in the ADLS container." lightbox="./media/capture-event-hub-data-parquet/verify-captured-data.png" :::
+    :::image type="content" source="./media/capture-event-hub-data-parquet/verify-captured-data.png" alt-text="Screenshot showing the generated Parquet files in the Azure Data Lake Storage container." lightbox="./media/capture-event-hub-data-parquet/verify-captured-data.png" :::
 1. Select **Process data** on the left menu. Switch to the **Stream Analytics jobs** tab. Select **Open metrics** to monitor it.

     :::image type="content" source="./media/capture-event-hub-data-parquet/open-metrics-link.png" alt-text="Screenshot showing Open Metrics link selected." lightbox="./media/capture-event-hub-data-parquet/open-metrics-link.png" :::

articles/stream-analytics/no-code-filter-ingest-data-explorer.md

Lines changed: 1 addition & 1 deletion
@@ -22,7 +22,7 @@ This article describes how you can use the no code editor to easily create a Str
 1. In the [Azure portal](https://portal.azure.com), locate and select the Azure Event Hubs instance.
 1. Select **Features** > **Process Data** and then select **Start** on the **Filter and store data to Azure Data Explorer** card.

-    :::image type="content" source="./media/no-code-filter-ingest-data-explorer/event-hub-process-data-templates.png" alt-text="Screenshot showing the Filter and ingest to ADLS Gen2 card where you select Start." lightbox="./media/no-code-filter-ingest-data-explorer/event-hub-process-data-templates.png" :::
+    :::image type="content" source="./media/no-code-filter-ingest-data-explorer/event-hub-process-data-templates.png" alt-text="Screenshot showing the Filter and ingest to Azure Data Lake Storage Gen2 card where you select Start." lightbox="./media/no-code-filter-ingest-data-explorer/event-hub-process-data-templates.png" :::

 1. Enter a name for the Stream Analytics job, then select **Create**.

articles/stream-analytics/no-code-materialize-cosmos-db.md

Lines changed: 1 addition & 1 deletion
@@ -45,7 +45,7 @@ To start the job, you must specify:
 ## Next steps

-Now you know how to use the Stream Analytics no code editor to develop a job that reads from Event Hubs and calculates aggregates such as counts, averages and writes it to your Azure Cosmos DB resource.
+Now you know how to use the Stream Analytics no code editor to develop a job that reads from Event Hubs and calculates aggregates such as counts, averages, and writes it to your Azure Cosmos DB resource.

 * [Introduction to Azure Stream Analytics](stream-analytics-introduction.md)
 * [Monitor Stream Analytics job with Azure portal](./stream-analytics-monitoring.md)
