articles/stream-analytics/capture-event-hub-data-delta-lake.md (+7 −7)
@@ -30,30 +30,30 @@ Use the following steps to configure a Stream Analytics job to capture data in A
1. Enter a **name** to identify your Stream Analytics job. Select **Create**.
:::image type="content" source="./media/capture-event-hub-data-delta-lake/new-stream-analytics-job-name.png" alt-text="Screenshot showing the New Stream Analytics job window where you enter the job name." lightbox="./media/capture-event-hub-data-delta-lake/new-stream-analytics-job-name.png" :::
- 1. Specify the **Serialization** type of your data in the Event Hubs and the **Authentication method** that the job will use to connect to Event Hubs. Then select **Connect**.
+ 1. Specify the **Serialization** type of your data in the Event Hubs and the **Authentication method** that the job uses to connect to Event Hubs. Then select **Connect**.
- 1. When the connection is established successfully, you'll see:
+ 1. When the connection is established successfully, you see:
- Fields that are present in the input data. You can choose **Add field** or you can select the three dot symbol next to a field to optionally remove, rename, or change its name.
- A live sample of incoming data in the **Data preview** table under the diagram view. It refreshes periodically. You can select **Pause streaming preview** to view a static view of the sample input.
:::image type="content" source="./media/capture-event-hub-data-delta-lake/edit-fields.png" alt-text="Screenshot showing sample data under Data Preview." lightbox="./media/capture-event-hub-data-delta-lake/edit-fields.png" :::
1. Select the **Azure Data Lake Storage Gen2** tile to edit the configuration.
1. On the **Azure Data Lake Storage Gen2** configuration page, follow these steps:
-    1. Select the subscription, storage account name and container from the drop-down menu.
+    1. Select the subscription, storage account name, and container from the drop-down menu.
1. Once the subscription is selected, the authentication method and storage account key should be automatically filled in.
-    1. For **Delta table path**, it's used to specify the location and name of your Delta Lake table stored in Azure Data Lake Storage Gen2. You can choose to use one or more path segments to define the path to the delta table and the delta table name. To learn more, see to [Write to Delta Lake table (Public Preview)](./write-to-delta-lake.md).
+    1. Use **Delta table path** to specify the location and name of your Delta Lake table stored in Azure Data Lake Storage Gen2. You can use one or more path segments to define the path to the delta table and the delta table name. To learn more, see [Write to Delta Lake table](./write-to-delta-lake.md).
1. Select **Connect**.
:::image type="content" source="./media/capture-event-hub-data-delta-lake/blob-configuration.png" alt-text="First screenshot showing the Blob window where you edit a blob's connection configuration." lightbox="./media/capture-event-hub-data-delta-lake/blob-configuration.png" :::
- 1. When the connection is established, you'll see fields that are present in the output data.
+ 1. When the connection is established, you see fields that are present in the output data.
1. Select **Save** on the command bar to save your configuration.
1. Select **Start** on the command bar to start the streaming flow to capture data. Then in the Start Stream Analytics job window:
1. Choose the output start time.
1. Select the number of Streaming Units (SU) that the job runs with. SU represents the computing resources that are allocated to execute a Stream Analytics job. For more information, see [Streaming Units in Azure Stream Analytics](stream-analytics-streaming-unit-consumption.md).
:::image type="content" source="./media/capture-event-hub-data-delta-lake/start-job.png" alt-text="Screenshot showing the Start Stream Analytics job window where you set the output start time, streaming units, and error handling." lightbox="./media/capture-event-hub-data-delta-lake/start-job.png" :::
- 1. After you select **Start**, the job starts running within two minutes, and the metrics will be open in tab section below.
+ 1. After you select **Start**, the job starts running within two minutes, and the metrics open in the tab section as shown in the following image.
:::image type="content" source="./media/capture-event-hub-data-delta-lake/metrics-chart-in-tab-section.png" alt-text="Screenshot showing the metrics chart." lightbox="./media/capture-event-hub-data-delta-lake/metrics-chart-in-tab-section.png" :::
1. The new job can be seen on the **Stream Analytics jobs** tab.
@@ -63,7 +63,7 @@ Use the following steps to configure a Stream Analytics job to capture data in A
## Verify output
Verify that the parquet files with Delta lake format are generated in the Azure Data Lake Storage container.
- :::image type="content" source="./media/capture-event-hub-data-delta-lake/verify-captured-data.png" alt-text="Screenshot showing the generated Parquet files in the ADLS container." lightbox="./media/capture-event-hub-data-delta-lake/verify-captured-data.png" :::
+ :::image type="content" source="./media/capture-event-hub-data-delta-lake/verify-captured-data.png" alt-text="Screenshot showing the generated Parquet files in the Azure Data Lake Storage (ADLS) container." lightbox="./media/capture-event-hub-data-delta-lake/verify-captured-data.png" :::
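A quick way to confirm the capture is to read the Delta table back outside the portal. The following is a minimal sketch, assuming the `deltalake` Python package; the storage account, container, and Delta table path (a relative path inside the container, such as `folder1/delta/mydeltatable`) are hypothetical placeholders for the values configured in the job output:

```python
# Sketch: read the Delta table that the capture job wrote to ADLS Gen2.
# Assumptions: `pip install deltalake pandas`; all names below are placeholders.
from deltalake import DeltaTable

table_uri = (
    "abfss://mycontainer@mystorageaccount.dfs.core.windows.net/"
    "folder1/delta/mydeltatable"  # the Delta table path from the output configuration
)

# Authenticate with the storage account key (SAS or service principal also work).
dt = DeltaTable(table_uri, storage_options={"account_key": "<storage-account-key>"})

print(dt.version())           # current Delta table version
print(dt.files())             # underlying Parquet files generated by the job
print(dt.to_pandas().head())  # load a snapshot into pandas for inspection
```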
articles/stream-analytics/no-code-stream-processing.md (+14 −14)
@@ -45,16 +45,16 @@ A Stream Analytics job is built on three main components: _streaming inputs_, _t
To access the no-code editor for building your stream analytics job, there are two approaches:
- 1. **Through Azure Stream Analytics portal (preview)**: Create a Stream Analytics job, and then select the no-code editor in the **Get started** tab in **Overview** blade, or select **No-code editor** in the left panel.
+ 1. **Through Azure Stream Analytics portal (preview)**: Create a Stream Analytics job, and then select the no-code editor in the **Get started** tab in the **Overview** page, or select **No-code editor** in the left panel.
- :::image type="content" source="./media/no-code-stream-processing/no-code-on-asa-portal.png" alt-text="Screenshot that shows no-code on ASA portal locations." lightbox="./media/no-code-stream-processing/no-code-on-asa-portal.png" :::
+ :::image type="content" source="./media/no-code-stream-processing/no-code-on-asa-portal.png" alt-text="Screenshot that shows no-code on Azure Stream Analytics portal locations." lightbox="./media/no-code-stream-processing/no-code-on-asa-portal.png" :::
- 2. **Through Azure Event Hubs portal**: Open an Event Hubs instance. Select **Process Data**, and then select any pre-defined template.
+ 2. **Through Azure Event Hubs portal**: Open an Event Hubs instance. Select **Process Data**, and then select any predefined template.
:::image type="content" source="./media/no-code-stream-processing/new-stream-analytics-job.png" alt-text="Screenshot that shows selections to create a new Stream Analytics job." lightbox="./media/no-code-stream-processing/new-stream-analytics-job.png" :::
- The pre-defined templates can assist you in developing and running a job to address various scenarios, including:
+ The predefined templates can assist you in developing and running a job to address various scenarios, including:
- [Build real-time dashboard with Power BI dataset](./no-code-build-power-bi-dashboard.md)
- [Capture data from Event Hubs in Delta Lake format (preview)](./capture-event-hub-data-delta-lake.md)
@@ -66,14 +66,14 @@ To access the no-code editor for building your stream analytics job, there are t
- [Transform and store data to Azure SQL database](./no-code-transform-filter-ingest-sql.md)
- [Filter and ingest to Azure Data Explorer](./no-code-filter-ingest-data-explorer.md)
- The following screenshot shows a completed Stream Analytics job. It highlights all the sections available to you while you author.
+ The following screenshot shows a completed Stream Analytics job. It highlights all the sections available to you as you author.
:::image type="content" source="./media/no-code-stream-processing/created-stream-analytics-job.png" alt-text="Screenshot that shows the authoring interface sections." lightbox="./media/no-code-stream-processing/created-stream-analytics-job.png" :::
- 1. **Ribbon**: On the ribbon, sections follow the order of a classic analytics process: an event hub as input (also known as a data source), transformations (streaming ETL operations), outputs, a button to save your progress, and a button to start the job.
+ 1. **Ribbon**: On the ribbon, sections follow the order of a classic analytics process: an event hub as input (also known as a data source), transformations (streaming Extract, Transform, and Load operations), outputs, a button to save your progress, and a button to start the job.
2. **Diagram view**: This is a graphical representation of your Stream Analytics job, from input to operations to outputs.
- 3. **Side pane**: Depending on which component you selected in the diagram view, you'll have settings to modify input, transformation, or output.
- 4. **Tabs for data preview, authoring errors, runtime logs, and metrics**: For each tile, the data preview will show you results for that step (live for inputs; on demand for transformations and outputs). This section also summarizes any authoring errors or warnings that you might have in your job when it's being developed. Selecting each error or warning will select that transform. It also provides the job metrics for you to monitor the running job's health.
+ 3. **Side pane**: Depending on which component you selected in the diagram view, you see settings to modify input, transformation, or output.
+ 4. **Tabs for data preview, authoring errors, runtime logs, and metrics**: For each tile, the data preview shows you results for that step (live for inputs; on demand for transformations and outputs). This section also summarizes any authoring errors or warnings that you might have in your job when it's being developed. Selecting each error or warning selects that transform. It also provides the job metrics for you to monitor the running job's health.
## Streaming data input
@@ -102,22 +102,22 @@ If your event hub is in the Basic tier, you can use only the existing **$Default

- When you're connecting to the event hub, if you select **Managed Identity** as the authentication mode, the Azure Event Hubs Data Owner role will be granted to the managed identity for the Stream Analytics job. To learn more about managed identities for an event hub, see [Use managed identities to access an event hub from an Azure Stream Analytics job](event-hubs-managed-identity.md).
+ When you're connecting to the event hub, if you select **Managed Identity** as the authentication mode, the Azure Event Hubs Data Owner role is granted to the managed identity for the Stream Analytics job. To learn more about managed identities for an event hub, see [Use managed identities to access an event hub from an Azure Stream Analytics job](event-hubs-managed-identity.md).
Managed identities eliminate the limitations of user-based authentication methods. These limitations include the need to reauthenticate because of password changes or user token expirations that occur every 90 days.

After you set up your event hub's details and select **Connect**, you can add fields manually by using **+ Add field** if you know the field names. To instead detect fields and data types automatically based on a sample of the incoming messages, select **Autodetect fields**. Selecting the gear symbol allows you to edit the credentials if needed.
- When Stream Analytics jobs detect the fields, you'll see them in the list. You'll also see a live preview of the incoming messages in the **Data Preview** table under the diagram view.
+ When Stream Analytics jobs detect the fields, you see them in the list. You also see a live preview of the incoming messages in the **Data Preview** table under the diagram view.
#### Modify input data
You can edit the field names, remove a field, change the data type, or change the event time (**Mark as event time**: a TIMESTAMP BY clause if it's a datetime type field) by selecting the three-dot symbol next to each field. You can also expand, select, and edit any nested fields from the incoming messages, as shown in the following image.
> [!TIP]
- > This applies to the input data from Azure IoT Hub and ADLS Gen2 as well.
+ > This applies to the input data from Azure IoT Hub and Azure Data Lake Storage Gen2 as well.
:::image type="content" source="./media/no-code-stream-processing/event-hub-schema.png" alt-text="Screenshot that shows selections for adding, removing, and editing the fields for an event hub." lightbox="./media/no-code-stream-processing/event-hub-schema.png" :::
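For **Autodetect fields** to discover fields and for the **Data Preview** table to show live rows, the event hub needs incoming events. A minimal sketch of sending a few test events, assuming the `azure-eventhub` Python package; the connection string and event hub name are hypothetical placeholders:

```python
# Sketch: send a few JSON events so the no-code editor can detect fields.
# Assumptions: `pip install azure-eventhub`; names below are placeholders.
import json

from azure.eventhub import EventData, EventHubProducerClient

producer = EventHubProducerClient.from_connection_string(
    conn_str="<event-hubs-namespace-connection-string>",
    eventhub_name="<event-hub-name>",
)

with producer:
    batch = producer.create_batch()
    for device_id in range(3):
        # The JSON keys here become the fields the editor detects.
        event = {"deviceId": device_id, "temperature": 21.5 + device_id, "status": "ok"}
        batch.add(EventData(json.dumps(event)))
    producer.send_batch(batch)
```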
@@ -298,7 +298,7 @@ Managed identities eliminate the limitations of user-based authentication method
:::image type="content" source="./media/no-code-stream-processing/exactly-once-delivery-adls.png" alt-text="Screenshot that shows the exactly once configuration in ADLS Gen2 output." lightbox="./media/no-code-stream-processing/exactly-once-delivery-adls.png" :::
- **Write to Delta Lake table (preview)** is supported in the ADLS Gen2 as no code editor output. You can access this option in section **Serialization** in ADLS Gen2 configuration. For more information about this feature, see [Write to Delta Lake table (Public Preview)](./write-to-delta-lake.md).
+ **Write to Delta Lake table (preview)** is supported as a no-code editor output in ADLS Gen2. You can access this option in the **Serialization** section of the ADLS Gen2 configuration. For more information about this feature, see [Write to Delta Lake table](./write-to-delta-lake.md).
:::image type="content" source="./media/no-code-stream-processing/delta-lake-format-output-in-adls.png" alt-text="Screenshot that shows the delta lake configuration in ADLS Gen2 output." lightbox="./media/no-code-stream-processing/delta-lake-format-output-in-adls.png" :::
@@ -332,9 +332,9 @@ To configure Azure SQL Database as output, select **SQL Database** under the **O
For more information about Azure SQL Database output for a Stream Analytics job, see [Azure SQL Database output from Azure Stream Analytics](./sql-database-output.md).
- ### Event Hub
+ ### Event Hubs
- With the real-time data coming through event hub to ASA, no-code editor can transform, enrich the data and then output the data to another event hub as well. You can choose the **Event Hub** output when you configure your Azure Stream Analytics job.
+ With the real-time data coming through to ASA, the no-code editor can transform and enrich the data, and then output it to another event hub. You can choose the **Event Hubs** output when you configure your Azure Stream Analytics job.
To configure Event Hubs as output, select **Event Hub** under the **Outputs** section on the ribbon. Then fill in the information needed to connect to the event hub that you want to write data to.
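To confirm that transformed events are landing in the destination event hub, you can read them back. A minimal sketch, assuming the `azure-eventhub` Python package; the connection string and event hub name are hypothetical placeholders:

```python
# Sketch: consume events from the destination event hub the job writes to.
# Assumptions: `pip install azure-eventhub`; names below are placeholders.
from azure.eventhub import EventHubConsumerClient

client = EventHubConsumerClient.from_connection_string(
    conn_str="<destination-namespace-connection-string>",
    consumer_group="$Default",
    eventhub_name="<destination-event-hub-name>",
)

def on_event(partition_context, event):
    # Print each transformed event as it arrives.
    print(partition_context.partition_id, event.body_as_str())

with client:
    # starting_position="-1" reads from the beginning of each partition.
    client.receive(on_event=on_event, starting_position="-1")
```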