articles/stream-analytics/event-ordering.md (4 additions, 4 deletions)
@@ -6,7 +6,7 @@ ms.author: sidram
 ms.service: stream-analytics
 ms.topic: how-to
-ms.date: 08/06/2020
+ms.date: 08/26/2022
 ---
 # Configuring event ordering policies for Azure Stream Analytics
@@ -32,15 +32,15 @@ If events arrive late or out-of-order based on the policies you've configured, y

 Let us see an example of these policies in action.
 <br> **Late arrival policy:** 15 seconds
-<br> **Out-of-order policy:** 8 seconds
+<br> **Out-of-order policy:** 5 seconds

 | Event No. | Event Time | Arrival Time | System.Timestamp | Explanation |
 | --- | --- | --- | --- | --- |
 |**1**| 00:10:00 | 00:10:40 | 00:10:25 | Event arrived late and outside the tolerance level, so its event time is adjusted to the maximum late arrival tolerance (arrival time minus 15 seconds). |
 |**2**| 00:10:30 | 00:10:41 | 00:10:30 | Event arrived late but within the tolerance level, so its event time isn't adjusted. |
 |**3**| 00:10:42 | 00:10:42 | 00:10:42 | Event arrived on time. No adjustment needed. |
-|**4**| 00:10:38 | 00:10:43 | 00:10:38 | Event arrived out-of-order but within the tolerance of 8 seconds. So, event time doesn't get adjusted. For analytics purposes, this event will be considered as preceding event number 4. |
-|**5**| 00:10:35 | 00:10:45 | 00:10:37 | Event arrived out-of-order and outside tolerance of 8 seconds. So, event time is adjusted to maximum of out-of-order tolerance. |
+|**4**| 00:10:38 | 00:10:43 | 00:10:38 | Event arrived out-of-order but within the tolerance of 5 seconds, so its event time isn't adjusted. For analytics purposes, this event is considered to precede event number 3 (across all five events, the event-time order is 1, 2, 5, 4, 3). |
+|**5**| 00:10:35 | 00:10:45 | 00:10:37 | Event arrived out-of-order and outside the tolerance of 5 seconds, so its event time is adjusted to the out-of-order limit (the highest timestamp seen so far, 00:10:42, minus 5 seconds). |
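The adjustments in the table above can be reproduced with a short sketch of the two policies (an illustration of the semantics described here, not the actual Stream Analytics implementation):

```python
from datetime import datetime, timedelta

def adjust(events, late_tolerance, ooo_tolerance):
    """Compute System.Timestamp for (event_time, arrival_time) pairs.

    Events are processed in arrival order. `late_tolerance` and
    `ooo_tolerance` are timedeltas matching the two policies.
    """
    watermark = datetime.min  # highest timestamp assigned so far
    adjusted = []
    for event_time, arrival_time in events:
        ts = event_time
        # Late arrival policy: clamp to arrival time minus the tolerance.
        if arrival_time - ts > late_tolerance:
            ts = arrival_time - late_tolerance
        # Out-of-order policy: clamp to the watermark minus the tolerance.
        if watermark - ts > ooo_tolerance:
            ts = watermark - ooo_tolerance
        watermark = max(watermark, ts)
        adjusted.append(ts)
    return adjusted

def t(s):
    return datetime.strptime(s, "%H:%M:%S")

events = [
    (t("00:10:00"), t("00:10:40")),  # 1: late, outside tolerance
    (t("00:10:30"), t("00:10:41")),  # 2: late, within tolerance
    (t("00:10:42"), t("00:10:42")),  # 3: on time
    (t("00:10:38"), t("00:10:43")),  # 4: out of order, within 5 s
    (t("00:10:35"), t("00:10:45")),  # 5: out of order, outside 5 s
]
result = adjust(events, timedelta(seconds=15), timedelta(seconds=5))
print([ts.strftime("%H:%M:%S") for ts in result])
# → ['00:10:25', '00:10:30', '00:10:42', '00:10:38', '00:10:37']
```

The printed values match the System.Timestamp column of the table.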
articles/stream-analytics/no-code-stream-processing.md (30 additions, 26 deletions)
@@ -6,7 +6,7 @@ ms.author: sidram
 ms.service: stream-analytics
 ms.topic: how-to
 ms.custom: mvc, event-tier1-build-2022
-ms.date: 05/08/2022
+ms.date: 08/26/2022
 ---

 # No code stream processing using Azure Stream Analytics (Preview)
@@ -50,7 +50,7 @@ The following screenshot shows a finished Stream Analytics job. It highlights al
 1. **Ribbon** - On the ribbon, sections follow the order of a classic analytics process: Event Hubs as input (also known as data source), transformations (streaming ETL operations), outputs, a button to save your progress, and a button to start the job.
 2. **Diagram view** - A graphical representation of your Stream Analytics job, from input to operations to outputs.
 3. **Side pane** - Depending on which component you selected in the diagram view, you'll have settings to modify input, transformation, or output.
-4. **Tabs for data preview, authoring errors, and runtime errors** - For each tile shown, the data preview will show you results for that step (live for inputs and on-demand for transformations and outputs). This section also summarizes any authoring errors or warnings that you might have in your job when it's being developed. Selecting each error or warning will select that transform.
+4. **Tabs for data preview, authoring errors, runtime logs, and metrics** - For each tile shown, the data preview shows you results for that step (live for inputs and on demand for transformations and outputs). This section also summarizes any authoring errors or warnings that you might have in your job while it's being developed. Selecting an error or warning selects that transform. The section also provides job metrics so you can monitor a running job's health.

 ## Event Hubs as the streaming input
@@ -94,7 +94,7 @@ No-code editor now supports two reference data sources:

 Reference data is modeled as a sequence of blobs in ascending order of the date/time specified in the blob name. Blobs can only be added to the end of the sequence by using a date/time greater than the one specified by the last blob in the sequence. Blobs are defined in the input configuration. For more information, see [Use reference data from Blob Storage for a Stream Analytics job](stream-analytics-use-reference-data.md).

-First, you have to select your ADLS Gen2. To see details about each field, see Azure Blob Storage section in [Azure Blob Storage Reference data input](stream-analytics-use-reference-data#azure-blob-storage.md).
+First, select **Reference ADLS Gen2** under the **Inputs** section on the ribbon. To see details about each field, see the Azure Blob Storage section in [Azure Blob Storage Reference data input](stream-analytics-use-reference-data.md#azure-blob-storage).

 
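The blob-sequence rule described above (blobs in ascending date/time order, with a job using the latest blob dated at or before a given time) can be sketched as follows. The path pattern and blob names are hypothetical examples of a `{date}/{time}` layout:

```python
from datetime import datetime

# Hypothetical blob paths following a {date}/{time} pattern (YYYY/MM/DD/HH-mm),
# listed in the ascending order the reference-data input requires.
blobs = [
    "products/2022/08/26/00-00/ref.json",
    "products/2022/08/26/06-00/ref.json",
    "products/2022/08/26/12-00/ref.json",
]

def parse_blob_time(path):
    # Extract the date/time segments encoded in the blob path.
    parts = path.split("/")
    y, m, d, hm = parts[1], parts[2], parts[3], parts[4]
    return datetime.strptime(f"{y}/{m}/{d}/{hm}", "%Y/%m/%d/%H-%M")

def effective_blob(blobs, at):
    """Latest blob whose encoded date/time is at or before `at`."""
    candidates = [b for b in blobs if parse_blob_time(b) <= at]
    return max(candidates, key=parse_blob_time) if candidates else None

print(effective_blob(blobs, datetime(2022, 8, 26, 7, 30)))
# → products/2022/08/26/06-00/ref.json
```

A query at 07:30 uses the 06-00 blob; before the first blob's date/time, no reference data is available.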
@@ -104,9 +104,9 @@ Then, upload a JSON of array file and the fields in the file will be detected. U
 ### SQL Database as reference data

-Azure Stream Analytics supports Azure SQL Database as a source of input for reference data as well. For more information, see [Azure SQL Database Reference data input](stream-analytics-use-reference-data#azure-sql-database.md). You can use SQL Database as reference data for your Stream Analytics job in the no-code editor.
+Azure Stream Analytics also supports Azure SQL Database as a source of reference data. For more information, see [Azure SQL Database Reference data input](stream-analytics-use-reference-data.md#azure-sql-database). You can use SQL Database as reference data for your Stream Analytics job in the no-code editor.

-To configure SQL database as reference data input, simply select the "**Reference SQL Database**" under "**Input**" section in no-code editor ribbon.
+To configure SQL Database as a reference data input, select **Reference SQL Database** under the **Inputs** section on the ribbon.

 Then fill in the needed information to connect your reference database and select the table with your needed columns. You can also fetch the reference data from your table by editing the SQL query manually.
@@ -116,7 +116,9 @@ Then fill in the needed information to connect your reference database and selec

 Streaming data transformations are inherently different from batch data transformations. Almost all streaming data has a time component, which affects any data preparation tasks involved.

-To add a streaming data transformation to your job, select the transformation symbol on the ribbon for that transformation. The respective tile will be dropped in the diagram view. After you select it, you'll see the side pane for that transformation to configure it.
+To add a streaming data transformation to your job, select that transformation's symbol under the **Operations** section on the ribbon. The respective tile is dropped in the diagram view. After you select it, you'll see the side pane where you can configure that transformation.
+
+:::image type="content" source="./media/no-code-stream-processing/transformation-operations.png" alt-text="Screenshot showing the transformation operations." lightbox="./media/no-code-stream-processing/transformation-operations.png":::

 ### Filter
@@ -211,7 +213,7 @@ The no-code drag-and-drop experience currently supports several output sinks to

 Data Lake Storage Gen2 makes Azure Storage the foundation for building enterprise data lakes on Azure. It's designed from the start to service multiple petabytes of information while sustaining hundreds of gigabits of throughput. It allows you to easily manage massive amounts of data. Azure Blob storage offers a cost-effective and scalable solution for storing large amounts of unstructured data in the cloud.

-Select **ADLS Gen2** as output for your Stream Analytics job and select the container where you want to send the output of the job. For more information about Azure Data Lake Gen2 output for a Stream Analytics job, see [Blob storage and Azure Data Lake Gen2 output from Azure Stream Analytics](blob-storage-azure-data-lake-gen2-output.md).
+Select **ADLS Gen2** under the **Outputs** section on the ribbon as the output for your Stream Analytics job, and select the container where you want to send the output of the job. For more information about Azure Data Lake Gen2 output for a Stream Analytics job, see [Blob storage and Azure Data Lake Gen2 output from Azure Stream Analytics](blob-storage-azure-data-lake-gen2-output.md).

 When connecting to ADLS Gen2, if you choose ‘Managed Identity’ as the Authentication mode, the Storage Blob Data Contributor role is granted to the Managed Identity for the Stream Analytics job. To learn more about Managed Identity for ADLS Gen2, see [Storage Blob Managed Identity](blob-output-managed-identity.md). Managed identities eliminate the limitations of user-based authentication methods, like the need to reauthenticate because of password changes or user token expirations that occur every 90 days.
@@ -224,13 +226,13 @@ Azure Stream Analytics jobs can output to a dedicated SQL pool table in Azure Sy
 > [!IMPORTANT]
 > The dedicated SQL pool table must exist before you can add it as output to your Stream Analytics job. The table's schema must match the fields and their types in your job's output.

-Select **Synapse** as output for your Stream Analytics job and select the SQL pool table where you want to send the output of the job. For more information about Synapse output for a Stream Analytics job, see [Azure Synapse Analytics output from Azure Stream Analytics](azure-synapse-analytics-output.md).
+Select **Synapse** under the **Outputs** section on the ribbon as the output for your Stream Analytics job, and select the SQL pool table where you want to send the output of the job. For more information about Synapse output for a Stream Analytics job, see [Azure Synapse Analytics output from Azure Stream Analytics](azure-synapse-analytics-output.md).

 ### Azure Cosmos DB

 Azure Cosmos DB is a globally distributed database service that offers limitless elastic scale around the globe, rich query, and automatic indexing over schema-agnostic data models.

-Select **CosmosDB** as output for your Stream Analytics job. For more information about Cosmos DB output for a Stream Analytics job, see [Azure Cosmos DB output from Azure Stream Analytics](azure-cosmos-db-output.md).
+Select **CosmosDB** under the **Outputs** section on the ribbon as the output for your Stream Analytics job. For more information about Cosmos DB output for a Stream Analytics job, see [Azure Cosmos DB output from Azure Stream Analytics](azure-cosmos-db-output.md).

 When connecting to Azure Cosmos DB, if you choose ‘Managed Identity’ as the Authentication mode, the Contributor role is granted to the Managed Identity for the Stream Analytics job. To learn more about Managed Identity for Cosmos DB, see [Cosmos DB Managed Identity](cosmos-db-managed-identity.md). Managed identities eliminate the limitations of user-based authentication methods, like the need to reauthenticate because of password changes or user token expirations that occur every 90 days.
@@ -244,9 +246,11 @@ When connecting to Azure Cosmos DB, if you choose ‘Managed Identity’ as Auth
 > [!IMPORTANT]
 > The Azure SQL database table must exist before you can add it as output to your Stream Analytics job. The table's schema must match the fields and their types in your job's output.

-Select **SQL Database** as output for your Stream Analytics job. For more information about Azure SQL database output for a Stream Analytics job, see [Azure SQL Database output from Azure Stream Analytics](./sql-database-output.md).
+To configure SQL Database as output, select **SQL Database** under the **Outputs** section on the ribbon. Then fill in the needed information to connect your SQL database and select the table you want to write data to.
+
+For more information about Azure SQL Database output for a Stream Analytics job, see [Azure SQL Database output from Azure Stream Analytics](./sql-database-output.md).

-## Data preview, runtime logs and metrics
+## Data preview, authoring errors, runtime logs, and metrics

 The no code drag-and-drop experience provides tools to help you author, troubleshoot, and evaluate the performance of your analytics pipeline for streaming data.
@@ -305,13 +309,13 @@ You can save the job anytime while creating it. Once you have configured the eve

 :::image type="content" source="./media/no-code-stream-processing/no-code-save-start.png" alt-text="Screenshot showing the Save and Start options." lightbox="./media/no-code-stream-processing/no-code-save-start.png" :::

-- Output start time - When you start a job, you select a time for the job to start creating output.
-- Now - Makes the starting point of the output event stream the same as when the job is started.
-- Custom - You can choose the starting point of the output.
-- When last stopped - This option is available when the job was previously started but was stopped manually or failed. When you choose this option, the last output time will be used to restart the job, so no data is lost.
-- Streaming units - Streaming Units represent the amount of compute and memory assigned to the job while running. If you're unsure how many SUs to choose, we recommend that you start with three and adjust as needed.
-- Output data error handling - Output data error handling policies only apply when the output event produced by a Stream Analytics job doesn't conform to the schema of the target sink. You can configure the policy by choosing either **Retry** or **Drop**. For more information, see [Azure Stream Analytics output error policy](stream-analytics-output-error-policy.md).
-- Start - Starts the Stream Analytics job.
+- **Output start time** - When you start a job, you select a time for the job to start creating output.
+  - **Now** - Makes the starting point of the output event stream the same as when the job is started.
+  - **Custom** - You can choose the starting point of the output.
+  - **When last stopped** - This option is available when the job was previously started but was stopped manually or failed. When you choose this option, the last output time is used to restart the job, so no data is lost.
+- **Streaming units** - Streaming Units represent the amount of compute and memory assigned to the job while running. If you're unsure how many SUs to choose, we recommend that you start with three and adjust as needed.
+- **Output data error handling** - Output data error handling policies apply only when the output event produced by a Stream Analytics job doesn't conform to the schema of the target sink. You can configure the policy by choosing either **Retry** or **Drop**. For more information, see [Azure Stream Analytics output error policy](stream-analytics-output-error-policy.md).
+- **Start** - Starts the Stream Analytics job.

 :::image type="content" source="./media/no-code-stream-processing/start-job.png" alt-text="Screenshot showing the Start Stream Analytics job window where you review the job configuration and start the job." lightbox="./media/no-code-stream-processing/start-job.png" :::

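The **Retry**/**Drop** output-error policies mentioned above can be sketched as follows. The `sink_write` contract and the function names are hypothetical, and a real job's Retry policy retries indefinitely (blocking subsequent output) rather than capping attempts as this sketch does:

```python
def write_with_policy(event, sink_write, policy, max_attempts=3):
    """Illustrative sketch of the Retry/Drop output-error policies.

    `sink_write` is a hypothetical sink call that raises ValueError when
    the event doesn't conform to the target schema.
    """
    attempts = max_attempts if policy == "Retry" else 1
    for _ in range(attempts):
        try:
            sink_write(event)
            return True
        except ValueError:
            continue  # schema mismatch; retry or fall through
    return False  # event not written (Drop discards it; capped Retry gave up)

def strict_sink(event):
    # Hypothetical sink requiring an "id" field in every event.
    if "id" not in event:
        raise ValueError("schema mismatch")

print(write_with_policy({"id": 1}, strict_sink, "Drop"))      # → True
print(write_with_policy({"name": "x"}, strict_sink, "Drop"))  # → False
```

Under **Drop**, a nonconforming event is discarded after one failed write; under **Retry**, the job keeps attempting the same event.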
@@ -321,14 +325,14 @@ You can see the list of all Stream Analytics jobs created by no-code drag and dr

 :::image type="content" source="./media/no-code-stream-processing/jobs-list.png" alt-text="Screenshot showing the Stream Analytics job list where you review job status." lightbox="./media/no-code-stream-processing/jobs-list.png" :::

-- Filter - You can filter the list by job name.
-- Refresh - The list doesn't auto-refresh currently. Use the option to refresh the list and see the latest status.
-- Job name - The name you provided in the first step of job creation. You can't edit it. Select the job name to open the job in the no-code drag and drop experience where you can Stop the job, edit it, and Start it again.
-- Status - The status of the job. Select Refresh on top of the list to see the latest status.
-- Streaming units - The number of Streaming units selected when you started the job.
-- Output watermark - An indicator of liveliness for the data produced by the job. All events before the timestamp are already computed.
-- Job monitoring - Select **Open metrics** to see the metrics related to this Stream Analytics job. For more information about the metrics you can use to monitor your Stream Analytics job, see [Azure Stream Analytics job metrics](./stream-analytics-job-metrics.md).
-- Operations - Start, stop, or delete the job.
+- **Filter** - You can filter the list by job name.
+- **Refresh** - The list doesn't auto-refresh currently. Use the option to refresh the list and see the latest status.
+- **Job name** - The name you provided in the first step of job creation. You can't edit it. Select the job name to open the job in the no-code drag-and-drop experience, where you can Stop the job, edit it, and Start it again.
+- **Status** - The status of the job. Select Refresh at the top of the list to see the latest status.
+- **Streaming units** - The number of Streaming units selected when you started the job.
+- **Output watermark** - An indicator of liveliness for the data produced by the job. All events before the timestamp are already computed.
+- **Job monitoring** - Select **Open metrics** to see the metrics related to this Stream Analytics job. For more information about the metrics you can use to monitor your Stream Analytics job, see [Azure Stream Analytics job metrics](./stream-analytics-job-metrics.md).