
Commit 6e814ec
Author: Xu Jiang
Parent: 61dd027

update a few places with screenshots

4 files changed: +34 −30 lines

articles/stream-analytics/event-ordering.md (4 additions, 4 deletions)
@@ -6,7 +6,7 @@ ms.author: sidram
 
 ms.service: stream-analytics
 ms.topic: how-to
-ms.date: 08/06/2020
+ms.date: 08/26/2022
 ---
 # Configuring event ordering policies for Azure Stream Analytics
 
@@ -32,15 +32,15 @@ If events arrive late or out-of-order based on the policies you've configured, y
 
 Let us see an example of these policies in action.
 <br> **Late arrival policy:** 15 seconds
-<br> **Out-of-order policy:** 8 seconds
+<br> **Out-of-order policy:** 5 seconds
 
 | Event No. | Event Time | Arrival Time | System.Timestamp | Explanation |
 | --- | --- | --- | --- | --- |
 | **1** | 00:10:00 | 00:10:40 | 00:10:25 | Event arrived late and outside tolerance level. So event time gets adjusted to maximum late arrival tolerance. |
 | **2** | 00:10:30 | 00:10:41 | 00:10:30 | Event arrived late but within tolerance level. So event time doesn't get adjusted. |
 | **3** | 00:10:42 | 00:10:42 | 00:10:42 | Event arrived on time. No adjustment needed. |
-| **4** | 00:10:38 | 00:10:43 | 00:10:38 | Event arrived out-of-order but within the tolerance of 8 seconds. So, event time doesn't get adjusted. For analytics purposes, this event will be considered as preceding event number 4. |
-| **5** | 00:10:35 | 00:10:45 | 00:10:37 | Event arrived out-of-order and outside tolerance of 8 seconds. So, event time is adjusted to maximum of out-of-order tolerance. |
+| **4** | 00:10:38 | 00:10:43 | 00:10:38 | Event arrived out-of-order but within the tolerance of 5 seconds. So, event time doesn't get adjusted. For analytics purposes, this event will be considered as preceding event number 3 (considering all five events, the actual order is: 1, 2, 5, 4, 3). |
+| **5** | 00:10:35 | 00:10:45 | 00:10:37 | Event arrived out-of-order and outside tolerance of 5 seconds. So, event time is adjusted to maximum of out-of-order tolerance. |
 
 ## Can these settings delay output of my job?
 

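To make the two policies concrete, here's a minimal sketch (illustrative Python, not part of this commit and not the engine's actual implementation) that reproduces the System.Timestamp column of the table above, assuming late arrivals are clamped to arrival time minus the late-arrival tolerance and out-of-order events are clamped against the highest timestamp seen so far:

```python
from datetime import datetime, timedelta

LATE_ARRIVAL = timedelta(seconds=15)  # late arrival policy from the example
OUT_OF_ORDER = timedelta(seconds=5)   # out-of-order policy from the example

def system_timestamps(events):
    """events: (event_time, arrival_time) pairs in arrival order.
    Yields the adjusted System.Timestamp for each event."""
    max_seen = datetime.min  # highest timestamp observed so far
    for event_time, arrival_time in events:
        ts = event_time
        if arrival_time - ts > LATE_ARRIVAL:   # late beyond tolerance:
            ts = arrival_time - LATE_ARRIVAL   # clamp to max lateness
        if max_seen - ts > OUT_OF_ORDER:       # out-of-order beyond tolerance:
            ts = max_seen - OUT_OF_ORDER       # clamp to max disorder
        max_seen = max(max_seen, ts)
        yield ts

t = lambda s: datetime.strptime(s, "%H:%M:%S")
events = [(t("00:10:00"), t("00:10:40")), (t("00:10:30"), t("00:10:41")),
          (t("00:10:42"), t("00:10:42")), (t("00:10:38"), t("00:10:43")),
          (t("00:10:35"), t("00:10:45"))]
print([ts.time().isoformat() for ts in system_timestamps(events)])
# ['00:10:25', '00:10:30', '00:10:42', '00:10:38', '00:10:37']
```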
2 binary files changed (images: 1.69 KB and 21.7 KB)

articles/stream-analytics/no-code-stream-processing.md (30 additions, 26 deletions)
@@ -6,7 +6,7 @@ ms.author: sidram
 ms.service: stream-analytics
 ms.topic: how-to
 ms.custom: mvc, event-tier1-build-2022
-ms.date: 05/08/2022
+ms.date: 08/26/2022
 ---
 
 # No code stream processing using Azure Stream Analytics (Preview)
@@ -50,7 +50,7 @@ The following screenshot shows a finished Stream Analytics job. It highlights al
 1. **Ribbon** - On the ribbon, sections follow the order of a classic analytics process: Event Hubs as input (also known as data source), transformations (streaming ETL operations), outputs, a button to save your progress, and a button to start the job.
 2. **Diagram view** - A graphical representation of your Stream Analytics job, from input to operations to outputs.
 3. **Side pane** - Depending on which component you selected in the diagram view, you'll have settings to modify input, transformation, or output.
-4. **Tabs for data preview, authoring errors, and runtime errors** - For each tile shown, the data preview will show you results for that step (live for inputs and on-demand for transformations and outputs). This section also summarizes any authoring errors or warnings that you might have in your job when it's being developed. Selecting each error or warning will select that transform.
+4. **Tabs for data preview, authoring errors, runtime logs, and metrics** - For each tile shown, the data preview will show you results for that step (live for inputs and on-demand for transformations and outputs). This section also summarizes any authoring errors or warnings that you might have in your job when it's being developed. Selecting each error or warning will select that transform. It also provides the job metrics for you to monitor the running job's health.
 
 ## Event Hubs as the streaming input
 

@@ -94,7 +94,7 @@ No-code editor now supports two reference data sources:
 
 Reference data is modeled as a sequence of blobs in ascending order of the date/time specified in the blob name. Blobs can only be added to the end of the sequence by using a date/time greater than the one specified by the last blob in the sequence. Blobs are defined in the input configuration. For more information, see [Use reference data from Blob Storage for a Stream Analytics job](stream-analytics-use-reference-data.md).
 
-First, you have to select your ADLS Gen2. To see details about each field, see Azure Blob Storage section in [Azure Blob Storage Reference data input](stream-analytics-use-reference-data#azure-blob-storage.md).
+First, select **Reference ADLS Gen2** under the **Inputs** section on the ribbon. To see details about each field, see the Azure Blob Storage section in [Azure Blob Storage Reference data input](stream-analytics-use-reference-data.md#azure-blob-storage).
 
 ![Configure ADLS Gen2 as reference data input in no code editor](./media/no-code-stream-processing/blob-referencedata-nocode.png)
 

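The blob-sequence model in the context above implies a simple lookup rule: at any point in time, the reference data in effect is the latest blob whose date/time (encoded in its name) isn't in the future. A sketch of that rule (illustrative Python; the path pattern and blob names are hypothetical, not values from this doc):

```python
from datetime import datetime

# Hypothetical blob names following a {date}/{time} path pattern such as
# "products/{date}/{time}/products.json"; names are illustrative only.
blobs = [
    "products/2022-08-25/0000/products.json",
    "products/2022-08-25/1200/products.json",
    "products/2022-08-26/0000/products.json",
]

def blob_effective_time(name: str) -> datetime:
    # Parse the date/time segments encoded in the blob path.
    _, date_part, time_part, _ = name.split("/")
    return datetime.strptime(f"{date_part} {time_part}", "%Y-%m-%d %H%M")

def reference_blob_for(event_time: datetime) -> str:
    # The blob in effect is the last one whose encoded date/time is not
    # after the event time (the sequence is ascending, appended at the end).
    eligible = [b for b in blobs if blob_effective_time(b) <= event_time]
    return eligible[-1]

print(reference_blob_for(datetime(2022, 8, 25, 15, 30)))
# -> products/2022-08-25/1200/products.json
```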
@@ -104,9 +104,9 @@ Then, upload a JSON of array file and the fields in the file will be detected. U
 
 ### SQL Database as reference data
 
-Azure Stream Analytics supports Azure SQL Database as a source of input for reference data as well. For more information, see [Azure SQL Database Reference data input](stream-analytics-use-reference-data#azure-sql-database.md). You can use SQL Database as reference data for your Stream Analytics job in the no-code editor.
+Azure Stream Analytics supports Azure SQL Database as a source of input for reference data as well. For more information, see [Azure SQL Database Reference data input](stream-analytics-use-reference-data.md#azure-sql-database). You can use SQL Database as reference data for your Stream Analytics job in the no-code editor.
 
-To configure SQL database as reference data input, simply select the "**Reference SQL Database**" under "**Input**" section in no-code editor ribbon.
+To configure SQL Database as reference data input, select **Reference SQL Database** under the **Inputs** section on the ribbon.
 
 Then fill in the needed information to connect your reference database and select the table with your needed columns. You can also fetch the reference data from your table by editing the SQL query manually.

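The manually edited reference query mentioned in the hunk above is typically a plain snapshot SELECT over just the columns the job needs. As a sketch of previewing such a query outside the editor (illustrative Python using pyodbc; the server, database, table, and column names are hypothetical placeholders, not values from this doc):

```python
import pyodbc  # assumes the ODBC Driver for SQL Server is installed

# Illustrative only: a manual reference-data query selecting just the
# columns the job needs; table and column names are hypothetical.
query = "SELECT DeviceId, DeviceName, Threshold FROM dbo.DeviceReference"

conn = pyodbc.connect(
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=tcp:<your-server>.database.windows.net,1433;"
    "Database=<your-database>;Uid=<user>;Pwd=<password>;Encrypt=yes;"
)
for row in conn.cursor().execute(query):
    print(row)
```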
@@ -116,7 +116,9 @@ Then fill in the needed information to connect your reference database and selec
 
 Streaming data transformations are inherently different from batch data transformations. Almost all streaming data has a time component, which affects any data preparation tasks involved.
 
-To add a streaming data transformation to your job, select the transformation symbol on the ribbon for that transformation. The respective tile will be dropped in the diagram view. After you select it, you'll see the side pane for that transformation to configure it.
+To add a streaming data transformation to your job, select that transformation's symbol under the **Operations** section on the ribbon. The respective tile will be dropped in the diagram view. After you select it, you'll see the side pane for that transformation to configure it.
+
+:::image type="content" source="./media/no-code-stream-processing/transformation-operations.png" alt-text="Screenshot showing the transformation operations." lightbox="./media/no-code-stream-processing/transformation-operations.png" :::
 
 ### Filter
 

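The time component called out above is what distinguishes streaming transformations: aggregations run over time windows rather than over a whole table. A minimal sketch of a tumbling-window count (illustrative Python; the window size and events are made up, and this isn't the editor's implementation):

```python
from collections import defaultdict
from datetime import datetime, timedelta

WINDOW = timedelta(minutes=5)  # tumbling-window size, chosen for illustration

def tumbling_count(events):
    """events: iterable of (event_time, payload). Returns a count per
    5-minute tumbling window, keyed by the window's start time."""
    counts = defaultdict(int)
    for event_time, _payload in events:
        # Align the event to the start of the window that contains it.
        offset = (event_time - datetime.min) % WINDOW
        counts[event_time - offset] += 1
    return dict(counts)

events = [(datetime(2022, 8, 26, 0, 1), "a"),
          (datetime(2022, 8, 26, 0, 4), "b"),
          (datetime(2022, 8, 26, 0, 7), "c")]
print(tumbling_count(events))
# {00:00 window: 2 events, 00:05 window: 1 event}
```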
@@ -211,7 +213,7 @@ The no-code drag-and-drop experience currently supports several output sinks to
 
 Data Lake Storage Gen2 makes Azure Storage the foundation for building enterprise data lakes on Azure. It's designed from the start to service multiple petabytes of information while sustaining hundreds of gigabits of throughput. It allows you to easily manage massive amounts of data. Azure Blob storage offers a cost-effective and scalable solution for storing large amounts of unstructured data in the cloud.
 
-Select **ADLS Gen2** as output for your Stream Analytics job and select the container where you want to send the output of the job. For more information about Azure Data Lake Gen2 output for a Stream Analytics job, see [Blob storage and Azure Data Lake Gen2 output from Azure Stream Analytics](blob-storage-azure-data-lake-gen2-output.md).
+Select **ADLS Gen2** under the **Outputs** section on the ribbon as output for your Stream Analytics job and select the container where you want to send the output of the job. For more information about Azure Data Lake Gen2 output for a Stream Analytics job, see [Blob storage and Azure Data Lake Gen2 output from Azure Stream Analytics](blob-storage-azure-data-lake-gen2-output.md).
 
 When connecting to ADLS Gen2, if you choose ‘Managed Identity’ as Authentication mode, then the Storage Blob Data Contributor role will be granted to the Managed Identity for the Stream Analytics job. To learn more about Managed Identity for ADLS Gen2, see [Storage Blob Managed Identity](blob-output-managed-identity.md). Managed identities eliminate the limitations of user-based authentication methods, like the need to reauthenticate because of password changes or user token expirations that occur every 90 days.

@@ -224,13 +226,13 @@ Azure Stream Analytics jobs can output to a dedicated SQL pool table in Azure Sy
 > [!IMPORTANT]
 > The dedicated SQL pool table must exist before you can add it as output to your Stream Analytics job. The table's schema must match the fields and their types in your job's output.
 
-Select **Synapse** as output for your Stream Analytics job and select the SQL pool table where you want to send the output of the job. For more information about Synapse output for a Stream Analytics job, see [Azure Synapse Analytics output from Azure Stream Analytics](azure-synapse-analytics-output.md).
+Select **Synapse** under the **Outputs** section on the ribbon as output for your Stream Analytics job and select the SQL pool table where you want to send the output of the job. For more information about Synapse output for a Stream Analytics job, see [Azure Synapse Analytics output from Azure Stream Analytics](azure-synapse-analytics-output.md).
 
 ### Azure Cosmos DB
 
 Azure Cosmos DB is a globally distributed database service that offers limitless elastic scale around the globe, rich query, and automatic indexing over schema-agnostic data models.
 
-Select **CosmosDB** as output for your Stream Analytics job. For more information about Cosmos DB output for a Stream Analytics job, see [Azure Cosmos DB output from Azure Stream Analytics](azure-cosmos-db-output.md).
+Select **CosmosDB** under the **Outputs** section on the ribbon as output for your Stream Analytics job. For more information about Cosmos DB output for a Stream Analytics job, see [Azure Cosmos DB output from Azure Stream Analytics](azure-cosmos-db-output.md).
 
 When connecting to Azure Cosmos DB, if you choose ‘Managed Identity’ as Authentication mode, then the Contributor role will be granted to the Managed Identity for the Stream Analytics job. To learn more about Managed Identity for Cosmos DB, see [Cosmos DB Managed Identity](cosmos-db-managed-identity.md). Managed identities eliminate the limitations of user-based authentication methods, like the need to reauthenticate because of password changes or user token expirations that occur every 90 days.

@@ -244,9 +246,11 @@ When connecting to Azure Cosmos DB, if you choose ‘Managed Identity’ as Auth
 > [!IMPORTANT]
 > The Azure SQL database table must exist before you can add it as output to your Stream Analytics job. The table's schema must match the fields and their types in your job's output.
 
-Select **SQL Database** as output for your Stream Analytics job. For more information about Azure SQL database output for a Stream Analytics job, see [Azure SQL Database output from Azure Stream Analytics](./sql-database-output.md).
+To configure SQL Database as output, select **SQL Database** under the **Outputs** section on the editor ribbon. Then fill in the needed information to connect your SQL database and select the table you want to write data to.
+
+For more information about Azure SQL database output for a Stream Analytics job, see [Azure SQL Database output from Azure Stream Analytics](./sql-database-output.md).
 
-## Data preview, runtime logs and metrics
+## Data preview, authoring errors, runtime logs, and metrics
 
 The no code drag-and-drop experience provides tools to help you author, troubleshoot, and evaluate the performance of your analytics pipeline for streaming data.

@@ -305,13 +309,13 @@ You can save the job anytime while creating it. Once you have configured the eve
 
 :::image type="content" source="./media/no-code-stream-processing/no-code-save-start.png" alt-text="Screenshot showing the Save and Start options." lightbox="./media/no-code-stream-processing/no-code-save-start.png" :::
 
-- Output start time - When you start a job, you select a time for the job to start creating output.
-  - Now - Makes the starting point of the output event stream the same as when the job is started.
-  - Custom - You can choose the starting point of the output.
-  - When last stopped - This option is available when the job was previously started but was stopped manually or failed. When you choose this option, the last output time will be used to restart the job, so no data is lost.
-- Streaming units - Streaming Units represent the amount of compute and memory assigned to the job while running. If you're unsure how many SUs to choose, we recommend that you start with three and adjust as needed.
-- Output data error handling Output data error handling policies only apply when the output event produced by a Stream Analytics job doesn't conform to the schema of the target sink. You can configure the policy by choosing either **Retry** or **Drop**. For more information, see [Azure Stream Analytics output error policy](stream-analytics-output-error-policy.md).
-- Start Starts the Stream Analytics job.
+- **Output start time** - When you start a job, you select a time for the job to start creating output.
+  - **Now** - Makes the starting point of the output event stream the same as when the job is started.
+  - **Custom** - You can choose the starting point of the output.
+  - **When last stopped** - This option is available when the job was previously started but was stopped manually or failed. When you choose this option, the last output time will be used to restart the job, so no data is lost.
+- **Streaming units** - Streaming Units represent the amount of compute and memory assigned to the job while running. If you're unsure how many SUs to choose, we recommend that you start with three and adjust as needed.
+- **Output data error handling** - Output data error handling policies only apply when the output event produced by a Stream Analytics job doesn't conform to the schema of the target sink. You can configure the policy by choosing either **Retry** or **Drop**. For more information, see [Azure Stream Analytics output error policy](stream-analytics-output-error-policy.md).
+- **Start** - Starts the Stream Analytics job.
 
 :::image type="content" source="./media/no-code-stream-processing/start-job.png" alt-text="Screenshot showing the Start Stream Analytics job window where you review the job configuration and start the job." lightbox="./media/no-code-stream-processing/start-job.png" :::

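The **Retry** and **Drop** policies in the list above differ only in what happens to an event the sink rejects: Retry keeps attempting the write, Drop discards the event. A conceptual sketch (illustrative Python; `write_event` and the exception type are hypothetical stand-ins, not a Stream Analytics API):

```python
import time

def handle_output(write_event, event, policy="Retry", max_attempts=None):
    """Conceptual semantics of the output-data-error-handling policies."""
    attempt = 0
    while True:
        attempt += 1
        try:
            write_event(event)  # assumed to raise on a schema mismatch
            return True
        except ValueError:
            if policy == "Drop":
                return False    # nonconforming event is discarded
            if max_attempts and attempt >= max_attempts:
                raise           # give up after an optional cap
            time.sleep(min(2 ** attempt, 30))  # back off, then retry
```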
@@ -321,14 +325,14 @@ You can see the list of all Stream Analytics jobs created by no-code drag and dr
 
 :::image type="content" source="./media/no-code-stream-processing/jobs-list.png" alt-text="Screenshot showing the Stream Analytics job list where you review job status." lightbox="./media/no-code-stream-processing/jobs-list.png" :::
 
-- Filter You can filter the list by job name.
-- Refresh The list doesn't auto-refresh currently. Use the option to refresh the list and see the latest status.
-- Job name The name you provided in the first step of job creation. You can't edit it. Select the job name to open the job in the no-code drag and drop experience where you can Stop the job, edit it, and Start it again.
-- Status The status of the job. Select Refresh on top of the list to see the latest status.
-- Streaming units The number of Streaming units selected when you started the job.
-- Output watermark - An indicator of liveliness for the data produced by the job. All events before the timestamp are already computed.
-- Job monitoring Select **Open metrics** to see the metrics related to this Stream Analytics job. For more information about the metrics you can use to monitor your Stream Analytics job, see [Azure Stream Analytics job metrics](./stream-analytics-job-metrics.md).
-- Operations Start, stop, or delete the job.
+- **Filter** - You can filter the list by job name.
+- **Refresh** - The list doesn't auto-refresh currently. Use the option to refresh the list and see the latest status.
+- **Job name** - The name you provided in the first step of job creation. You can't edit it. Select the job name to open the job in the no-code drag and drop experience where you can Stop the job, edit it, and Start it again.
+- **Status** - The status of the job. Select Refresh on top of the list to see the latest status.
+- **Streaming units** - The number of Streaming units selected when you started the job.
+- **Output watermark** - An indicator of liveliness for the data produced by the job. All events before the timestamp are already computed.
+- **Job monitoring** - Select **Open metrics** to see the metrics related to this Stream Analytics job. For more information about the metrics you can use to monitor your Stream Analytics job, see [Azure Stream Analytics job metrics](./stream-analytics-job-metrics.md).
+- **Operations** - Start, stop, or delete the job.
 
 ## Next steps
 

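The **Output watermark** entry above can be pictured as the slowest point of progress across the job's inputs: everything with an earlier timestamp has already been computed. A conceptual sketch (illustrative Python with made-up numbers; not how the service actually derives the value):

```python
from datetime import datetime

# Hypothetical latest-processed event time per input partition.
partition_progress = {
    0: datetime(2022, 8, 26, 0, 10, 45),
    1: datetime(2022, 8, 26, 0, 10, 52),
    2: datetime(2022, 8, 26, 0, 10, 41),
}

# The job can only vouch for completeness up to its slowest partition,
# so the watermark here is the minimum of the per-partition progress.
output_watermark = min(partition_progress.values())
print(output_watermark)  # 2022-08-26 00:10:41
```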