articles/stream-analytics/job-states.md (6 additions, 4 deletions)
@@ -5,17 +5,19 @@ author: xujxu
ms.author: xujiang1
ms.service: stream-analytics
ms.topic: conceptual
-ms.date: 06/21/2019
+ms.date: 12/21/2022
---
# Azure Stream Analytics job states

A Stream Analytics job could be in one of four states at any given time: running, stopped, degraded, or failed. You can find the state of your job on your Stream Analytics job's Overview page in the Azure portal.

+:::image type="content" source="./media/job-states/job-state.png" alt-text="Screenshot that shows job state." lightbox="./media/job-states/job-state.png":::
+
| State | Description | Recommended actions |
| --- | --- | --- |
-|**Running**| Your job is running on Azure reading events coming from the defined input sources, processing them and writing the results to the configured output sinks. | It is a best practice to track your job’s performance by monitoring [key metrics](./stream-analytics-job-metrics.md#scenarios-to-monitor). |
-|**Stopped**| Your job is stopped and does not process events. | NA |
-| **Degraded** | There might be intermittent issues with your input and output connections. These errors are called transient errors which might make your job enter a Degraded state. Stream Analytics will immediately try to recover from such errors and return to a Running state (within few minutes). These errors could happen due to network issues, availability of other Azure resources, deserialization errors etc. Your job’s performance may be impacted when job is in degraded state.| You can look at the [diagnostic or activity logs](./stream-analytics-job-diagnostic-logs.md#debugging-using-activity-logs) to learn more about the cause of these transient errors. In cases such as deserialization errors, it is recommended to take corrective action to ensure events aren't malformed. If the job keeps reaching the resource utilization limit, try to increase the SU number or [parallelize your job](./stream-analytics-parallelization.md). In other cases where you cannot take any action, Stream Analytics will try to recover to a *Running* state. <br> You can use [watermark delay](./stream-analytics-job-metrics.md#scenarios-to-monitor) metric to understand if these transient errors are impacting your job's performance.|
+|**Running**| Your job is running on Azure reading events coming from the defined input sources, processing them and writing the results to the configured output sinks. | It's a best practice to track your job’s performance by monitoring [key metrics](./stream-analytics-job-metrics.md#scenarios-to-monitor). |
+|**Stopped**| Your job is stopped and doesn't process events. | NA |
+| **Degraded** | There might be intermittent issues with your input and output connections. These errors are called transient errors that might make your job enter a Degraded state. Stream Analytics will immediately try to recover from such errors and return to a Running state (within few minutes). These errors could happen due to network issues, availability of other Azure resources, deserialization errors etc. Your job’s performance may be impacted when job is in degraded state.| You can look at the [diagnostic or activity logs](./stream-analytics-job-diagnostic-logs.md#debugging-using-activity-logs) to learn more about the cause of these transient errors. In cases such as deserialization errors, it's recommended to take corrective action to ensure events aren't malformed. If the job keeps reaching the resource utilization limit, try to increase the SU number or [parallelize your job](./stream-analytics-parallelization.md). In other cases where you can't take any action, Stream Analytics will try to recover to a *Running* state. <br> You can use [watermark delay](./stream-analytics-job-metrics.md#scenarios-to-monitor) metric to understand if these transient errors are impacting your job's performance.|
|**Failed**| Your job encountered a critical error resulting in a failed state. Events aren't read and processed. Runtime errors are a common cause for jobs ending up in a failed state. | You can [configure alerts](./stream-analytics-set-up-alerts.md#set-up-alerts-in-the-azure-portal) so that you get notified when job goes to Failed state. <br> <br>You can debug using [activity and resource logs](./stream-analytics-job-diagnostic-logs.md#debugging-using-activity-logs) to identify root cause and address the issue.|
articles/stream-analytics/manage-jobs-cluster.md (2 additions, 2 deletions)
@@ -6,7 +6,7 @@ ms.author: xujiang1
ms.service: stream-analytics
ms.topic: overview
ms.custom: mvc
-ms.date: 11/29/2021
+ms.date: 12/21/2022
---

# Add and Remove jobs in an Azure Stream Analytics cluster
@@ -25,7 +25,7 @@ Only existing Stream Analytics jobs can be added to clusters. Follow the quickst

-1. After you have added the job to the cluster, navigate to the job resource and [start the job](start-job.md#azure-portal). The job will then start to run on your cluster.
+1. After you've added the job to the cluster, navigate to the job resource and [start the job](start-job.md#azure-portal). The job will then start to run on your cluster.

You can do all other operations, such as monitoring, alerting, and diagnostic logs, from the Stream Analytics job resource page.
articles/stream-analytics/repartition.md (5 additions, 5 deletions)
@@ -1,10 +1,10 @@
---
title: Use repartitioning to optimize Azure Stream Analytics jobs
-description: This article describes how to use repartitioning to optimize Azure Stream Analytics jobs that cannot be parallelized.
+description: This article describes how to use repartitioning to optimize Azure Stream Analytics jobs that can't be parallelized.
ms.service: stream-analytics
author: xujxu
ms.author: xujiang1
-ms.date: 03/04/2021
+ms.date: 12/21/2022
ms.topic: conceptual
ms.custom: mvc
---
@@ -21,12 +21,12 @@ You might not be able to use [parallelization](stream-analytics-parallelization.
Repartitioning, or reshuffling, is required when you process data on a stream that's not sharded according to a natural input scheme, such as **PartitionId** for Event Hubs. When you repartition, each shard can be processed independently, which allows you to linearly scale out your streaming pipeline.

## How to repartition
-You can repartition your input in 2 ways:
+You can repartition your input in two ways:
1. Use a separate Stream Analytics job that does the repartitioning
2. Use a single job but do the repartitioning first before your custom analytics logic

### Creating a separate Stream Analytics job to repartition input
-You can create a job that reads input and writes to an Event Hub output using a partition key. This Event Hub can then serve as input for another Stream Analytics job where you implement your analytics logic. When configuring this Event Hub output in your job, you must specify the partition key by which Stream Analytics will repartition your data.
+You can create a job that reads input and writes to an event hub output using a partition key. This event hub can then serve as input for another Stream Analytics job where you implement your analytics logic. When configuring this event hub output in your job, you must specify the partition key by which Stream Analytics will repartition your data.
```sql
-- For compat level 1.2 or higher
SELECT *
@@ -73,7 +73,7 @@ Experiment and observe the resource usage of your job to determine the exact num

When your job uses SQL database for output, use explicit repartitioning to match the optimal partition count to maximize throughput. Since SQL works best with eight writers, repartitioning the flow to eight before flushing, or somewhere further upstream, may benefit job performance.

-When there are more than 8 input partitions, inheriting the input partitioning scheme might not be an appropriate choice. Consider using [INTO](/stream-analytics-query/into-azure-stream-analytics#into-shard-count) in your query to explicitly specify the number of output writers.
+When there are more than eight input partitions, inheriting the input partitioning scheme might not be an appropriate choice. Consider using [INTO](/stream-analytics-query/into-azure-stream-analytics#into-shard-count) in your query to explicitly specify the number of output writers.

The following example reads from the input, regardless of it being naturally partitioned, and repartitions the stream tenfold according to the DeviceID dimension and flushes the data to output.
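The example query itself falls outside the lines included in this diff. As a minimal sketch of the kind of query the hunk describes, the following reshuffles the stream on the DeviceID dimension with PARTITION BY and uses the INTO shard count to request ten output writers; the input and output names are placeholders, not values taken from the diff.

```sql
-- Sketch only (not part of the diff): input and output are placeholder names.
-- Repartition the stream tenfold on DeviceID before flushing to the output.
SELECT *
INTO output
FROM input
PARTITION BY DeviceID
INTO 10
```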
@@ -19,15 +19,15 @@ You can redeploy an Azure Stream Analytics job by exporting the Azure Resource M

Before you can export a template, you must first open an existing Stream Analytics job in Visual Studio Code.

-To export a job to a local project, locate the job you wish to export in the **Stream Analytics Explorer** in the Azure portal. From the **Query** page, select **Open in Visual Studio**. Then select **Visual Studio Code**.
+To export a job to a local project, locate the job you want to export in the **Stream Analytics Explorer** in the Azure portal. From the **Query** page, select **Open in VS Code**. And then select **Open job in Visual Studio Code**.

For more information on using Visual Studio Code to manage Stream Analytics jobs, see the [Visual Studio Code quickstart](quick-create-visual-studio-code.md).

## Compile the script

-The next step is to compile the job script to an Azure Resource Manager template. Before you compile the script, ensure that your job has at least one input and one output configured. If no input or output is configured, you need to configure the input and output first.
+The next step is to compile the job script to an Azure Resource Manager template. Before you compile the script, ensure that your job has at least one input, and one output configured. If no input or output is configured, you need to configure the input and output first.

1. In Visual Studio Code, navigate to your job's *Transformation.asaql* file.
You're ready to deploy your Azure Stream Analytics job using the Azure Resource Manager templates you generated in the previous section.

-In a PowerShell window, run the following command. Be sure to reaplce the *ResourceGroupName*, *TemplateFile*, and *TemplateParameterFile* with your actual resource group name, and the complete file paths to the *JobTemplate.json* and *JobTemplate.parameters.json* files in the **Deploy Folder** of your job workspace.
+In a PowerShell window, run the following command. Be sure to replace the *ResourceGroupName*, *TemplateFile*, and *TemplateParameterFile* with your actual resource group name, and the complete file paths to the *JobTemplate.json* and *JobTemplate.parameters.json* files in the **Deploy Folder** of your job workspace.

If you don't have Azure PowerShell configured, follow the steps in [Install Azure PowerShell module](/powershell/azure/install-Az-ps).
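The command referenced above is outside the lines included in this diff. As a rough sketch, a deployment with the standard New-AzResourceGroupDeployment cmdlet would look something like the following; the resource group name and file paths are placeholders to swap for your own values.

```powershell
# Sketch only: the resource group name and file paths below are placeholders.
New-AzResourceGroupDeployment `
  -ResourceGroupName "MyResourceGroup" `
  -TemplateFile "C:\ASAJob\Deploy\JobTemplate.json" `
  -TemplateParameterFile "C:\ASAJob\Deploy\JobTemplate.parameters.json"
```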
articles/stream-analytics/stream-analytics-monitoring.md (4 additions, 4 deletions)
@@ -5,7 +5,7 @@ author: xujxu
ms.author: xujiang1
ms.service: stream-analytics
ms.topic: how-to
-ms.date: 03/08/2021
+ms.date: 12/21/2022
ms.custom: seodec18
---
# Monitor Stream Analytics job with Azure portal
@@ -17,7 +17,7 @@ To see Azure Stream Analytics job metrics, browse to the Stream Analytics job yo

:::image type="content" source="./media/stream-analytics-monitoring/02-stream-analytics-monitoring-block.png" alt-text="Diagram that shows the Stream Analytics job monitoring section." lightbox="./media/stream-analytics-monitoring/02-stream-analytics-monitoring-block.png":::

-Alternatively, browse to the **Monitoring** blade in the left panel and click the **Metrics**, then the metric page will be shown for adding the specific metric you'd like to check:
+Alternatively, browse to the **Monitoring** blade in the left panel and select the **Metrics**, then the metric page will be shown for adding the specific metric you'd like to check:
@@ -29,15 +29,15 @@ You can also use these metrics to [monitor the performance of your Stream Analyt

There are several options available for you to operate and aggregate the metrics in portal monitor page.

-To check the metrics data for a specific dimension, you can use **Add filter**. There are 3 important metrics dimensions available. To learn more about the metric dimensions, see [Azure Stream Analytics metrics dimensions](./stream-analytics-job-metrics-dimensions.md).
+To check the metrics data for a specific dimension, you can use **Add filter**. There are three important metrics dimensions available. To learn more about the metric dimensions, see [Azure Stream Analytics metrics dimensions](./stream-analytics-job-metrics-dimensions.md).

-You can also specify the time range to view the metrics you are interested in.
+You can also specify the time range to view the metrics you're interested in.

:::image type="content" source="./media/stream-analytics-monitoring/08-stream-analytics-monitoring.png" alt-text="Diagram that shows the Stream Analytics monitor page with time range." lightbox="./media/stream-analytics-monitoring/08-stream-analytics-monitoring.png":::
0 commit comments