Commit aa81b91

Merge pull request #222088 from xujxu/update-doc-date-for-freshness ("update doc freshness")

2 parents d2f28ef + 505f26c

File tree: 7 files changed (+21 additions, −19 deletions)

articles/stream-analytics/job-states.md

Lines changed: 6 additions & 4 deletions
@@ -5,17 +5,19 @@ author: xujxu
ms.author: xujiang1
ms.service: stream-analytics
ms.topic: conceptual
-ms.date: 06/21/2019
+ms.date: 12/21/2022
---
# Azure Stream Analytics job states

A Stream Analytics job could be in one of four states at any given time: running, stopped, degraded, or failed. You can find the state of your job on your Stream Analytics job's Overview page in the Azure portal.

+:::image type="content" source="./media/job-states/job-state.png" alt-text="Screenshot that shows job state." lightbox="./media/job-states/job-state.png":::
+
| State | Description | Recommended actions |
| --- | --- | --- |
-| **Running** | Your job is running on Azure reading events coming from the defined input sources, processing them and writing the results to the configured output sinks. | It is a best practice to track your job’s performance by monitoring [key metrics](./stream-analytics-job-metrics.md#scenarios-to-monitor). |
-| **Stopped** | Your job is stopped and does not process events. | NA |
-| **Degraded** | There might be intermittent issues with your input and output connections. These errors are called transient errors which might make your job enter a Degraded state. Stream Analytics will immediately try to recover from such errors and return to a Running state (within few minutes). These errors could happen due to network issues, availability of other Azure resources, deserialization errors etc. Your job’s performance may be impacted when job is in degraded state.| You can look at the [diagnostic or activity logs](./stream-analytics-job-diagnostic-logs.md#debugging-using-activity-logs) to learn more about the cause of these transient errors. In cases such as deserialization errors, it is recommended to take corrective action to ensure events aren't malformed. If the job keeps reaching the resource utilization limit, try to increase the SU number or [parallelize your job](./stream-analytics-parallelization.md). In other cases where you cannot take any action, Stream Analytics will try to recover to a *Running* state. <br> You can use [watermark delay](./stream-analytics-job-metrics.md#scenarios-to-monitor) metric to understand if these transient errors are impacting your job's performance.|
+| **Running** | Your job is running on Azure reading events coming from the defined input sources, processing them and writing the results to the configured output sinks. | It's a best practice to track your job’s performance by monitoring [key metrics](./stream-analytics-job-metrics.md#scenarios-to-monitor). |
+| **Stopped** | Your job is stopped and doesn't process events. | NA |
+| **Degraded** | There might be intermittent issues with your input and output connections. These errors are called transient errors that might make your job enter a Degraded state. Stream Analytics will immediately try to recover from such errors and return to a Running state (within few minutes). These errors could happen due to network issues, availability of other Azure resources, deserialization errors etc. Your job’s performance may be impacted when job is in degraded state.| You can look at the [diagnostic or activity logs](./stream-analytics-job-diagnostic-logs.md#debugging-using-activity-logs) to learn more about the cause of these transient errors. In cases such as deserialization errors, it's recommended to take corrective action to ensure events aren't malformed. If the job keeps reaching the resource utilization limit, try to increase the SU number or [parallelize your job](./stream-analytics-parallelization.md). In other cases where you can't take any action, Stream Analytics will try to recover to a *Running* state. <br> You can use [watermark delay](./stream-analytics-job-metrics.md#scenarios-to-monitor) metric to understand if these transient errors are impacting your job's performance.|
| **Failed** | Your job encountered a critical error resulting in a failed state. Events aren't read and processed. Runtime errors are a common cause for jobs ending up in a failed state. | You can [configure alerts](./stream-analytics-set-up-alerts.md#set-up-alerts-in-the-azure-portal) so that you get notified when job goes to Failed state. <br> <br>You can debug using [activity and resource logs](./stream-analytics-job-diagnostic-logs.md#debugging-using-activity-logs) to identify root cause and address the issue.|

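The state shown on the Overview page can also be read programmatically. A minimal PowerShell sketch, assuming the Az.StreamAnalytics module is installed; the resource group and job names are placeholders:

```powershell
# Assumes the Az.StreamAnalytics module and an existing sign-in
# (Connect-AzAccount). "my-rg" and "my-job" are placeholder names.
$job = Get-AzStreamAnalyticsJob -ResourceGroupName "my-rg" -Name "my-job"

# JobState carries the same value the portal Overview page shows:
# Running, Stopped, Degraded, or Failed.
$job.JobState
```

A Degraded or Failed value here is the cue to check the diagnostic and activity logs listed in the table above.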
## Next steps

articles/stream-analytics/manage-jobs-cluster.md

Lines changed: 2 additions & 2 deletions
@@ -6,7 +6,7 @@ ms.author: xujiang1
ms.service: stream-analytics
ms.topic: overview
ms.custom: mvc
-ms.date: 11/29/2021
+ms.date: 12/21/2022
---

# Add and Remove jobs in an Azure Stream Analytics cluster
@@ -25,7 +25,7 @@ Only existing Stream Analytics jobs can be added to clusters. Follow the quickst

![Add job to cluster](./media/manage-jobs-cluster/add-job.png)

-1. After you have added the job to the cluster, navigate to the job resource and [start the job](start-job.md#azure-portal). The job will then start to run on your cluster.
+1. After you've added the job to the cluster, navigate to the job resource and [start the job](start-job.md#azure-portal). The job will then start to run on your cluster.

You can do all other operations, such as monitoring, alerting, and diagnostic logs, from the Stream Analytics job resource page.

(2 binary image files changed: +34 KB, −96.9 KB)

articles/stream-analytics/repartition.md

Lines changed: 5 additions & 5 deletions
@@ -1,10 +1,10 @@
---
title: Use repartitioning to optimize Azure Stream Analytics jobs
-description: This article describes how to use repartitioning to optimize Azure Stream Analytics jobs that cannot be parallelized.
+description: This article describes how to use repartitioning to optimize Azure Stream Analytics jobs that can't be parallelized.
ms.service: stream-analytics
author: xujxu
ms.author: xujiang1
-ms.date: 03/04/2021
+ms.date: 12/21/2022
ms.topic: conceptual
ms.custom: mvc
---
@@ -21,12 +21,12 @@ You might not be able to use [parallelization](stream-analytics-parallelization.
Repartitioning, or reshuffling, is required when you process data on a stream that's not sharded according to a natural input scheme, such as **PartitionId** for Event Hubs. When you repartition, each shard can be processed independently, which allows you to linearly scale out your streaming pipeline.

## How to repartition
-You can repartition your input in 2 ways:
+You can repartition your input in two ways:
1. Use a separate Stream Analytics job that does the repartitioning
2. Use a single job but do the repartitioning first before your custom analytics logic
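The second option, repartitioning up front inside a single job, might be sketched in Stream Analytics query language as follows. The input, output, and column names are hypothetical, and compatibility level 1.2 or higher is assumed:

```sql
-- Shuffle the stream on DeviceID first, then run the analytics logic
-- over the repartitioned stream within the same job.
WITH RepartitionedInput AS
(
    SELECT *
    FROM input
    PARTITION BY DeviceID
)
SELECT DeviceID, AVG(Reading) AS AvgReading
INTO output
FROM RepartitionedInput
GROUP BY DeviceID, TumblingWindow(minute, 1)
```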

### Creating a separate Stream Analytics job to repartition input
-You can create a job that reads input and writes to an Event Hub output using a partition key. This Event Hub can then serve as input for another Stream Analytics job where you implement your analytics logic. When configuring this Event Hub output in your job, you must specify the partition key by which Stream Analytics will repartition your data.
+You can create a job that reads input and writes to an event hub output using a partition key. This event hub can then serve as input for another Stream Analytics job where you implement your analytics logic. When configuring this event hub output in your job, you must specify the partition key by which Stream Analytics will repartition your data.
```sql
-- For compat level 1.2 or higher
SELECT *
@@ -73,7 +73,7 @@ Experiment and observe the resource usage of your job to determine the exact num

When your job uses SQL database for output, use explicit repartitioning to match the optimal partition count to maximize throughput. Since SQL works best with eight writers, repartitioning the flow to eight before flushing, or somewhere further upstream, may benefit job performance.

-When there are more than 8 input partitions, inheriting the input partitioning scheme might not be an appropriate choice. Consider using [INTO](/stream-analytics-query/into-azure-stream-analytics#into-shard-count) in your query to explicitly specify the number of output writers.
+When there are more than eight input partitions, inheriting the input partitioning scheme might not be an appropriate choice. Consider using [INTO](/stream-analytics-query/into-azure-stream-analytics#into-shard-count) in your query to explicitly specify the number of output writers.

The following example reads from the input, regardless of it being naturally partitioned, and repartitions the stream tenfold according to the DeviceID dimension and flushes the data to output.
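Such a query might look like the following sketch, with hypothetical input and output names; the shard count of 10 is set with the INTO clause:

```sql
-- Repartition into 10 shards keyed on DeviceID and write to output.
SELECT *
INTO output
FROM input
PARTITION BY DeviceID
INTO 10
```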

articles/stream-analytics/resource-manager-export.md

Lines changed: 4 additions & 4 deletions
@@ -6,7 +6,7 @@ author: xujxu
ms.author: xujiang1
ms.service: stream-analytics
ms.topic: how-to
-ms.date: 03/10/2020
+ms.date: 12/12/2022
---

# Export an Azure Stream Analytics job Azure Resource Manager template
@@ -19,15 +19,15 @@ You can redeploy an Azure Stream Analytics job by exporting the Azure Resource M

Before you can export a template, you must first open an existing Stream Analytics job in Visual Studio Code.

-To export a job to a local project, locate the job you wish to export in the **Stream Analytics Explorer** in the Azure portal. From the **Query** page, select **Open in Visual Studio**. Then select **Visual Studio Code**.
+To export a job to a local project, locate the job you want to export in the **Stream Analytics Explorer** in the Azure portal. From the **Query** page, select **Open in VS Code**. And then select **Open job in Visual Studio Code**.

![Open Stream Analytics job in Visual Studio Code](./media/resource-manager-export/open-job-vs-code.png)

For more information on using Visual Studio Code to manage Stream Analytics jobs, see the [Visual Studio Code quickstart](quick-create-visual-studio-code.md).

## Compile the script

-The next step is to compile the job script to an Azure Resource Manager template. Before you compile the script, ensure that your job has at least one input and one output configured. If no input or output is configured, you need to configure the input and output first.
+The next step is to compile the job script to an Azure Resource Manager template. Before you compile the script, ensure that your job has at least one input, and one output configured. If no input or output is configured, you need to configure the input and output first.

1. In Visual Studio Code, navigate to your job's *Transformation.asaql* file.

@@ -53,7 +53,7 @@ Next, complete the Azure Resource Manager template parameters file.

You're ready to deploy your Azure Stream Analytics job using the Azure Resource Manager templates you generated in the previous section.

-In a PowerShell window, run the following command. Be sure to reaplce the *ResourceGroupName*, *TemplateFile*, and *TemplateParameterFile* with your actual resource group name, and the complete file paths to the *JobTemplate.json* and *JobTemplate.parameters.json* files in the **Deploy Folder** of your job workspace.
+In a PowerShell window, run the following command. Be sure to replace the *ResourceGroupName*, *TemplateFile*, and *TemplateParameterFile* with your actual resource group name, and the complete file paths to the *JobTemplate.json* and *JobTemplate.parameters.json* files in the **Deploy Folder** of your job workspace.
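That command might look like the following sketch; the resource group name and file paths are placeholders to substitute with your own:

```powershell
# Placeholder values: use your resource group name and the full paths to
# the files generated in the Deploy folder of your job workspace.
New-AzResourceGroupDeployment `
  -ResourceGroupName "my-resource-group" `
  -TemplateFile "C:\asa\myjob\Deploy\JobTemplate.json" `
  -TemplateParameterFile "C:\asa\myjob\Deploy\JobTemplate.parameters.json"
```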
If you don't have Azure PowerShell configured, follow the steps in [Install Azure PowerShell module](/powershell/azure/install-Az-ps).

articles/stream-analytics/stream-analytics-monitoring.md

Lines changed: 4 additions & 4 deletions
@@ -5,7 +5,7 @@ author: xujxu
ms.author: xujiang1
ms.service: stream-analytics
ms.topic: how-to
-ms.date: 03/08/2021
+ms.date: 12/21/2022
ms.custom: seodec18
---
# Monitor Stream Analytics job with Azure portal
@@ -17,7 +17,7 @@ To see Azure Stream Analytics job metrics, browse to the Stream Analytics job yo

:::image type="content" source="./media/stream-analytics-monitoring/02-stream-analytics-monitoring-block.png" alt-text="Diagram that shows the Stream Analytics job monitoring section." lightbox="./media/stream-analytics-monitoring/02-stream-analytics-monitoring-block.png":::

-Alternatively, browse to the **Monitoring** blade in the left panel and click the **Metrics**, then the metric page will be shown for adding the specific metric you'd like to check:
+Alternatively, browse to the **Monitoring** blade in the left panel and select the **Metrics**, then the metric page will be shown for adding the specific metric you'd like to check:

:::image type="content" source="./media/stream-analytics-monitoring/01-stream-analytics-monitoring.png" alt-text="Diagram that shows Stream Analytics job monitoring dashboard." lightbox="./media/stream-analytics-monitoring/01-stream-analytics-monitoring.png":::

@@ -29,15 +29,15 @@ You can also use these metrics to [monitor the performance of your Stream Analyt

There are several options available for you to operate and aggregate the metrics in portal monitor page.

-To check the metrics data for a specific dimension, you can use **Add filter**. There are 3 important metrics dimensions available. To learn more about the metric dimensions, see [Azure Stream Analytics metrics dimensions](./stream-analytics-job-metrics-dimensions.md).
+To check the metrics data for a specific dimension, you can use **Add filter**. There are three important metrics dimensions available. To learn more about the metric dimensions, see [Azure Stream Analytics metrics dimensions](./stream-analytics-job-metrics-dimensions.md).

:::image type="content" source="./media/stream-analytics-monitoring/03-stream-analytics-monitoring-filter.png" alt-text="Diagram that shows Stream Analytics job metrics filter." lightbox="./media/stream-analytics-monitoring/03-stream-analytics-monitoring-filter.png":::

To check the metrics data per dimension, you can use **Apply splitting**.

:::image type="content" source="./media/stream-analytics-monitoring/04-stream-analytics-monitoring-splitter.png" alt-text="Diagram that shows Stream Analytics job metrics splitter." lightbox="./media/stream-analytics-monitoring/04-stream-analytics-monitoring-splitter.png":::

-You can also specify the time range to view the metrics you are interested in.
+You can also specify the time range to view the metrics you're interested in.

:::image type="content" source="./media/stream-analytics-monitoring/08-stream-analytics-monitoring.png" alt-text="Diagram that shows the Stream Analytics monitor page with time range." lightbox="./media/stream-analytics-monitoring/08-stream-analytics-monitoring.png":::
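The same metrics can also be pulled outside the portal, for example with the Az.Monitor module. A sketch with a placeholder resource ID; the metric name is an assumption you can verify with Get-AzMetricDefinition:

```powershell
# Placeholder resource ID. OutputWatermarkDelaySeconds is assumed to be
# the metric ID behind the portal's watermark delay chart; verify with
# Get-AzMetricDefinition -ResourceId $resourceId.
$resourceId = "/subscriptions/<subscription-id>/resourceGroups/my-rg" +
    "/providers/Microsoft.StreamAnalytics/streamingjobs/my-job"

Get-AzMetric -ResourceId $resourceId `
    -MetricName "OutputWatermarkDelaySeconds" `
    -TimeGrain 00:05:00 `
    -AggregationType Maximum
```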
0 commit comments
