articles/stream-analytics/automation-powershell.md
---
title: Automatically pause an Azure Stream Analytics job with PowerShell
description: This article describes how to automatically pause an Azure Stream Analytics job on a schedule by using PowerShell.
ms.service: stream-analytics
ms.custom: devx-track-azurepowershell
ms.topic: how-to
ms.date: 11/03/2021
---
# Automatically pause a job by using PowerShell and Azure Functions or Azure Automation
Some applications require a stream processing approach (such as through [Azure Stream Analytics](./stream-analytics-introduction.md)) but don't strictly need to run continuously. The reasons are various:
The benefit of not running these jobs continuously is cost savings, because Stream Analytics jobs are [billed](https://azure.microsoft.com/pricing/details/stream-analytics/) per Streaming Unit over time.
This article explains how to set up automatic pausing for an Azure Stream Analytics job. You configure a task that automatically pauses and resumes a job on a schedule. The term *pause* means that the job [state](./job-states.md) is **Stopped** to avoid any billing.
This article discusses the overall design, the required components, and some implementation details.
> [!NOTE]
> There are downsides to automatically pausing a job. The main downsides are the loss of low-latency/real-time capabilities and the potential risks from allowing the input event backlog to grow unsupervised while a job is paused. Organizations shouldn't consider automatic pausing for most production scenarios that run at scale.
## Design
For the example in this article, you want your job to run for *N* minutes before pausing it for *M* minutes. When the job is paused, the input data isn't consumed and accumulates upstream. After the job starts, it catches up with that backlog and processes the data trickling in before it's shut down again.

When the job is running, the task shouldn't stop the job until its metrics are healthy. The metrics of interest are the input backlog and the [watermark](./stream-analytics-time-handling.md#background-time-concepts). You'll check that both are at their baseline for at least *N* minutes. This behavior translates to two actions:
### Scripting language
This article implements automatic pausing in [PowerShell](/powershell/scripting/overview). The first reason for this choice is that PowerShell is now cross-platform. It can run on any operating system, which makes deployments easier. The second reason is that it takes and returns objects rather than strings. Objects make parsing and processing easier for automation tasks.
In PowerShell, use the [Az PowerShell](/powershell/azure/new-azureps-module-az) module (which includes [Az.Monitor](/powershell/module/az.monitor/) and [Az.StreamAnalytics](/powershell/module/az.streamanalytics/)) for everything you need:
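As a hedged sketch of how those modules come together (the resource group and job names here are placeholders, not values from this article), the task can read the job state and pull the two health metrics, backlog and watermark delay, in a few calls:

```powershell
# Sketch only: assumes you're already signed in (Connect-AzAccount) and that
# "myResourceGroup" / "myAsaJob" are replaced with your own names.
$job = Get-AzStreamAnalyticsJob -ResourceGroupName "myResourceGroup" -Name "myAsaJob"
Write-Output "Current job state: $($job.JobState)"

# InputEventsSourcesBacklogged and OutputWatermarkDelaySeconds are the
# Stream Analytics metrics for input backlog and watermark delay.
$metrics = Get-AzMetric -ResourceId $job.Id `
    -MetricName "InputEventsSourcesBacklogged", "OutputWatermarkDelaySeconds" `
    -TimeGrain 00:01:00 `
    -AggregationType Maximum
```

Because these cmdlets return objects rather than strings, the task can inspect properties such as `$job.JobState` or the time series in `$metrics` directly, without any text parsing.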
The first step is to enable a *system-assigned managed identity* for the function, by following [this procedure](../app-service/overview-managed-identity.md?tabs=ps%2cportal&toc=/azure/azure-functions/toc.json).
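If you prefer scripting over the portal, the same identity can be enabled with the Az.Functions module. This is an optional sketch with placeholder names, not a step from the linked procedure:

```powershell
# Enable a system-assigned managed identity on the function app.
# "myResourceGroup" and "myFunctionApp" are placeholders for your own names.
Update-AzFunctionApp -ResourceGroupName "myResourceGroup" -Name "myFunctionApp" `
    -IdentityType SystemAssigned -Force
```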
Now you can grant the right permissions to that identity on the Stream Analytics job that you want to automatically pause. For this task, in the portal area for the Stream Analytics job (not the function one), in **Access control (IAM)**, add a role assignment to the role **Contributor** for a member of type **Managed Identity**. Select the name of the function from earlier.

|Parameter|Description|
|---|---|
|`maxWatermark`|The amount of watermark that you tolerate when stopping the job. In seconds, `10` is a good starting point at low Streaming Units.|
|`restartThresholdMinute`|*M*: The time (in minutes) until a stopped job is restarted.|
|`stopThresholdMinute`|*N*: The time (in minutes) of cooldown until a running job is stopped. The input backlog needs to stay at `0` during that time.|
|`subscriptionId`|The subscription ID (not the name) of the Stream Analytics job to be automatically paused.|
|`resourceGroupName`|The resource group name of the Stream Analytics job to be automatically paused.|
|`asaJobName`|The name of the Stream Analytics job to be automatically paused.|
You'll later need to update your PowerShell script to load the variables accordingly:
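In an Azure function, app settings surface as environment variables inside the PowerShell host, so the loading step might look like this sketch (the variable names mirror the parameters in the table above; the casts are an assumption for safety, not a requirement):

```powershell
# Load the app settings listed above from the function's environment.
$maxWatermark           = [int]    $env:maxWatermark
$restartThresholdMinute = [int]    $env:restartThresholdMinute
$stopThresholdMinute    = [int]    $env:stopThresholdMinute
$subscriptionId         = [string] $env:subscriptionId
$resourceGroupName      = [string] $env:resourceGroupName
$asaJobName             = [string] $env:asaJobName
```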
The Automation account should have received a managed identity during provisioning. But if necessary, you can enable a managed identity by using [this procedure](../automation/enable-managed-identity-for-automation.md).
Like you did for the function, you need to grant the right permissions on the Stream Analytics job that you want to automatically pause.
To grant the permissions, in the portal area for the Stream Analytics job (not the Automation page), in **Access control (IAM)**, add a role assignment to the role **Contributor** for a member of type **Managed Identity**. Select the name of the Automation account from earlier.
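The same role assignment can be scripted instead of done in the portal. This is a hedged sketch: `$principalId` stands for the object ID of the Automation account's managed identity, and the resource group and job names are placeholders:

```powershell
# Grant the managed identity Contributor rights scoped to the one job.
# Replace <subscription-id>, the resource group, and the job name with your own.
$scope = "/subscriptions/<subscription-id>/resourceGroups/myResourceGroup" +
         "/providers/Microsoft.StreamAnalytics/streamingjobs/myAsaJob"
New-AzRoleAssignment -ObjectId $principalId -RoleDefinitionName "Contributor" -Scope $scope
```

Scoping the assignment to the single streaming job, rather than the whole resource group, keeps the identity's permissions as narrow as the task requires.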
articles/stream-analytics/stream-analytics-troubleshoot-input.md

If the message payload is greater than 32 KB or is in binary format, run the *CheckMalformedEvents.cs* code available in the [GitHub samples repository](https://github.com/Azure/azure-stream-analytics/tree/master/Samples/CheckMalformedEventsEH). This code reads the partition ID offset and prints the data located in that offset.
Other common reasons for input deserialization errors are: