articles/stream-analytics/automation-powershell.md (16 additions, 16 deletions)
# Automatically pause a job by using PowerShell and Azure Functions or Azure Automation
Some applications require a stream processing approach (such as through [Azure Stream Analytics](./stream-analytics-introduction.md)) but don't strictly need to run continuously. The reasons include:
- Input data that arrives on a schedule (for example, top of the hour)
- A sparse or low volume of incoming data (few records per minute)
As an example, consider that *N* = 5 minutes and *M* = 10 minutes.
To restart the job, use the **When Last Stopped** [start option](./start-job.md#start-options). This option tells Stream Analytics to process all the events that were backlogged upstream since the job was stopped.
There are two caveats in this situation. First, the job can't stay stopped longer than the retention period of the input stream. If you run the job only once a day, make sure that the [retention period for events](/azure/event-hubs/event-hubs-faq#what-is-the-maximum-retention-period-for-events-) is more than one day. Second, the job needs to have been started at least once for the **When Last Stopped** mode to be accepted; a job that has never run has never been stopped. So the first run of a job needs to be manual, or you need to extend the script to cover that case.
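In a script, that first-run case can be handled with a fallback. Here's a minimal sketch, assuming the `Az.StreamAnalytics` module and that `LastOutputEventTime` is the output start mode that corresponds to **When Last Stopped** (`$resourceGroupName` and `$asaJobName` are placeholders):

```PowerShell
# Try to resume where the job left off; fall back to JobStartTime on a first run.
try {
    Start-AzStreamAnalyticsJob -ResourceGroupName $resourceGroupName -Name $asaJobName `
        -OutputStartMode 'LastOutputEventTime' -ErrorAction Stop
}
catch {
    # A job that has never been stopped rejects LastOutputEventTime.
    Start-AzStreamAnalyticsJob -ResourceGroupName $resourceGroupName -Name $asaJobName `
        -OutputStartMode 'JobStartTime'
}
```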
The last consideration is to make these actions idempotent. You can then repeat them at will with no side effects, for both ease of use and resiliency.
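As a sketch of what idempotent means here (assuming the `Az.StreamAnalytics` module), a stop action first checks the current state, so repeating it against an already-stopped job is a no-op:

```PowerShell
# Idempotent stop: act only when the job is actually running.
$job = Get-AzStreamAnalyticsJob -ResourceGroupName $resourceGroupName -Name $asaJobName
if ($job.JobState -eq 'Running') {
    Stop-AzStreamAnalyticsJob -ResourceGroupName $resourceGroupName -Name $asaJobName
}
```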
## Components
This article anticipates the need to interact with Stream Analytics on the following aspects:
- Get the current job status (Stream Analytics resource management).
- If the job is running:
  - Get the time since the job started (logs).
  - Get the current metric values (metrics).
  - If applicable, stop the job (Stream Analytics resource management).
- If the job is stopped:
  - Get the time since the job stopped (logs).
  - If applicable, start the job (Stream Analytics resource management).
For Stream Analytics resource management, you can use the [REST API](/rest/api/streamanalytics/), the [.NET SDK](/dotnet/api/microsoft.azure.management.streamanalytics), or one of the CLI libraries ([Azure CLI](/cli/azure/stream-analytics) or [PowerShell](/powershell/module/az.streamanalytics)).
For metrics and logs, everything in Azure is centralized under [Azure Monitor](../azure-monitor/overview.md), with a similar choice of API surfaces. Logs and metrics are always 1 to 3 minutes behind when you're querying the APIs. So setting *N* at 5 usually means the job runs 6 to 8 minutes in reality.
Another consideration is that metrics are always emitted. When the job is stopped, the API returns empty records. You have to clean up the output of your API calls to focus on relevant values.
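As a sketch of that cleanup (assuming the `Az.Monitor` module; the metric name and aggregation shown are illustrative, and `$asaJobResourceId` is a placeholder for the job's resource ID):

```PowerShell
# Fetch a job metric and keep only the data points that actually carry a value.
$metric = Get-AzMetric -ResourceId $asaJobResourceId -MetricName 'InputEventsSourcesBacklogged' `
    -TimeGrain 00:01:00 -AggregationType Maximum
$validPoints = $metric.Data | Where-Object { $null -ne $_.Maximum }
```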
### Scripting language
In PowerShell, use the [Az PowerShell](/powershell/azure/new-azureps-module-az) module.
To host your PowerShell task, you need a service that offers scheduled runs. There are many options, but here are two serverless ones:
- [Azure Functions](../azure-functions/functions-overview.md), a compute engine that can run almost any piece of code. It offers a [timer trigger](../azure-functions/functions-bindings-timer.md?tabs=csharp) that can fire as often as every second.
- [Azure Automation](../automation/overview.md), a managed service for operating cloud workloads and resources. Its purpose is appropriate, but its minimum schedule interval is 1 hour (less with [workarounds](../automation/shared-resources/schedules.md#schedule-runbooks-to-run-more-frequently)).
If you don't mind the workarounds, Azure Automation is the easier way to deploy the task. But in this article, you write a local script first so you can compare. After you have a functioning script, you deploy it both in Functions and in an Automation account.
After you provision the function app, start with its overall configuration.
### Managed identity for Azure Functions
The function needs permissions to start and stop the Stream Analytics job. You assign these permissions by using a [managed identity](../active-directory/managed-identities-azure-resources/overview.md).
The first step is to enable a *system-assigned managed identity* for the function, by following [this procedure](../app-service/overview-managed-identity.md?tabs=ps%2cportal&toc=/azure/azure-functions/toc.json).
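For illustration, granting that identity access could look like the following sketch. The role and scope here are assumptions (follow the linked procedure for the exact permissions), and `$functionResourceGroup`, `$functionAppName`, and `$subscriptionId` are placeholders:

```PowerShell
# Look up the function app's system-assigned identity and grant it rights on the job.
$principalId = (Get-AzWebApp -ResourceGroupName $functionResourceGroup -Name $functionAppName).Identity.PrincipalId
New-AzRoleAssignment -ObjectId $principalId -RoleDefinitionName 'Contributor' `
    -Scope "/subscriptions/$subscriptionId/resourceGroups/$resourceGroupName/providers/Microsoft.StreamAnalytics/streamingjobs/$asaJobName"
```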
|App setting|Description|
|-|-|
|`resourceGroupName`|The resource group name of the Stream Analytics job to be automatically paused.|
|`asaJobName`|The name of the Stream Analytics job to be automatically paused.|
Then, update your PowerShell script to load the variables accordingly:
```PowerShell
# Load the app settings defined earlier into script variables.
$maxInputBacklog = $env:maxInputBacklog
$resourceGroupName = $env:resourceGroupName
$asaJobName = $env:asaJobName
```
The same way that you had to install Azure PowerShell locally to use the Stream Analytics commands (like `Start-AzStreamAnalyticsJob`), you need to [add it to the function app host](../azure-functions/functions-reference-powershell.md?tabs=portal#dependency-management):
1. On the page for the function app, under **Functions**, select **App files**, and then select **requirements.psd1**.
1. Uncomment the line `'Az' = '6.*'`.
1. To make that change take effect, restart the app.
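After you uncomment it, the relevant part of *requirements.psd1* looks similar to this:

```PowerShell
# requirements.psd1
@{
    # Pin the Az module family to major version 6.
    'Az' = '6.*'
}
```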
Then, in **Code + Test**, you can copy your script into *run.ps1* and test it.

You can check that everything runs fine by selecting **Test/Run** on the **Code + Test** pane. You can also check the **Monitor** pane, but it's always late by a couple of executions.

### Setting an alert on the function execution
Finally, you want to be notified via an alert if the function doesn't run successfully. Alerts have a minor cost, but they might prevent more expensive situations.
On the page for the function app, under **Logs**, run the following query. It returns all unsuccessful runs in the last 5 minutes.
```SQL
requests
| where success == false
| where timestamp > ago(5m)
```
You can now paste your script and test it.
You can check that everything is wired properly on the **Test** pane.
After that, you need to publish the job (by selecting **Publish**) so that you can link the runbook to a schedule. Creating and linking the schedule is a straightforward process. Now is a good time to remember that there are [workarounds](../automation/shared-resources/schedules.md#schedule-runbooks-to-run-more-frequently) to achieve schedule intervals under 1 hour.
Finally, you can set up an alert. The first step is to enable logs by using the [diagnostic settings](../azure-monitor/essentials/create-diagnostic-settings.md?tabs=cli) of the Automation account. The second step is to capture errors by using a query like you did for Functions.
## Outcome
And here are the metrics:

After you understand the script, reworking it to extend its scope is a straightforward task. You can easily update the script to target a list of jobs instead of a single one. You can define and process larger scopes by using tags, resource groups, or even entire subscriptions.
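For example, a sketch of the multiple-job variant could enumerate jobs in a resource group (assuming the `Az.StreamAnalytics` module; `Invoke-AutoPauseLogic` is a hypothetical wrapper that stands in for this article's pause/restart logic):

```PowerShell
# Apply the same pause/restart logic to every job in a resource group.
$jobs = Get-AzStreamAnalyticsJob -ResourceGroupName $resourceGroupName
foreach ($job in $jobs) {
    Invoke-AutoPauseLogic -ResourceGroupName $resourceGroupName -JobName $job.Name
}
```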
articles/stream-analytics/stream-analytics-real-time-event-processing-reference-architecture.md (2 additions, 2 deletions)
Traditionally, analytics solutions are based on capabilities such as ETL (extract, transform, load) and data warehousing, where data is stored before analysis. Changing requirements, including more rapidly arriving data, are pushing this existing model to the limit.
The ability to analyze data within moving streams before storage is one solution. Although this approach isn't new, it hasn't been widely adopted across industry verticals.
Microsoft Azure provides an extensive catalog of analytics technologies that can support an array of solution scenarios and requirements. Selecting which Azure services to deploy for an end-to-end solution can be a challenge, considering the breadth of offerings.
This reference describes the capabilities and interoperation of the Azure services that support an event-streaming solution. It also explains some of the scenarios in which customers can benefit from this type of approach.
articles/stream-analytics/stream-analytics-troubleshoot-input.md (8 additions, 6 deletions)
# Troubleshoot input connections
This article describes common problems with Azure Stream Analytics input connections, how to troubleshoot those problems, and how to correct them.
Many troubleshooting steps require you to turn on resource logs for your Stream Analytics job. If you don't have resource logs turned on, see [Troubleshoot Azure Stream Analytics by using resource logs](stream-analytics-job-diagnostic-logs.md).
## Job doesn't receive input events
3. Ensure that you selected a time range in the input preview. Choose **Select time range**, and then enter a sample duration before testing your query.
> [!IMPORTANT]
> For [Azure Stream Analytics jobs](./run-job-in-virtual-network.md) that aren't network injected, don't rely on the source IP address of connections coming from Stream Analytics in any way. They can be public or private IPs, depending on service infrastructure operations that happen from time to time.
## Malformed input events cause deserialization errors
Deserialization problems happen when the input stream of your Stream Analytics job contains malformed messages. For example, a missing parenthesis or brace in a JSON object, or an incorrect time-stamp format in the time field, can cause a malformed message.
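For example, this hypothetical event is malformed because its closing brace is missing:

```json
{"deviceId": "sensor-1", "temperature": 21.5, "eventTime": "2021-11-03T10:00:00Z"
```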
When a Stream Analytics job receives a malformed message from an input, it drops the message and notifies you with a warning. A warning symbol appears on the **Inputs** tile of your Stream Analytics job. The warning symbol exists as long as the job is in a running state.

Turn on resource logs to view the details of the error and the message (payload) that caused the error. There are multiple reasons why deserialization errors can occur. For more information about specific deserialization errors, see [Input data errors](data-errors.md#input-data-errors). If resource logs aren't turned on, a brief notification appears in the Azure portal.

includes/machine-learning-studio-classic-deprecation.md (1 addition, 1 deletion)
> [!IMPORTANT]
> Support for Azure Machine Learning Studio (classic) will end on August 31, 2024. We recommend that you transition to [Azure Machine Learning](https://azure.microsoft.com/services/machine-learning/) by that date.
>
> As of December 1, 2021, you can't create new Machine Learning Studio (classic) resources (workspace and web service plan). Through August 31, 2024, you can continue to use the existing Machine Learning Studio (classic) experiments and web services. For more information, see:
>
> - [Migrate to Azure Machine Learning from Machine Learning Studio (classic)](../articles/machine-learning/v1/migrate-overview.md)
> - [What is Azure Machine Learning?](../articles/machine-learning/overview-what-is-azure-machine-learning.md)