
Commit ae10cae

edit pass: improve-stream-analytics-acrolinx-scores

1 parent 85009c4

File tree

4 files changed: +27 −25 lines

articles/stream-analytics/automation-powershell.md

Lines changed: 16 additions & 16 deletions
@@ -9,7 +9,7 @@ ms.date: 11/03/2021
 
 # Automatically pause a job by using PowerShell and Azure Functions or Azure Automation
 
-Some applications require a stream processing approach (such as through [Azure Stream Analytics](./stream-analytics-introduction.md)) but don't strictly need to run continuously. The reasons are various:
+Some applications require a stream processing approach (such as through [Azure Stream Analytics](./stream-analytics-introduction.md)) but don't strictly need to run continuously. The reasons include:
 
 - Input data that arrives on a schedule (for example, top of the hour)
 - A sparse or low volume of incoming data (few records per minute)
@@ -42,9 +42,9 @@ As an example, consider that *N* = 5 minutes and *M* = 10 minutes. With these se
 
 To restart the job, use the **When Last Stopped** [start option](./start-job.md#start-options). This option tells Stream Analytics to process all the events that were backlogged upstream since the job was stopped.
 
-There are two caveats in this situation. First, the job can't stay stopped longer than the retention period of the input stream. If you run the job only once a day, you need to make sure that the [event hub retention period](/azure/event-hubs/event-hubs-faq#what-is-the-maximum-retention-period-for-events-) is more than one day. Second, the job needs to have been started at least once for the mode **When Last Stopped** to be accepted (or else it has literally never been stopped before). So the first run of a job needs to be manual, or you need to extend the script to cover for that case.
+There are two caveats in this situation. First, the job can't stay stopped longer than the retention period of the input stream. If you run the job only once a day, you need to make sure that the [retention period for events](/azure/event-hubs/event-hubs-faq#what-is-the-maximum-retention-period-for-events-) is more than one day. Second, the job needs to have been started at least once for the mode **When Last Stopped** to be accepted (or else it has literally never been stopped before). So the first run of a job needs to be manual, or you need to extend the script to cover for that case.
 
-The last consideration is to make these actions idempotent. This way, they can be repeated at will with no side effects, for both ease of use and resiliency.
+The last consideration is to make these actions idempotent. You can then repeat them at will with no side effects, for both ease of use and resiliency.
 
 ## Components
 
@@ -54,18 +54,18 @@ This article anticipates the need to interact with Stream Analytics on the follo
 
 - Get the current job status (Stream Analytics resource management):
   - If the job is running:
-    - Get the time since started (logs).
+    - Get the time since the job started (logs).
     - Get the current metric values (metrics).
     - If applicable, stop the job (Stream Analytics resource management).
   - If the job is stopped:
-    - Get the time since stopped (logs).
+    - Get the time since the job stopped (logs).
     - If applicable, start the job (Stream Analytics resource management).
 
-For Stream Analytics resource management, you can use the [REST API](/rest/api/streamanalytics/), the [.NET SDK](/dotnet/api/microsoft.azure.management.streamanalytics), or one of the CLI libraries ([Azure CLI](/cli/azure/stream-analytics), [PowerShell](/powershell/module/az.streamanalytics)).
+For Stream Analytics resource management, you can use the [REST API](/rest/api/streamanalytics/), the [.NET SDK](/dotnet/api/microsoft.azure.management.streamanalytics), or one of the CLI libraries ([Azure CLI](/cli/azure/stream-analytics) or [PowerShell](/powershell/module/az.streamanalytics)).
 
 For metrics and logs, everything in Azure is centralized under [Azure Monitor](../azure-monitor/overview.md), with a similar choice of API surfaces. Logs and metrics are always 1 to 3 minutes behind when you're querying the APIs. So setting *N* at 5 usually means the job runs 6 to 8 minutes in reality.
 
-Another consideration is that metrics are always emitted. When the job is stopped, the API returns empty records. You have to clean up the output of your API calls to look at only relevant values.
+Another consideration is that metrics are always emitted. When the job is stopped, the API returns empty records. You have to clean up the output of your API calls to focus on relevant values.
 
 ### Scripting language
 
@@ -82,7 +82,7 @@ In PowerShell, use the [Az PowerShell](/powershell/azure/new-azureps-module-az)
 
 To host your PowerShell task, you need a service that offers scheduled runs. There are many options, but here are two serverless ones:
 
-- [Azure Functions](../azure-functions/functions-overview.md), a serverless compute engine that can run almost any piece of code. It offers a [timer trigger](../azure-functions/functions-bindings-timer.md?tabs=csharp) that can run up to every second.
+- [Azure Functions](../azure-functions/functions-overview.md), a compute engine that can run almost any piece of code. It offers a [timer trigger](../azure-functions/functions-bindings-timer.md?tabs=csharp) that can run up to every second.
 - [Azure Automation](../automation/overview.md), a managed service for operating cloud workloads and resources. Its purpose is appropriate, but its minimal schedule interval is 1 hour (less with [workarounds](../automation/shared-resources/schedules.md#schedule-runbooks-to-run-more-frequently)).
 
 If you don't mind the workarounds, Azure Automation is the easier way to deploy the task. But in this article, you write a local script first so you can compare. After you have a functioning script, you deploy it both in Functions and in an Automation account.
@@ -239,7 +239,7 @@ After you provision the function app, start with its overall configuration.
 
 ### Managed identity for Azure Functions
 
-The function needs permissions to start and stop the Stream Analytics job. You assign these permissions via a [managed identity](../active-directory/managed-identities-azure-resources/overview.md).
+The function needs permissions to start and stop the Stream Analytics job. You assign these permissions by using a [managed identity](../active-directory/managed-identities-azure-resources/overview.md).
 
 The first step is to enable a *system-assigned managed identity* for the function, by following [this procedure](../app-service/overview-managed-identity.md?tabs=ps%2cportal&toc=/azure/azure-functions/toc.json).
 
@@ -287,7 +287,7 @@ The first step is to follow the [procedure](../azure-functions/functions-how-to-
 |`resourceGroupName`|The resource group name of the Stream Analytics job to be automatically paused.|
 |`asaJobName`|The name of the Stream Analytics job to be automatically paused.|
 
-You'll later need to update your PowerShell script to load the variables accordingly:
+Then, update your PowerShell script to load the variables accordingly:
 
 ```PowerShell
 $maxInputBacklog = $env:maxInputBacklog
@@ -306,7 +306,7 @@ $asaJobName = $env:asaJobName
 
 The same way that you had to install Azure PowerShell locally to use the Stream Analytics commands (like `Start-AzStreamAnalyticsJob`), you need to [add it to the function app host](../azure-functions/functions-reference-powershell.md?tabs=portal#dependency-management):
 
-1. On the page for the function app, under **Functions**, select **App files**, and then select *requirements.psd1*.
+1. On the page for the function app, under **Functions**, select **App files**, and then select **requirements.psd1**.
 1. Uncomment the line `'Az' = '6.*'`.
 1. To make that change take effect, restart the app.
 
@@ -328,15 +328,15 @@ Then, in **Code + Test**, you can copy your script in *run.ps1* and test it. Or
 
 ![Screenshot of the Code+Test pane for the function.](./media/automation/function-code.png)
 
-You can check that everything runs fine via **Test/Run** on the **Code + Test** pane. You can also check the **Monitor** pane, but it's always late by a couple of executions.
+You can check that everything runs fine by selecting **Test/Run** on the **Code + Test** pane. You can also check the **Monitor** pane, but it's always late by a couple of executions.
 
 ![Screenshot of the output of a successful run.](./media/automation/function-run.png)
 
 ### Setting an alert on the function execution
 
 Finally, you want to be notified via an alert if the function doesn't run successfully. Alerts have a minor cost, but they might prevent more expensive situations.
 
-On the page for the function app, under **Logs**, run the following query that returns all unsuccessful runs in the last 5 minutes:
+On the page for the function app, under **Logs**, run the following query. It returns all unsuccessful runs in the last 5 minutes.
 
 ```SQL
 requests
@@ -426,9 +426,9 @@ You can now paste your script and test it. You can copy the full script from [Gi
 
 You can check that everything is wired properly in **Test pane**.
 
-After that, you need to publish the job (via **Publish**) so that you can link the runbook to a schedule. Creating and linking the schedule is a straightforward process. Now is a good time to remember that there are [workarounds](../automation/shared-resources/schedules.md#schedule-runbooks-to-run-more-frequently) to achieve schedule intervals under 1 hour.
+After that, you need to publish the job (by selecting **Publish**) so that you can link the runbook to a schedule. Creating and linking the schedule is a straightforward process. Now is a good time to remember that there are [workarounds](../automation/shared-resources/schedules.md#schedule-runbooks-to-run-more-frequently) to achieve schedule intervals under 1 hour.
 
-Finally, you can set up an alert. The first step is to enable logs via the [diagnostic settings](../azure-monitor/essentials/create-diagnostic-settings.md?tabs=cli) of the Automation account. The second step is to capture errors via a query like you did for Functions.
+Finally, you can set up an alert. The first step is to enable logs by using the [diagnostic settings](../azure-monitor/essentials/create-diagnostic-settings.md?tabs=cli) of the Automation account. The second step is to capture errors by using a query like you did for Functions.
 
 ## Outcome
 
@@ -442,7 +442,7 @@ And here are the metrics:
 
 ![Screenshot of the metrics of the Stream Analytics job.](./media/automation/asa-metrics.png)
 
-After you understand the script, reworking it to extend its scope is a straightforward task. You can easily update the script to target a list of jobs instead of a single one. You can define and process larger scopes via tags, resource groups, or even entire subscriptions.
+After you understand the script, reworking it to extend its scope is a straightforward task. You can easily update the script to target a list of jobs instead of a single one. You can define and process larger scopes by using tags, resource groups, or even entire subscriptions.
 
 ## Get support
 

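The pause/resume pattern edited in the diff above (run the job at least *N* minutes, pause it for *M*, stop only once the input backlog is drained, and keep every action idempotent) can be sketched as pure decision logic. This is an illustrative sketch only, in Python rather than the article's actual PowerShell script; `JobStatus`, `decide_action`, and all parameter names are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class JobStatus:
    state: str               # "Running", "Stopped", or a transitional state
    minutes_in_state: float  # from logs; typically reported 1 to 3 minutes late
    input_backlog: int       # latest backlog metric; 0 when fully caught up

def decide_action(status: JobStatus, run_minutes: float, pause_minutes: float,
                  max_backlog: int = 0) -> str:
    """Idempotent decision: return "start", "stop", or "none".

    run_minutes is N (minimal runtime); pause_minutes is M (pause duration).
    """
    if status.state == "Running":
        # Stop only after the job ran at least N minutes AND drained its backlog.
        if status.minutes_in_state >= run_minutes and status.input_backlog <= max_backlog:
            return "stop"
        return "none"
    if status.state == "Stopped":
        # Restart after M minutes; "When Last Stopped" replays backlogged events.
        if status.minutes_in_state >= pause_minutes:
            return "start"
        return "none"
    # Transitional states (Starting, Stopping, ...): do nothing, stay idempotent.
    return "none"
```

Because the function only reports the action needed to reach the desired state, running it twice in a row is a no-op the second time, which is the idempotency the article calls for.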
articles/stream-analytics/stream-analytics-real-time-event-processing-reference-architecture.md

Lines changed: 2 additions & 2 deletions
@@ -14,11 +14,11 @@ The reference architecture for real-time event processing with Azure Stream Anal
 
 Traditionally, analytics solutions are based on capabilities such as ETL (extract, transform, load) and data warehousing, where data is stored before analysis. Changing requirements, including more rapidly arriving data, are pushing this existing model to the limit.
 
-The ability to analyze data within moving streams before storage is one solution. Although it isn't a new capability, the approach hasn't been widely adopted across industry verticals.
+The ability to analyze data within moving streams before storage is one solution. Although this approach isn't new, it hasn't been widely adopted across industry verticals.
 
 Microsoft Azure provides an extensive catalog of analytics technologies that can support an array of solution scenarios and requirements. Selecting which Azure services to deploy for an end-to-end solution can be a challenge, considering the breadth of offerings.
 
-This reference is designed to describe the capabilities and interoperation of the various Azure services that support an event-streaming solution. It also explains some of the scenarios in which customers can benefit from this type of approach.
+This reference describes the capabilities and interoperation of the Azure services that support an event-streaming solution. It also explains some of the scenarios in which customers can benefit from this type of approach.
 
 ## Contents
 
articles/stream-analytics/stream-analytics-troubleshoot-input.md

Lines changed: 8 additions & 6 deletions
@@ -10,7 +10,9 @@ ms.date: 12/15/2023
 
 # Troubleshoot input connections
 
-This article describes common problems with Azure Stream Analytics input connections, how to troubleshoot input problems, and how to correct the problems. Many troubleshooting steps require you to turn on resource logs for your Stream Analytics job. If you don't have resource logs turned on, see [Troubleshoot Azure Stream Analytics by using resource logs](stream-analytics-job-diagnostic-logs.md).
+This article describes common problems with Azure Stream Analytics input connections, how to troubleshoot those problems, and how to correct them.
+
+Many troubleshooting steps require you to turn on resource logs for your Stream Analytics job. If you don't have resource logs turned on, see [Troubleshoot Azure Stream Analytics by using resource logs](stream-analytics-job-diagnostic-logs.md).
 
 ## Job doesn't receive input events
 
@@ -26,18 +28,18 @@ This article describes common problems with Azure Stream Analytics input connect
 
 3. Ensure that you selected a time range in the input preview. Choose **Select time range**, and then enter a sample duration before testing your query.
 
-   > [!IMPORTANT]
-   > For [Azure Stream Analytics jobs](./run-job-in-virtual-network.md) that aren't network injected, don't rely on the source IP address of connections coming from Stream Analytics in any way. They can be public or private IPs, depending on service infrastructure operations that happen from time to time.
+   > [!IMPORTANT]
+   > For [Azure Stream Analytics jobs](./run-job-in-virtual-network.md) that aren't network injected, don't rely on the source IP address of connections coming from Stream Analytics in any way. They can be public or private IPs, depending on service infrastructure operations that happen from time to time.
 
 ## Malformed input events cause deserialization errors
 
-Deserialization problems happen when the input stream of your Stream Analytics job contains malformed messages. For example, a missing parenthesis or brace in a JSON object, or an incorrect timestamp format in the time field, can cause a malformed message.
+Deserialization problems happen when the input stream of your Stream Analytics job contains malformed messages. For example, a missing parenthesis or brace in a JSON object, or an incorrect time-stamp format in the time field, can cause a malformed message.
 
-When a Stream Analytics job receives a malformed message from an input, it drops the message and notifies you with a warning. A warning symbol appears on the **Inputs** tile of your Stream Analytics job. The warning symbol exists as long as the job is in running state.
+When a Stream Analytics job receives a malformed message from an input, it drops the message and notifies you with a warning. A warning symbol appears on the **Inputs** tile of your Stream Analytics job. The warning symbol exists as long as the job is in a running state.
 
 ![Screenshot that shows the Inputs tile for Azure Stream Analytics.](media/stream-analytics-malformed-events/stream-analytics-inputs-tile.png)
 
-Turn on resource logs to view the details of the error and the message (payload) that caused the error. There are multiple reasons why deserialization errors can occur. For more information about specific deserialization errors, see [Input data errors](data-errors.md#input-data-errors). If you don't turn on resource logs, a brief notification appears in the Azure portal.
+Turn on resource logs to view the details of the error and the message (payload) that caused the error. There are multiple reasons why deserialization errors can occur. For more information about specific deserialization errors, see [Input data errors](data-errors.md#input-data-errors). If resource logs aren't turned on, a brief notification appears in the Azure portal.
 
 ![Screenshot that shows a warning notification about input details.](media/stream-analytics-malformed-events/warning-message-with-offset.png)
 

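The deserialization section above can be made concrete with a small sketch: a payload with a missing brace or a non-ISO time stamp fails parsing before the query ever sees it, which is what triggers the warning described in the diff. This is illustrative only; Stream Analytics performs this parsing internally, and the `try_deserialize` helper and sample payloads are hypothetical.

```python
import json
from datetime import datetime

def try_deserialize(raw: str, time_field: str = "ts"):
    """Return the parsed event, or None if the payload is malformed."""
    try:
        event = json.loads(raw)                    # missing brace/parenthesis fails here
        datetime.fromisoformat(event[time_field])  # bad time-stamp format fails here
        return event
    except (ValueError, KeyError):
        return None  # a real job drops the message and raises a warning instead

good = '{"ts": "2023-12-15T10:00:00", "value": 42}'
missing_brace = '{"ts": "2023-12-15T10:00:00", "value": 42'
bad_timestamp = '{"ts": "15/12/2023 10:00", "value": 42}'
```

Both malformed cases surface as a `ValueError` (`json.JSONDecodeError` is a subclass), which mirrors how a single warning path covers the input data errors the article links to.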
includes/machine-learning-studio-classic-deprecation.md

Lines changed: 1 addition & 1 deletion
@@ -10,7 +10,7 @@ ms.date: 08/23/2021
 > [!IMPORTANT]
 > Support for Azure Machine Learning Studio (classic) will end on August 31, 2024. We recommend that you transition to [Azure Machine Learning](https://azure.microsoft.com/services/machine-learning/) by that date.
 >
-> As of December 1, 2021, you can't create new Machine Learning Studio (classic) resources (workspace and web service plan). Through August 31,2024, you can continue to use the existing Machine Learning Studio (classic) experiments and web services. For more information, see:
+> As of December 1, 2021, you can't create new Machine Learning Studio (classic) resources (workspace and web service plan). Through August 31, 2024, you can continue to use the existing Machine Learning Studio (classic) experiments and web services. For more information, see:
 >
 > - [Migrate to Azure Machine Learning from Machine Learning Studio (classic)](../articles/machine-learning/v1/migrate-overview.md)
 > - [What is Azure Machine Learning?](../articles/machine-learning/overview-what-is-azure-machine-learning.md)
