
Commit 5b7bbcd

Merge pull request #176100 from MicrosoftDocs/master
Sunday 4pm publish to live
2 parents 4f6a241 + 13dc4b6

File tree

7 files changed: +122 −131 lines changed

articles/azure-monitor/logs/log-analytics-workspace-insights-overview.md

Lines changed: 3 additions & 3 deletions

@@ -9,9 +9,9 @@ ms.date: 05/06/2021
 
 ---
 
-# Log Analytics Workspace Insights (preview)
+# Log Analytics Workspace Insights
 
-Log Analytics Workspace Insights (preview) provides comprehensive monitoring of your workspaces through a unified view of your workspace usage, performance, health, agent, queries, and change log. This article will help you understand how to onboard and use Log Analytics Workspace Insights (preview).
+Log Analytics Workspace Insights provides comprehensive monitoring of your workspaces through a unified view of your workspace usage, performance, health, agent, queries, and change log. This article will help you understand how to onboard and use Log Analytics Workspace Insights.
 
 ## Overview your Log Analytics workspaces

@@ -25,7 +25,7 @@ To launch Log Analytics Workspace Insights at scale, perform the following steps
 
 1. Sign in to the [Azure portal](https://portal.azure.com/).
 
-2. Select **Monitor** from the left-hand pane in the Azure portal, and under the Insights Hub section, select **Log Analytics Workspace Insights (preview)**.
+2. Select **Monitor** from the left-hand pane in the Azure portal, and under the Insights Hub section, select **Log Analytics Workspace Insights**.
 
 ## View Insights for a Log Analytics workspace

articles/azure-monitor/logs/manage-cost-storage.md

Lines changed: 9 additions & 4 deletions

@@ -11,7 +11,7 @@ ms.service: azure-monitor
 ms.workload: na
 ms.tgt_pltfrm: na
 ms.topic: conceptual
-ms.date: 10/12/2021
+ms.date: 10/17/2021
 ms.author: bwren
 ms.custom: devx-track-azurepowershell
 ---

@@ -133,6 +133,10 @@ New-AzResourceGroupDeployment -ResourceGroupName "YourResourceGroupName" -Templa
 
 To set the pricing tier to other values such as Pay-As-You-Go (called `pergb2018` for the SKU), omit the `capacityReservationLevel` property. Learn more about [creating ARM templates](../../azure-resource-manager/templates/template-tutorial-create-first-template.md), [adding a resource to your template](../../azure-resource-manager/templates/template-tutorial-add-resource.md), and [applying templates](../resource-manager-samples.md).
 
+### Tracking pricing tier changes
+
+Changes to a workspace's pricing tier are recorded in the [Activity Log](../essentials/activity-log.md) as an event with the Operation named "Create Workspace". The event's **Change history** tab shows the old and new pricing tiers in the `properties.sku.name` row. Select the **Activity Log** option from your workspace to see events scoped to that workspace. To monitor changes to the pricing tier, you can create an alert for the "Create Workspace" operation.
+
 ## Legacy pricing tiers
 
 Subscriptions that contained a Log Analytics workspace or Application Insights resource on April 2, 2018, or are linked to an Enterprise Agreement that started before February 1, 2019 and is still active, will continue to have access to the legacy pricing tiers: **Free Trial**, **Standalone (Per GB)**, and **Per Node (OMS)**. Workspaces in the Free Trial pricing tier have daily data ingestion limited to 500 MB (except for security data types collected by [Azure Defender (Security Center)](../../security-center/index.yml)), and data retention is limited to seven days. The Free Trial pricing tier is intended only for evaluation purposes; no SLA is provided for the Free tier. Workspaces in the Standalone or Per Node pricing tiers have user-configurable retention from 30 to 730 days.

@@ -288,14 +292,15 @@ To view the effect of the daily cap, it's important to account for the security
 ```kusto
 let DailyCapResetHour=14;
 Usage
-| where Type !in ("SecurityAlert", "SecurityBaseline", "SecurityBaselineSummary", "SecurityDetection", "SecurityEvent", "WindowsFirewall", "MaliciousIPCommunication", "LinuxAuditLog", "SysmonEvent", "ProtectionStatus", "WindowsEvent")
+| where DataType !in ("SecurityAlert", "SecurityBaseline", "SecurityBaselineSummary", "SecurityDetection", "SecurityEvent", "WindowsFirewall", "MaliciousIPCommunication", "LinuxAuditLog", "SysmonEvent", "ProtectionStatus", "WindowsEvent")
 | extend TimeGenerated=datetime_add("hour",-1*DailyCapResetHour,TimeGenerated)
 | where TimeGenerated > startofday(ago(31d))
 | where IsBillable
-| summarize IngestedGbBetweenDailyCapResets=sum(Quantity)/1000. by day=bin(TimeGenerated, 1d) | render areachart
+| summarize IngestedGbBetweenDailyCapResets=sum(Quantity)/1000. by day=bin(TimeGenerated, 1d) // Quantity in units of MB
+| render areachart
 ```
+Add the `Update` and `UpdateSummary` data types to the `where DataType` line when the Update Management solution is not running on the workspace or solution targeting is enabled ([learn more](../../security-center/security-center-pricing.md#what-data-types-are-included-in-the-500-mb-data-daily-allowance)).
 
-(In the Usage data type, the units of `Quantity` are in MB.)
 
 ### Alert when daily cap is reached
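The pricing-tier tracking guidance above can also be expressed as a log query. A minimal sketch, assuming the Activity Log is routed to a Log Analytics workspace so the `AzureActivity` table is populated; the operation-name value and column names are assumptions from the common `AzureActivity` schema, not part of this commit:

```kusto
// Hedged sketch: list recent "Create Workspace" events, which include pricing tier changes.
// Assumes AzureActivity is populated via Activity Log diagnostic settings.
AzureActivity
| where OperationNameValue =~ "Microsoft.OperationalInsights/workspaces/write"  // surfaced as "Create Workspace" in the portal
| project TimeGenerated, ResourceId, Caller, ActivityStatusValue
| order by TimeGenerated desc
```

A log alert rule on this query (for example, firing when the result count is greater than zero) would implement the "alert for the Create Workspace operation" suggestion.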

articles/cognitive-services/Speech-Service/how-to-custom-speech-test-and-train.md

Lines changed: 4 additions & 4 deletions

@@ -86,11 +86,11 @@ Use either of the following requests to create and upload a dataset:
 
 **REST API created datasets and Speech Studio projects**
 
-A dataset created via Speech-to-text REST API v3.0 will *not* be connected to any of the Speech Studio projects, unless a special parameter is specified in the request body (see below). Connection with a Speech Studio project is *not* required for any model customization operations, if they are performed via the REST API.
+A dataset created with the Speech-to-text REST API v3.0 will *not* be connected to any Speech Studio project unless a special parameter is specified in the request body (see below). Connection with a Speech Studio project is *not* required for model customization operations performed through the REST API.
 
-When you log on to the Speech Studio, its user interface will notify you when any unconnected object is found (like datasets uploaded via REST API without any project reference) and offer to connect such objects to an existing project.
+When you log on to the Speech Studio, its user interface will notify you when any unconnected object is found (like datasets uploaded through the REST API without any project reference) and offer to connect such objects to an existing project.
 
-To connect the new dataset to an existing project in the Speech Studio during its upload using [Create Dataset](https://centralus.dev.cognitive.microsoft.com/docs/services/speech-to-text-api-v3-0/operations/CreateDataset) or [Create Dataset from Form](https://centralus.dev.cognitive.microsoft.com/docs/services/speech-to-text-api-v3-0/operations/UploadDatasetFromForm) requests, use request body format, like in the example below:
+To connect the new dataset to an existing project in the Speech Studio during its upload, use [Create Dataset](https://centralus.dev.cognitive.microsoft.com/docs/services/speech-to-text-api-v3-0/operations/CreateDataset) or [Create Dataset from Form](https://centralus.dev.cognitive.microsoft.com/docs/services/speech-to-text-api-v3-0/operations/UploadDatasetFromForm) and fill out the request body according to the following format:
 ```json
 {
   "kind": "Acoustic",

@@ -104,7 +104,7 @@ To connect the new dataset to an existing project in the Speech Studio during it
 }
 ```
 
-Project URL required for `project` element can be obtained via [Get Projects](https://westeurope.dev.cognitive.microsoft.com/docs/services/speech-to-text-api-v3-0/operations/GetProjects) request.
+The Project URL required for the `project` element can be obtained with the [Get Projects](https://westeurope.dev.cognitive.microsoft.com/docs/services/speech-to-text-api-v3-0/operations/GetProjects) request.
 
 ## Audio + human-labeled transcript data for training/testing
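As a worked illustration of the request described above, here is a hedged Python sketch that builds a Create Dataset call with a `project` reference. Only the `kind` and `project` fields come from the article's snippet; the region, endpoint path, header name, project ID, and the `displayName`/`description` fields are placeholders or assumptions about the v3.0 API, so verify them against the API reference before use:

```python
import json
import urllib.request

# Hypothetical values -- substitute your own region, key, and project URL.
region = "westeurope"
subscription_key = "<your-speech-subscription-key>"
project_url = (
    f"https://{region}.api.cognitive.microsoft.com"
    "/speechtotext/v3.0/projects/<project-id>"
)

# The "project" element is what links the new dataset to an existing
# Speech Studio project (per the article); other fields are assumptions.
payload = {
    "kind": "Acoustic",
    "displayName": "My acoustic dataset",
    "description": "Dataset uploaded via REST API, connected to a project",
    "project": {"self": project_url},
}

request = urllib.request.Request(
    url=f"https://{region}.api.cognitive.microsoft.com/speechtotext/v3.0/datasets",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Ocp-Apim-Subscription-Key": subscription_key,
        "Content-Type": "application/json",
    },
    method="POST",
)

# urllib.request.urlopen(request)  # uncomment to actually send the request
```

The same payload shape should work for Create Dataset from Form, with the file supplied as multipart form data instead of a JSON body.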

articles/ddos-protection/test-through-simulations.md

Lines changed: 2 additions & 2 deletions

@@ -31,7 +31,7 @@ We have partnered with [BreakingPoint Cloud](https://www.ixiacom.com/products/br
 ## Prerequisites
 
 - Before you can complete the steps in this tutorial, you must first create an [Azure DDoS Standard protection plan](manage-ddos-protection.md) with protected public IP addresses.
-- You must first create an account with [BreakingPoint Cloud](http://breakingpoint.cloud/).
+- You must first create an account with [BreakingPoint Cloud](https://www.ixiacom.com/products/breakingpoint-cloud).
 
 ## Configure a DDoS test attack

@@ -62,7 +62,7 @@ Once the resource is under attack, you should see that the value changes from **
 
 ### BreakingPoint Cloud API Script
 
-This [API script](https://aka.ms/ddosbreakingpoint) can be used to automate DDoS testing by running once or using cron to schedule regular tests. This is useful to validate that your logging is configured properly and that detection and response procedures are effective. The scripts require a Linux OS (tested with Ubuntu 18.04 LTS) and Python 3. Install prerequisites and API client using the included script or by using the documentation on the [BreakingPoint Cloud](http://breakingpoint.cloud/) website.
+This [API script](https://aka.ms/ddosbreakingpoint) can be used to automate DDoS testing by running once or using cron to schedule regular tests. This is useful to validate that your logging is configured properly and that detection and response procedures are effective. The scripts require a Linux OS (tested with Ubuntu 18.04 LTS) and Python 3. Install prerequisites and API client using the included script or by using the documentation on the [BreakingPoint Cloud](https://www.ixiacom.com/products/breakingpoint-cloud) website.
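The cron-based scheduling mentioned above could look like the following crontab entry; the script name, paths, and log file are hypothetical placeholders, not documented parts of the BreakingPoint package:

```shell
# m h dom mon dow  command
# Run the downloaded BreakingPoint test script every Monday at 02:00
# and append output to a log for later review (paths are placeholders).
0 2 * * 1  /usr/bin/python3 /opt/breakingpoint/ddos_test.py >> /var/log/ddos-test.log 2>&1
```

Logging the output to a file supports the stated goal of verifying that detection and response procedures fired for each scheduled test.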

 ## Next steps
6868
