articles/azure-monitor/logs/log-analytics-workspace-insights-overview.md (+3 −3)
@@ -9,9 +9,9 @@ ms.date: 05/06/2021
---
-# Log Analytics Workspace Insights (preview)
+# Log Analytics Workspace Insights
-Log Analytics Workspace Insights (preview) provides comprehensive monitoring of your workspaces through a unified view of your workspace usage, performance, health, agent, queries, and change log. This article will help you understand how to onboard and use Log Analytics Workspace Insights (preview).
+Log Analytics Workspace Insights provides comprehensive monitoring of your workspaces through a unified view of your workspace usage, performance, health, agent, queries, and change log. This article will help you understand how to onboard and use Log Analytics Workspace Insights.
## Overview your Log Analytics workspaces
@@ -25,7 +25,7 @@ To launch Log Analytics Workspace Insights at scale, perform the following steps
1. Sign in to the [Azure portal](https://portal.azure.com/).
-2. Select **Monitor** from the left-hand pane in the Azure portal, and under the Insights Hub section, select **Log Analytics Workspace Insights (preview)**.
+2. Select **Monitor** from the left-hand pane in the Azure portal, and under the Insights Hub section, select **Log Analytics Workspace Insights**.
To set the pricing tier to other values such as Pay-As-You-Go (called `pergb2018` for the SKU), omit the `capacityReservationLevel` property. Learn more about [creating ARM templates](../../azure-resource-manager/templates/template-tutorial-create-first-template.md), [adding a resource to your template](../../azure-resource-manager/templates/template-tutorial-add-resource.md), and [applying templates](../resource-manager-samples.md).
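As a sketch, a Pay-As-You-Go workspace resource in an ARM template might look like the following (the `apiVersion` and parameter names are illustrative assumptions; note that there is no `capacityReservationLevel` property under `sku`):

```json
{
  "type": "Microsoft.OperationalInsights/workspaces",
  "apiVersion": "2020-08-01",
  "name": "[parameters('workspaceName')]",
  "location": "[parameters('location')]",
  "properties": {
    "sku": {
      "name": "pergb2018"
    }
  }
}
```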
+### Tracking pricing tier changes
+Changes to a workspace's pricing tier are recorded in the [Activity Log](../essentials/activity-log.md) with an event whose Operation is named "Create Workspace". The event's **Change history** tab shows the old and new pricing tiers in the `properties.sku.name` row. To see events scoped to a particular workspace, select the "Activity Log" option from your workspace. To monitor changes to the pricing tier, you can create an alert for the "Create Workspace" operation.
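For example, a log query along these lines could surface such events (a sketch; the fully qualified operation name shown here is an assumption, so verify it against your own Activity Log entries):

```kusto
// Sketch: find workspace create/update events, which include pricing tier changes
AzureActivity
| where OperationNameValue =~ "Microsoft.OperationalInsights/workspaces/write"
| project TimeGenerated, ResourceGroup, Resource, Caller, ActivityStatusValue
```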
## Legacy pricing tiers
Subscriptions that contained a Log Analytics workspace or Application Insights resource on April 2, 2018, or are linked to an Enterprise Agreement that started before February 1, 2019 and is still active, will continue to have access to use the legacy pricing tiers: **Free Trial**, **Standalone (Per GB)**, and **Per Node (OMS)**. Workspaces in the Free Trial pricing tier will have daily data ingestion limited to 500 MB (except for security data types collected by [Azure Defender (Security Center)](../../security-center/index.yml)) and the data retention is limited to seven days. The Free Trial pricing tier is intended only for evaluation purposes. No SLA is provided for the Free tier. Workspaces in the Standalone or Per Node pricing tiers have user-configurable retention from 30 to 730 days.
@@ -288,14 +292,15 @@ To view the effect of the daily cap, it's important to account for the security
```kusto
let DailyCapResetHour=14;
Usage
| where Type !in ("SecurityAlert", "SecurityBaseline", "SecurityBaselineSummary", "SecurityDetection", "SecurityEvent", "WindowsFirewall", "MaliciousIPCommunication", "LinuxAuditLog", "SysmonEvent", "ProtectionStatus", "WindowsEvent")
-| summarize IngestedGbBetweenDailyCapResets=sum(Quantity)/1000. by day=bin(TimeGenerated, 1d) | render areachart
+| summarize IngestedGbBetweenDailyCapResets=sum(Quantity)/1000. by day=bin(TimeGenerated, 1d) // Quantity in units of MB
+| render areachart
```
+Add the `Update` and `UpdateSummary` data types to the `where Type` line when the Update Management solution is not running on the workspace or solution targeting is enabled ([learn more](../../security-center/security-center-pricing.md#what-data-types-are-included-in-the-500-mb-data-daily-allowance)).
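With those types included, the filter line would read as follows (a sketch based on the query above):

```kusto
| where Type !in ("SecurityAlert", "SecurityBaseline", "SecurityBaselineSummary", "SecurityDetection", "SecurityEvent", "WindowsFirewall", "MaliciousIPCommunication", "LinuxAuditLog", "SysmonEvent", "ProtectionStatus", "WindowsEvent", "Update", "UpdateSummary")
```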
-(In the Usage data type, the units of `Quantity` are in MB.)
articles/cognitive-services/Speech-Service/how-to-custom-speech-test-and-train.md (+4 −4)
@@ -86,11 +86,11 @@ Use either of the following requests to create and upload a dataset:
**REST API created datasets and Speech Studio projects**
-A dataset created via Speech-to-text REST API v3.0 will *not* be connected to any of the Speech Studio projects, unless a special parameter is specified in the request body (see below). Connection with a Speech Studio project is *not* required for any model customization operations, if they are performed via the REST API.
+A dataset created with the Speech-to-text REST API v3.0 will *not* be connected to any of the Speech Studio projects unless a special parameter is specified in the request body (see below). Connection with a Speech Studio project is *not* required for any model customization operations performed via the REST API.
-When you log on to the Speech Studio, its user interface will notify you when any unconnected object is found (like datasets uploaded via REST API without any project reference) and offer to connect such objects to an existing project.
+When you log in to Speech Studio, its user interface notifies you when an unconnected object is found (such as datasets uploaded through the REST API without a project reference) and offers to connect such objects to an existing project.
-To connect the new dataset to an existing project in the Speech Studio during its upload using[Create Dataset](https://centralus.dev.cognitive.microsoft.com/docs/services/speech-to-text-api-v3-0/operations/CreateDataset) or [Create Dataset from Form](https://centralus.dev.cognitive.microsoft.com/docs/services/speech-to-text-api-v3-0/operations/UploadDatasetFromForm)requests, use request body format, like in the example below:
+To connect the new dataset to an existing project in Speech Studio during its upload, use [Create Dataset](https://centralus.dev.cognitive.microsoft.com/docs/services/speech-to-text-api-v3-0/operations/CreateDataset) or [Create Dataset from Form](https://centralus.dev.cognitive.microsoft.com/docs/services/speech-to-text-api-v3-0/operations/UploadDatasetFromForm) and fill out the request body according to the following format:
```json
{
"kind": "Acoustic",
@@ -104,7 +104,7 @@ To connect the new dataset to an existing project in the Speech Studio during it
}
```
-Project URL required for `project` element can be obtained via[Get Projects](https://westeurope.dev.cognitive.microsoft.com/docs/services/speech-to-text-api-v3-0/operations/GetProjects) request.
+The Project URL required for the `project` element can be obtained with the [Get Projects](https://westeurope.dev.cognitive.microsoft.com/docs/services/speech-to-text-api-v3-0/operations/GetProjects) request.
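Putting these pieces together, a minimal sketch of building such a request body in Python (the helper name is hypothetical; the field names follow the v3.0 example above, with the project URL taken from the Get Projects response):

```python
def build_dataset_request(display_name, content_url, project_url,
                          locale="en-US", kind="Acoustic"):
    """Build a Create Dataset request body that links the dataset to a project.

    project_url is the project's self URL returned by the Get Projects request.
    """
    return {
        "kind": kind,
        "displayName": display_name,
        "contentUrl": content_url,
        "locale": locale,
        # The project reference is what connects the dataset to a Speech Studio project.
        "project": {"self": project_url},
    }
```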
## Audio + human-labeled transcript data for training/testing
articles/ddos-protection/test-through-simulations.md (+2 −2)
@@ -31,7 +31,7 @@ We have partnered with [BreakingPoint Cloud](https://www.ixiacom.com/products/br
## Prerequisites
- Before you can complete the steps in this tutorial, you must first create an [Azure DDoS Standard protection plan](manage-ddos-protection.md) with protected public IP addresses.
-- You must first create an account with [BreakingPoint Cloud](http://breakingpoint.cloud/).
+- You must first create an account with [BreakingPoint Cloud](https://www.ixiacom.com/products/breakingpoint-cloud).
## Configure a DDoS test attack
@@ -62,7 +62,7 @@ Once the resource is under attack, you should see that the value changes from **
### BreakingPoint Cloud API Script
-This [API script](https://aka.ms/ddosbreakingpoint) can be used to automate DDoS testing by running once or using cron to schedule regular tests. This is useful to validate that your logging is configured properly and that detection and response procedures are effective. The scripts require a Linux OS (tested with Ubuntu 18.04 LTS) and Python 3. Install prerequisites and API client using the included script or by using the documentation on the [BreakingPoint Cloud](http://breakingpoint.cloud/) website.
+This [API script](https://aka.ms/ddosbreakingpoint) can be used to automate DDoS testing by running once or by using cron to schedule regular tests. This is useful to validate that your logging is configured properly and that detection and response procedures are effective. The scripts require a Linux OS (tested with Ubuntu 18.04 LTS) and Python 3. Install the prerequisites and API client by using the included script or by following the documentation on the [BreakingPoint Cloud](https://www.ixiacom.com/products/breakingpoint-cloud) website.
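For instance, a crontab entry for weekly scheduling might look like the following (the script path and log location are hypothetical placeholders):

```shell
# Run the DDoS test script every Monday at 02:00 (paths are placeholders)
0 2 * * 1 /usr/bin/python3 /opt/breakingpoint/run_test.py >> /var/log/ddos-test.log 2>&1
```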