articles/azure-monitor/best-practices-alerts.md (1 addition, 1 deletion)

Each alert rule defines the severity of the alerts that it creates based on the following table.

| Severity | Name | Description |
|:---|:---|:---|
| Sev 1 | Error | Degradation of performance or loss of availability of some aspect of an application or service. Requires attention but not immediate. |
| Sev 2 | Warning | A problem that doesn't include any current loss in availability or performance, although it has the potential to lead to more severe problems if unaddressed. |
| Sev 3 | Informational | Doesn't indicate a problem but provides interesting information to an operator, such as successful completion of a regular process. |
| Sev 4 | Verbose | Doesn't indicate a problem but provides detailed information. |

Assess the severity of the condition each rule is identifying to assign an appropriate level. Define the types of issues you assign to each severity level and your standard response to each in your alerts strategy.

articles/azure-monitor/essentials/data-collection-rule-structure.md (23 additions, 24 deletions)

# Structure of a data collection rule in Azure Monitor (preview)

[Data collection rules (DCRs)](data-collection-rule-overview.md) determine how to collect and process telemetry sent to Azure. Some DCRs are created and managed by Azure Monitor. You might create other DCRs to customize data collection for your particular requirements. This article describes the structure of DCRs for those cases where you need to create or edit them directly.

## Custom logs

A DCR for [custom logs](../logs/logs-ingestion-api-overview.md) contains the following sections. For a sample, see [Sample data collection rule - custom logs](../logs/data-collection-rule-sample-custom-logs.md).

### streamDeclarations

This section contains the declaration of all the different types of data that will be sent via the HTTP endpoint directly into Log Analytics. Each stream is an object whose:

- Key represents the stream name, which must begin with *Custom-*.
- Value is the full list of top-level properties that are contained in the JSON data that will be sent.

The shape of the data you send to the endpoint doesn't need to match that of the destination table. Instead, the output of the transform that's applied on top of the input data needs to match the destination shape. The possible data types that can be assigned to the properties are `string`, `int`, `long`, `real`, `boolean`, `dynamic`, and `datetime`.
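
Here's a minimal sketch of a `streamDeclarations` section, assuming a hypothetical stream named `Custom-MyTableRawData`; the stream name and column names are placeholders for illustration, not values defined by Azure Monitor:

```json
{
  "streamDeclarations": {
    "Custom-MyTableRawData": {
      "columns": [
        { "name": "Time", "type": "datetime" },
        { "name": "Computer", "type": "string" },
        { "name": "AdditionalContext", "type": "string" }
      ]
    }
  }
}
```

Each entry in `columns` pairs a top-level property of the incoming JSON with one of the data types listed above.
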
### destinations

This section contains a declaration of all the destinations where the data will be sent. Only Log Analytics is currently supported as a destination. Each Log Analytics destination requires the full workspace resource ID and a friendly name that will be used elsewhere in the DCR to refer to this workspace.
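
For example, a `destinations` section might look like the following sketch, where the workspace resource ID is a placeholder and `myWorkspace` is an arbitrary friendly name that the `dataFlows` section can reference later:

```json
{
  "destinations": {
    "logAnalytics": [
      {
        "workspaceResourceId": "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.OperationalInsights/workspaces/<workspace-name>",
        "name": "myWorkspace"
      }
    ]
  }
}
```
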
### dataFlows

This section ties the other sections together. For each stream declared in the `streamDeclarations` section, it defines the following properties, as shown in the sketch after this list:

- `destination` from the `destinations` section where the data will be sent.
- `transformKql`, which is the [transformation](data-collection-transformations.md) applied to the data that was sent in the input shape described in the `streamDeclarations` section to the shape of the target table.
- `outputStream`, which describes the table in the workspace specified under the `destination` property that the data will be ingested into. The value of `outputStream` has the `Microsoft-[tableName]` shape when data is being ingested into a standard Log Analytics table, or `Custom-[tableName]` when ingesting data into a custom-created table. Only one destination is allowed per stream.
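
Putting these pieces together, here's a sketch of a `dataFlows` section that reuses the hypothetical stream and workspace names from the earlier sketches; the `transformKql` query is illustrative only:

```json
{
  "dataFlows": [
    {
      "streams": [ "Custom-MyTableRawData" ],
      "destinations": [ "myWorkspace" ],
      "transformKql": "source | extend TimeGenerated = Time",
      "outputStream": "Custom-MyTable_CL"
    }
  ]
}
```

In this sketch, `outputStream` uses the `Custom-[tableName]` shape because the hypothetical target is a custom-created table; ingestion into a standard table would use `Microsoft-[tableName]` instead.
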
## Azure Monitor Agent

A DCR for [Azure Monitor Agent](../agents/data-collection-rule-azure-monitor-agent.md) contains the following sections. For a sample, see [Sample data collection rule - agent](../agents/data-collection-rule-sample-agent.md).

### dataSources

A data source is a unique source of monitoring data with its own format and method of exposing its data. Examples of a data source include Windows event log, performance counters, and Syslog.

Each data source has a data source type. Each type defines a unique set of properties that must be specified for each data source. The data source types currently available appear in the following table.

| Data source type | Description |
|:---|:---|
| syslog | Syslog events on Linux |
| windowsEventLogs | Windows event log |
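
As a rough sketch, a `dataSources` section that combines performance counters and Windows event logs might look like the following; the `name` values, counter path, and XPath query are illustrative, not values required by Azure Monitor:

```json
{
  "dataSources": {
    "performanceCounters": [
      {
        "name": "perfCounterDataSource",
        "streams": [ "Microsoft-Perf" ],
        "samplingFrequencyInSeconds": 60,
        "counterSpecifiers": [ "\\Processor(_Total)\\% Processor Time" ]
      }
    ],
    "windowsEventLogs": [
      {
        "name": "eventLogDataSource",
        "streams": [ "Microsoft-Event" ],
        "xPathQueries": [ "System!*[System[(Level = 1 or Level = 2)]]" ]
      }
    ]
  }
}
```
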
### Streams

A stream is a unique handle that describes a set of data sources that will be transformed and schematized as one type. Each data source requires one or more streams, and one stream can be used by multiple data sources. All data sources in a stream share a common schema. Use multiple streams when, for example, you want to send a particular data source to multiple tables in the same Log Analytics workspace.

### destinations

This set of destinations indicates where the data should be sent. Examples include a Log Analytics workspace and Azure Monitor Metrics. Multiple destinations are allowed for multi-homing scenarios.

### dataFlows

This section defines which streams should be sent to which destinations.
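
For example, here's a sketch of `destinations` and `dataFlows` for an agent DCR that routes the hypothetical performance counter stream from the earlier sketch to both Azure Monitor Metrics and a Log Analytics workspace (a multi-homing scenario), while event data goes to the workspace only; the names and resource ID are placeholders:

```json
{
  "destinations": {
    "logAnalytics": [
      {
        "workspaceResourceId": "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.OperationalInsights/workspaces/<workspace-name>",
        "name": "myWorkspace"
      }
    ],
    "azureMonitorMetrics": {
      "name": "azureMonitorMetrics-default"
    }
  },
  "dataFlows": [
    {
      "streams": [ "Microsoft-Perf" ],
      "destinations": [ "azureMonitorMetrics-default", "myWorkspace" ]
    },
    {
      "streams": [ "Microsoft-Event" ],
      "destinations": [ "myWorkspace" ]
    }
  ]
}
```
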
## Next steps

[Overview of data collection rules and methods for creating them](data-collection-rule-overview.md)