
Commit bf9faf6

Merge pull request #229511 from MicrosoftDocs/main
Publish to live, Sunday 4 AM PST, 3/5
2 parents 53e6f1b + 34cb956

36 files changed: +904 -731 lines changed

articles/azure-arc/kubernetes/tutorial-gitops-flux2-ci-cd.md

Lines changed: 179 additions & 153 deletions
Large diffs are not rendered by default.

articles/azure-monitor/best-practices-alerts.md

Lines changed: 1 addition & 1 deletion
@@ -49,7 +49,7 @@ Each alert rule defines the severity of the alerts that it creates based on the
 | Sev 1 | Error | Degradation of performance or loss of availability of some aspect of an application or service. Requires attention but not immediate. |
 | Sev 2 | Warning | A problem that doesn't include any current loss in availability or performance, although it has the potential to lead to more severe problems if unaddressed. |
 | Sev 3 | Informational | Doesn't indicate a problem but provides interesting information to an operator, such as successful completion of a regular process. |
-| Sev 4 | Verbose | Detailed information that isn't useful.
+| Sev 4 | Verbose | Doesn't indicate a problem but provides detailed information that is verbose.
 
 Assess the severity of the condition each rule is identifying to assign an appropriate level. Define the types of issues you assign to each severity level and your standard response to each in your alerts strategy.
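For orientation (not part of the diff above): an alert rule carries its level in a numeric `severity` property, 0 through 4, matching the Sev 0 through Sev 4 rows in the table. A minimal sketch of a `Microsoft.Insights/metricAlerts` ARM resource that raises a Sev 1 alert on sustained high CPU; the rule name, threshold, and the placeholder IDs in braces are hypothetical.

```json
{
  "type": "Microsoft.Insights/metricAlerts",
  "apiVersion": "2018-03-01",
  "name": "cpu-degradation-alert",
  "location": "global",
  "properties": {
    "description": "Sev 1: degradation of performance (sustained high CPU)",
    "severity": 1,
    "enabled": true,
    "scopes": [
      "/subscriptions/{subscription-id}/resourceGroups/{rg}/providers/Microsoft.Compute/virtualMachines/{vm-name}"
    ],
    "evaluationFrequency": "PT5M",
    "windowSize": "PT15M",
    "criteria": {
      "odata.type": "Microsoft.Azure.Monitor.SingleResourceMultipleMetricCriteria",
      "allOf": [
        {
          "name": "HighCpu",
          "criterionType": "StaticThresholdCriterion",
          "metricName": "Percentage CPU",
          "operator": "GreaterThan",
          "threshold": 90,
          "timeAggregation": "Average"
        }
      ]
    },
    "actions": []
  }
}
```

The same numeric `severity` property appears in log search (scheduled query) alert rules as well, so one severity convention can cover both rule types.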

articles/azure-monitor/essentials/collect-custom-metrics-guestos-resource-manager-vm.md

Lines changed: 189 additions & 196 deletions
Large diffs are not rendered by default.

articles/azure-monitor/essentials/data-collection-rule-structure.md

Lines changed: 23 additions & 24 deletions
@@ -9,35 +9,37 @@ ms.reviwer: nikeist
 
 ---
 
-
-
 # Structure of a data collection rule in Azure Monitor (preview)
-[Data Collection Rules (DCRs)](data-collection-rule-overview.md) determine how to collect and process telemetry sent to Azure. Some data collection rules will be created and managed by Azure Monitor, while you may create others to customize data collection for your particular requirements. This article describes the structure of DCRs for creating and editing data collection rules in those cases where you need to work with them directly.
-
+[Data collection rules (DCRs)](data-collection-rule-overview.md) determine how to collect and process telemetry sent to Azure. Some DCRs will be created and managed by Azure Monitor. You might create other DCRs to customize data collection for your particular requirements. This article describes the structure of DCRs for creating and editing DCRs in those cases where you need to work with them directly.
 
 ## Custom logs
-A DCR for [custom logs](../logs/logs-ingestion-api-overview.md) contains the sections below. For a sample, see [Sample data collection rule - custom logs](../logs/data-collection-rule-sample-custom-logs.md).
+A DCR for [custom logs](../logs/logs-ingestion-api-overview.md) contains the following sections. For a sample, see [Sample data collection rule - custom logs](../logs/data-collection-rule-sample-custom-logs.md).
 
 ### streamDeclarations
-This section contains the declaration of all the different types of data that will be sent via the HTTP endpoint directly into Log Analytics. Each stream is an object whose key represents the stream name (Must begin with *Custom-*) and whose value is the full list of top-level properties that the JSON data that will be sent will contain. Note that the shape of the data you send to the endpoint doesn't need to match that of the destination table. Rather, the output of the transform that is applied on top of the input data needs to match the destination shape. The possible data types that can be assigned to the properties are `string`, `int`, `long`, `real`, `boolean`, `dynamic`, and `datetime`.
+This section contains the declaration of all the different types of data that will be sent via the HTTP endpoint directly into Log Analytics. Each stream is an object whose:
+
+- Key represents the stream name, which must begin with *Custom-*.
+- Value is the full list of top-level properties that are contained in the JSON data that will be sent.
+
+The shape of the data you send to the endpoint doesn't need to match that of the destination table. Instead, the output of the transform that's applied on top of the input data needs to match the destination shape. The possible data types that can be assigned to the properties are `string`, `int`, `long`, `real`, `boolean`, `dynamic`, and `datetime`.
 
 ### destinations
-This section contains a declaration of all the destinations where the data will be sent. Only Log Analytics is currently supported as a destination. Each Log Analytics destination will require the full Workspace Resource ID, as well as a friendly name that will be used elsewhere in the DCR to refer to this workspace.
+This section contains a declaration of all the destinations where the data will be sent. Only Log Analytics is currently supported as a destination. Each Log Analytics destination requires the full workspace resource ID and a friendly name that will be used elsewhere in the DCR to refer to this workspace.
 
 ### dataFlows
-This section ties the other sections together. Defines the following for each stream declared in the `streamDeclarations` section:
+This section ties the other sections together. It defines the following properties for each stream declared in the `streamDeclarations` section:
 
-- `destination` from the `destinations` section where the data will be sent.
-- `transformKql` which is the [transformation](data-collection-transformations.md) applied to the data that was sent in the input shape described in the `streamDeclarations` section to the shape of the target table.
-- `outputStream` section, which describes which table in the workspace specified under the `destination` property the data will be ingested into. The value of the outputStream will have the `Microsoft-[tableName]` shape when data is being ingested into a standard Log Analytics table, or `Custom-[tableName]` when ingesting data into a custom-created table. Only one destination is allowed per stream.
+- `destination` from the `destinations` section where the data will be sent.
+- `transformKql` section, which is the [transformation](data-collection-transformations.md) applied to the data that was sent in the input shape described in the `streamDeclarations` section to the shape of the target table.
+- `outputStream` section, which describes which table in the workspace specified under the `destination` property the data will be ingested into. The value of `outputStream` has the `Microsoft-[tableName]` shape when data is being ingested into a standard Log Analytics table, or `Custom-[tableName]` when ingesting data into a custom-created table. Only one destination is allowed per stream.
 
-## Azure Monitor agent
-A DCR for [Azure Monitor agent](../agents/data-collection-rule-azure-monitor-agent.md) contains the sections below. For a sample, see [Sample data collection rule - agent](../agents/data-collection-rule-sample-agent.md).
+## Azure Monitor Agent
+A DCR for [Azure Monitor Agent](../agents/data-collection-rule-azure-monitor-agent.md) contains the following sections. For a sample, see [Sample data collection rule - agent](../agents/data-collection-rule-sample-agent.md).
 
 ### dataSources
-Unique source of monitoring data with its own format and method of exposing its data. Examples of a data source include Windows event log, performance counters, and syslog. Each data source matches a particular data source type as described below.
+This unique source of monitoring data has its own format and method of exposing its data. Examples of a data source include Windows event log, performance counters, and Syslog. Each data source matches a particular data source type as described in the following table.
 
-Each data source has a data source type. Each type defines a unique set of properties that must be specified for each data source. The data source types currently available are shown in the following table.
+Each data source has a data source type. Each type defines a unique set of properties that must be specified for each data source. The data source types currently available appear in the following table.
 
 | Data source type | Description |
 |:---|:---|
@@ -46,18 +48,15 @@ Each data source has a data source type. Each type defines a unique set of prope
 | syslog | Syslog events on Linux |
 | windowsEventLogs | Windows event log |
 
-
-### Streams
-Unique handle that describes a set of data sources that will be transformed and schematized as one type. Each data source requires one or more streams, and one stream may be used by multiple data sources. All data sources in a stream share a common schema. Use multiple streams for example, when you want to send a particular data source to multiple tables in the same Log Analytics workspace.
+### Streams
+This unique handle describes a set of data sources that will be transformed and schematized as one type. Each data source requires one or more streams, and one stream can be used by multiple data sources. All data sources in a stream share a common schema. Use multiple streams, for example, when you want to send a particular data source to multiple tables in the same Log Analytics workspace.
 
 ### destinations
-Set of destinations where the data should be sent. Examples include Log Analytics workspace and Azure Monitor Metrics. Multiple destinations are allowed for multi-homing scenario.
-
-### dataFlows
-Definition of which streams should be sent to which destinations.
-
+This set of destinations indicates where the data should be sent. Examples include Log Analytics workspace and Azure Monitor Metrics. Multiple destinations are allowed for multi-homing scenarios.
 
+### dataFlows
+The definition indicates which streams should be sent to which destinations.
 
 ## Next steps
 
-- [Overview of data collection rules including methods for creating them.](data-collection-rule-overview.md)
+[Overview of data collection rules and methods for creating them](data-collection-rule-overview.md)
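For orientation (not part of the diff above): a minimal custom-logs DCR consistent with the `streamDeclarations`/`destinations`/`dataFlows` layout described in this file might look like the following sketch. The endpoint and workspace resource IDs in braces, the stream name `Custom-MyAppEvents`, the destination name `myWorkspaceDest`, and the target table `MyAppEvents_CL` are all hypothetical placeholders. Note how the stream key begins with *Custom-*, the friendly destination name is reused in `dataFlows`, and `outputStream` has the `Custom-[tableName]` shape because the target is a custom table.

```json
{
  "location": "eastus",
  "properties": {
    "dataCollectionEndpointId": "/subscriptions/{subscription-id}/resourceGroups/{rg}/providers/Microsoft.Insights/dataCollectionEndpoints/{dce-name}",
    "streamDeclarations": {
      "Custom-MyAppEvents": {
        "columns": [
          { "name": "Time", "type": "datetime" },
          { "name": "Computer", "type": "string" },
          { "name": "AdditionalContext", "type": "string" }
        ]
      }
    },
    "destinations": {
      "logAnalytics": [
        {
          "workspaceResourceId": "/subscriptions/{subscription-id}/resourceGroups/{rg}/providers/Microsoft.OperationalInsights/workspaces/{workspace-name}",
          "name": "myWorkspaceDest"
        }
      ]
    },
    "dataFlows": [
      {
        "streams": [ "Custom-MyAppEvents" ],
        "destinations": [ "myWorkspaceDest" ],
        "transformKql": "source | extend TimeGenerated = Time",
        "outputStream": "Custom-MyAppEvents_CL"
      }
    ]
  }
}
```

An agent DCR swaps `streamDeclarations` for typed `dataSources` entries, each bound to a known stream such as `Microsoft-Perf` or `Microsoft-Event`. A sketch with the same hypothetical workspace placeholder and hypothetical data source names:

```json
{
  "location": "eastus",
  "properties": {
    "dataSources": {
      "performanceCounters": [
        {
          "name": "basicPerfCounters",
          "streams": [ "Microsoft-Perf" ],
          "samplingFrequencyInSeconds": 60,
          "counterSpecifiers": [ "\\Processor(_Total)\\% Processor Time" ]
        }
      ],
      "windowsEventLogs": [
        {
          "name": "systemCriticalAndError",
          "streams": [ "Microsoft-Event" ],
          "xPathQueries": [ "System!*[System[(Level = 1 or Level = 2)]]" ]
        }
      ]
    },
    "destinations": {
      "logAnalytics": [
        {
          "workspaceResourceId": "/subscriptions/{subscription-id}/resourceGroups/{rg}/providers/Microsoft.OperationalInsights/workspaces/{workspace-name}",
          "name": "myWorkspaceDest"
        }
      ]
    },
    "dataFlows": [
      {
        "streams": [ "Microsoft-Perf", "Microsoft-Event" ],
        "destinations": [ "myWorkspaceDest" ]
      }
    ]
  }
}
```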
