
Commit fed9e4b

Merge pull request #285577 from MicrosoftDocs/main

8/27 11:00 AM IST Publish

2 parents: fe7bfd1 + d358f0b

78 files changed (+439, -109 lines)

articles/azure-monitor/agents/data-collection-log-json.md

Lines changed: 6 additions & 1 deletion
@@ -2,7 +2,7 @@
 title: Collect logs from a JSON file with Azure Monitor Agent
 description: Configure a data collection rule to collect log data from a JSON file on a virtual machine using Azure Monitor Agent.
 ms.topic: conceptual
-ms.date: 07/12/2024
+ms.date: 08/23/2024
 author: guywi-ms
 ms.author: guywild
 ms.reviewer: jeffwo
@@ -88,6 +88,10 @@ $tableParams = @'
         {
             "name": "FilePath",
             "type": "string"
+        },
+        {
+            "name": "Computer",
+            "type": "string"
         }
     ]
 }
@@ -117,6 +121,7 @@ JSON files include a property name with each value, and the incoming stream in t
 |:---|:---|:---|
 | `TimeGenerated` | datetime | The time the record was generated. This value will be automatically populated with the time the record is added to the Log Analytics workspace if it's not included in the incoming stream. |
 | `FilePath` | string | If you add this column to the incoming stream in the DCR, it will be populated with the path to the log file. This column is not created automatically and can't be added using the portal. You must manually modify the DCR created by the portal or create the DCR using another method where you can explicitly define the incoming stream. |
+| `Computer` | string | If you add this column to the incoming stream in the DCR, it will be populated with the name of the computer with the log file. This column is not created automatically and can't be added using the portal. You must manually modify the DCR created by the portal or create the DCR using another method where you can explicitly define the incoming stream. |
 
 ### Transformation
 The [transformation](../essentials/data-collection-transformations.md) potentially modifies the incoming stream to filter records or to modify the schema to match the target table. If the schema of the incoming stream is the same as the target table, then you can use the default transformation of `source`. If not, then modify the `transformKql` section of the ARM template with a KQL query that returns the required schema.
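In the DCR, that transformation lives in a `dataFlows` entry. A minimal sketch of the non-default case follows; the stream, destination, and table names here are hypothetical placeholders, not taken from this commit:

```json
"dataFlows": [
  {
    "streams": ["Custom-MyJsonStream"],
    "destinations": ["myLogAnalyticsDest"],
    "transformKql": "source | where isnotempty(FilePath) | project TimeGenerated, FilePath, Computer",
    "outputStream": "Custom-MyTable_CL"
  }
]
```

The `where` clause filters records and `project` keeps only the columns defined in the target table; using just `source` as the `transformKql` value would forward the stream unchanged.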

articles/azure-monitor/agents/data-collection-log-text.md

Lines changed: 12 additions & 3 deletions
@@ -2,7 +2,7 @@
 title: Collect logs from a text file with Azure Monitor Agent
 description: Configure a data collection rule to collect log data from a text file on a virtual machine using Azure Monitor Agent.
 ms.topic: conceptual
-ms.date: 07/12/2024
+ms.date: 08/23/2024
 author: guywi-ms
 ms.author: guywild
 ms.reviewer: jeffwo
@@ -29,7 +29,7 @@ The following diagram shows the basic operation of collecting log data from a te
 4. If a custom transformation is used, the log entry can be parsed into multiple columns in the target table.
 
 
-:::image type="content" source="media/data-collection-log-text/text-log-collection.png" lightbox="media/data-collection-log-text/text-log-collection.png" alt-text="Diagram showing collection of a text log by the Azure Monitor agent, showing both simple collection and a transformation for a comma-delimited file.":::
+:::image type="content" source="media/data-collection-log-text/text-log-collection.png" lightbox="media/data-collection-log-text/text-log-collection.png" alt-text="Diagram showing collection of a text log by the Azure Monitor agent, showing both simple collection and a transformation for a comma-delimited file." border="false":::
 
 
 ## Text file requirements and best practices
@@ -60,6 +60,7 @@ The incoming stream of data includes the columns in the following table.
 | `TimeGenerated` | datetime | The time the record was generated. This value will be automatically populated with the time the record is added to the Log Analytics workspace. You can override this value using a transformation to set `TimeGenerated` to another value. |
 | `RawData` | string | The entire log entry in a single column. You can use a transformation if you want to break down this data into multiple columns before sending to the table. |
 | `FilePath` | string | If you add this column to the incoming stream in the DCR, it will be populated with the path to the log file. This column is not created automatically and can't be added using the portal. You must manually modify the DCR created by the portal or create the DCR using another method where you can explicitly define the incoming stream. |
+| `Computer` | string | If you add this column to the incoming stream in the DCR, it will be populated with the name of the computer with the log file. This column is not created automatically and can't be added using the portal. You must manually modify the DCR created by the portal or create the DCR using another method where you can explicitly define the incoming stream. |
 
 
 ## Custom table
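The `RawData` column described in the table above can be broken into multiple columns inside the DCR's transformation rather than at query time. A hedged sketch for a comma-delimited log; the field names `Level` and `Message` are hypothetical and not part of this commit:

```json
"transformKql": "source | extend d = split(RawData, ',') | project TimeGenerated, FilePath, Computer, Level = tostring(d[0]), Message = tostring(d[1])"
```

With such a transformation, the target table would need `Level` and `Message` columns in place of `RawData`.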
@@ -69,7 +70,7 @@ Before you can collect log data from a text file, you must create a custom table
 > You shouldn’t use an existing custom log table used by MMA agents. Your MMA agents won't be able to write to the table once the first AMA agent writes to the table. You should create a new table for AMA to use to prevent MMA data loss.
 
 
-For example, you can use the following PowerShell script to create a custom table with `RawData` and `FilePath`. You wouldn't need a transformation for this table because the schema matches the default schema of the incoming stream.
+For example, you can use the following PowerShell script to create a custom table with `RawData`, `FilePath`, and `Computer`. You wouldn't need a transformation for this table because the schema matches the default schema of the incoming stream.
 
 
 ```powershell
@@ -90,6 +91,10 @@ $tableParams = @'
         {
             "name": "FilePath",
             "type": "String"
+        },
+        {
+            "name": "Computer",
+            "type": "String"
         }
     ]
 }
@@ -187,6 +192,10 @@ Use the following ARM template to create or modify a DCR for collecting text log
         {
             "name": "FilePath",
             "type": "string"
+        },
+        {
+            "name": "Computer",
+            "type": "string"
         }
     ]
 }
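Pieced together from the hunks above, the stream declaration's column list for the text-log DCR after this commit would read roughly as follows. This is a sketch: the diff shows only the changed region, so the surrounding template structure and exact column order are assumed:

```json
"columns": [
  { "name": "TimeGenerated", "type": "datetime" },
  { "name": "RawData", "type": "string" },
  { "name": "FilePath", "type": "string" },
  { "name": "Computer", "type": "string" }
]
```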