`articles/azure-monitor/agents/data-collection-log-json.md` — 6 additions, 1 deletion
```diff
@@ -2,7 +2,7 @@
 title: Collect logs from a JSON file with Azure Monitor Agent
 description: Configure a data collection rule to collect log data from a JSON file on a virtual machine using Azure Monitor Agent.
 ms.topic: conceptual
-ms.date: 07/12/2024
+ms.date: 08/23/2024
 author: guywi-ms
 ms.author: guywild
 ms.reviewer: jeffwo
```
```diff
@@ -88,6 +88,10 @@ $tableParams = @'
             {
                 "name": "FilePath",
                 "type": "string"
+            },
+            {
+                "name": "Computer",
+                "type": "string"
             }
         ]
     }
```
```diff
@@ -117,6 +121,7 @@ JSON files include a property name with each value, and the incoming stream in t
 |:---|:---|:---|
 |`TimeGenerated`| datetime | The time the record was generated. This value will be automatically populated with the time the record is added to the Log Analytics workspace if it's not included in the incoming stream. |
 |`FilePath`| string | If you add this column to the incoming stream in the DCR, it will be populated with the path to the log file. This column is not created automatically and can't be added using the portal. You must manually modify the DCR created by the portal or create the DCR using another method where you can explicitly define the incoming stream. |
+|`Computer`| string | If you add this column to the incoming stream in the DCR, it will be populated with the name of the computer with the log file. This column is not created automatically and can't be added using the portal. You must manually modify the DCR created by the portal or create the DCR using another method where you can explicitly define the incoming stream. |
 
 ### Transformation
 The [transformation](../essentials/data-collection-transformations.md) potentially modifies the incoming stream to filter records or to modify the schema to match the target table. If the schema of the incoming stream is the same as the target table, then you can use the default transformation of `source`. If not, then modify the `transformKql` section of the ARM template with a KQL query that returns the required schema.
```
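As a hedged sketch of what that `transformKql` modification could look like (the stream name `Custom-Json-stream`, the destination `myWorkspace`, the output table `MyTable_CL`, and the `Message` column are all hypothetical, not from this PR), a `dataFlows` entry that filters out empty records and projects the documented columns might be:

```json
"dataFlows": [
    {
        "streams": ["Custom-Json-stream"],
        "destinations": ["myWorkspace"],
        "transformKql": "source | where isnotempty(Message) | project TimeGenerated, Computer, FilePath, Message",
        "outputStream": "Custom-MyTable_CL"
    }
]
```

With the default schema, `"transformKql": "source"` passes the incoming stream through unchanged.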
`articles/azure-monitor/agents/data-collection-log-text.md` — 12 additions, 3 deletions
```diff
@@ -2,7 +2,7 @@
 title: Collect logs from a text file with Azure Monitor Agent
 description: Configure a data collection rule to collect log data from a text file on a virtual machine using Azure Monitor Agent.
 ms.topic: conceptual
-ms.date: 07/12/2024
+ms.date: 08/23/2024
 author: guywi-ms
 ms.author: guywild
 ms.reviewer: jeffwo
```
```diff
@@ -29,7 +29,7 @@ The following diagram shows the basic operation of collecting log data from a te
 4. If a custom transformation is used, the log entry can be parsed into multiple columns in the target table.
 
 
-:::image type="content" source="media/data-collection-log-text/text-log-collection.png" lightbox="media/data-collection-log-text/text-log-collection.png" alt-text="Diagram showing collection of a text log by the Azure Monitor agent, showing both simple collection and a transformation for a comma-delimited file.":::
+:::image type="content" source="media/data-collection-log-text/text-log-collection.png" lightbox="media/data-collection-log-text/text-log-collection.png" alt-text="Diagram showing collection of a text log by the Azure Monitor agent, showing both simple collection and a transformation for a comma-delimited file." border="false":::
 
 
 ## Text file requirements and best practices
```
```diff
@@ -60,6 +60,7 @@ The incoming stream of data includes the columns in the following table.
 |`TimeGenerated`| datetime | The time the record was generated. This value will be automatically populated with the time the record is added to the Log Analytics workspace. You can override this value using a transformation to set `TimeGenerated` to another value. |
 |`RawData`| string | The entire log entry in a single column. You can use a transformation if you want to break down this data into multiple columns before sending to the table. |
 |`FilePath`| string | If you add this column to the incoming stream in the DCR, it will be populated with the path to the log file. This column is not created automatically and can't be added using the portal. You must manually modify the DCR created by the portal or create the DCR using another method where you can explicitly define the incoming stream. |
+|`Computer`| string | If you add this column to the incoming stream in the DCR, it will be populated with the name of the computer with the log file. This column is not created automatically and can't be added using the portal. You must manually modify the DCR created by the portal or create the DCR using another method where you can explicitly define the incoming stream. |
 
 
 ## Custom table
```
````diff
@@ -69,7 +70,7 @@ Before you can collect log data from a text file, you must create a custom table
 > You shouldn’t use an existing custom log table used by MMA agents. Your MMA agents won't be able to write to the table once the first AMA agent writes to the table. You should create a new table for AMA to use to prevent MMA data loss.
 
 
-For example, you can use the following PowerShell script to create a custom table with `RawData` and `FilePath`. You wouldn't need a transformation for this table because the schema matches the default schema of the incoming stream.
+For example, you can use the following PowerShell script to create a custom table with `RawData`, `FilePath`, and `Computer`. You wouldn't need a transformation for this table because the schema matches the default schema of the incoming stream.
 
 
 ```powershell
````
```diff
@@ -90,6 +91,10 @@ $tableParams = @'
             {
                 "name": "FilePath",
                 "type": "String"
+            },
+            {
+                "name": "Computer",
+                "type": "String"
             }
         ]
     }
```
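Assembled from the pieces visible in this PR, a hedged sketch of the full table-creation script follows. The table name `MyTable_CL`, the `{subscription}`, `{resourcegroup}`, and `{workspace}` placeholders, and the API version are assumptions; substitute your own values before running.

```powershell
# Table schema for the custom table, including the Computer column added in
# this PR. TimeGenerated, RawData, FilePath, and Computer are the documented
# incoming-stream columns; the table name MyTable_CL is a placeholder.
$tableParams = @'
{
    "properties": {
        "schema": {
            "name": "MyTable_CL",
            "columns": [
                { "name": "TimeGenerated", "type": "DateTime" },
                { "name": "RawData", "type": "String" },
                { "name": "FilePath", "type": "String" },
                { "name": "Computer", "type": "String" }
            ]
        }
    }
}
'@

# PUT the table definition to the Log Analytics workspace
# (requires the Az PowerShell module and an authenticated session).
Invoke-AzRestMethod -Path "/subscriptions/{subscription}/resourcegroups/{resourcegroup}/providers/microsoft.operationalinsights/workspaces/{workspace}/tables/MyTable_CL?api-version=2021-12-01-preview" -Method PUT -Payload $tableParams
```

Because this schema matches the default incoming stream, the DCR can use the `source` transformation as-is.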
```diff
@@ -187,6 +192,10 @@ Use the following ARM template to create or modify a DCR for collecting text log
```