Commit 81aa932

AMA json log fixes
1 parent b1ec5c1 commit 81aa932

File tree

2 files changed: +49 -48 lines changed


articles/azure-monitor/agents/data-collection-log-json.md

Lines changed: 48 additions & 47 deletions
````diff
@@ -106,101 +106,101 @@ Invoke-AzRestMethod -Path "/subscriptions/{subscription}/resourcegroups/{resourc
 ## Create a data collection rule for a JSON file
 
 > [!NOTE]
-> The agent based JSON custom file ingestion is currently in preview and does not have a complete UI experience in the portal yet. While you can create the DCR using the portal, you must modify it to define the columns in the incoming stream. See the **Resource Manager template** tab for details on creating the required DCR.
+> The agent based JSON custom file ingestion is currently in preview and does not have a complete UI experience in the portal yet. While you can create the DCR using the portal, you must modify it to define the columns in the incoming stream. This section includes details on creating the DCR using an ARM template.
 
-### Incoming stream
-JSON files include a property name with each value, and the incoming stream in the DCR needs to include a column matching the name of each property. If you create the DCR using the Azure portal, the columns in the following table will be included in the incoming stream, and you must manually modify the DCR or create it using another method where you can explicitly define the incoming stream.
+### Incoming stream schema
+
+JSON files include a property name with each value, and the incoming stream in the DCR needs to include a column matching the name of each property. The following table describes optional columns that you can include in addition to the columns defining the data in your log file.
 
 | Column | Type | Description |
 |:---|:---|:---|
-| `TimeGenerated` | datetime | The time the record was generated. |
-| `RawData` | string | This column will be empty for a JSON log. |
+| `TimeGenerated` | datetime | The time the record was generated. This value will be automatically populated with the time the record is added to the Log Analytics workspace if it's not included in the incoming stream. |
 | `FilePath` | string | If you add this column to the incoming stream in the DCR, it will be populated with the path to the log file. This column is not created automatically and can't be added using the portal. You must manually modify the DCR created by the portal or create the DCR using another method where you can explicitly define the incoming stream. |
-| `Computer` | string | If you add this column to the incoming stream in the DCR, it will be populated with the name of the computer. This column is not created automatically and can't be added using the portal. You must manually modify the DCR created by the portal or create the DCR using another method where you can explicitly define the incoming stream. |
 
-### [Portal](#tab/portal)
+Use the following ARM template to create a DCR for collecting text log files. You may need to modify the following values in the template itself:
+
+- `columns`: Modify with the list of columns in the JSON log that you're collecting.
+- `transformKql`: Modify the default transformation if the schema of the incoming stream doesn't match the schema of the target table. The output schema of the [transformation](../essentials/data-collection-transformations.md) must match the schema of the target table.
 
-Create a data collection rule, as described in [Collect data with Azure Monitor Agent](./azure-monitor-agent-data-collection.md). In the **Collect and deliver** step, select **JSON Logs** from the **Data source type** dropdown.
+The following table describes the parameters.
 
 | Setting | Description |
 |:---|:---|
-| File pattern | Identifies the location and name of log files on the local disk. Use a wildcard for filenames that vary, for example when a new file is created each day with a new name. You can enter multiple file patterns separated by commas.<br><br>Examples:<br>- C:\Logs\MyLog.json<br>- C:\Logs\MyLog*.json<br>- C:\App01\AppLog.json, C:\App02\AppLog.json<br>- /var/mylog.json<br>- /var/mylog*.json |
+| Data collection rule name | Unique name for the DCR. |
+| Location | Region for the DCR. Must be the same location as the Log Analytics workspace. |
+| File patterns | Identifies the location and name of log files on the local disk. Use a wildcard for filenames that vary, for example when a new file is created each day with a new name. You can enter multiple file patterns separated by commas (AMA version 1.26 or higher required for multiple file patterns on Linux).<br><br>Examples:<br>- C:\Logs\MyLog.json<br>- C:\Logs\MyLog*.json<br>- C:\App01\AppLog.json, C:\App02\AppLog.json<br>- /var/mylog.json<br>- /var/mylog*.json |
 | Table name | Name of the destination table in your Log Analytics Workspace. |
-| Record delimiter | Not currently used but reserved for future potential use allowing delimiters other than the currently supported end of line (`/r/n`). |
-| Transform | [Ingestion-time transformation](../essentials/data-collection-transformations.md) to filter records or to format the incoming data for the destination table. Use `source` to leave the incoming data unchanged. |
-
-
-
-### [Resource Manager template](#tab/arm)
-
-Use the following ARM template to create a DCR for collecting text log files. In addition to the parameter values, you may need to modify the following values in the template:
-
-- `columns`: Modify with the list of columns in the JSON log that you're collecting.
-- `transformKql`: Modify the default transformation if the schema of the incoming stream doesn't match the schema of the target table. The output schema of the transformation must match the schema of the target table.
+| Workspace resource ID | Resource ID of the Log Analytics workspace with the target table. |
 
 > [!IMPORTANT]
-> If you create the DCR using an ARM template, you still must associate the DCR with the agents that will use it. You can edit the DCR in the Azure portal and select the agents as described in [Add resources](./azure-monitor-agent-data-collection.md#add-resources)
-
+> When you create the DCR using an ARM template, you still must associate the DCR with the agents that will use it. You can edit the DCR in the Azure portal and select the agents as described in [Add resources](./azure-monitor-agent-data-collection.md#add-resources)
 
 ```json
 {
   "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
   "contentVersion": "1.0.0.0",
   "parameters": {
     "dataCollectionRuleName": {
-      "type": "string",
+      "type": "String",
       "metadata": {
-        "description": "Unique name for the DCR. "
-      },
+        "description": "Unique name for the DCR. "
+      }
     },
     "location": {
-      "type": "string",
+      "type": "String",
       "metadata": {
-        "description": "Region for the DCR. Must be the same location as the Log Analytics workspace. "
+        "description": "Region for the DCR. Must be the same location as the Log Analytics workspace. "
+      }
     },
     "filePatterns": {
-      "type": "string",
+      "type": "String",
       "metadata": {
-        "description": "Path on the local disk for the log file to collect. May include wildcards.Enter multiple file patterns separated by commas (AMA version 1.26 or higher required for multiple file patterns on Linux)."
-      },
+        "description": "Path on the local disk for the log file to collect. May include wildcards.Enter multiple file patterns separated by commas (AMA version 1.26 or higher required for multiple file patterns on Linux)."
+      }
     },
     "tableName": {
-      "type": "string",
+      "type": "String",
       "metadata": {
-        "description": "Name of destination table in your Log Analytics workspace. "
-      },
+        "description": "Name of destination table in your Log Analytics workspace. "
+      }
    },
     "workspaceResourceId": {
-      "type": "string",
+      "type": "String",
+      "metadata": {
+        "description": "Resource ID of the Log Analytics workspace with the target table."
+      }
+    },
+    "dataCollectionEndpointResourceId": {
+      "type": "String",
       "metadata": {
-        "description": "Resource ID of the Log Analytics workspace with the target table."
-      },
+        "description": "Resource ID of the Data Collection Endpoint to be used with this rule."
+      }
     }
   },
   "variables": {
-    "tableOutputStream": "['Custom-',concat(parameters('tableName'))]"
+    "tableOutputStream": "[concat('Custom-', parameters('tableName'))]"
   },
   "resources": [
     {
       "type": "Microsoft.Insights/dataCollectionRules",
-      "name": "[parameters('dataCollectionRuleName')]",
-      "location": "[parameters('location')]",
       "apiVersion": "2022-06-01",
+      "name": "[parameters('dataCollectionRuleName')]",
+      "location": "[parameters('location')]",
       "properties": {
         "streamDeclarations": {
-          "Custom-JSONLog-stream": {
+          "Custom-Json-stream": {
             "columns": [
               {
                 "name": "TimeGenerated",
                 "type": "datetime"
               },
               {
                 "name": "FilePath",
-                "type": "String"
+                "type": "string"
               },
               {
                 "name": "Computer",
-                "type": "String"
+                "type": "string"
               },
               {
                 "name": "MyStringColumn",
````
```diff
@@ -216,7 +216,7 @@ Use the following ARM template to create a DCR for collecting text log files. In
               },
               {
                 "name": "MyBooleanColumn",
-                "type": "bool"
+                "type": "boolean"
               }
             ]
           }
```
```diff
@@ -225,7 +225,7 @@ Use the following ARM template to create a DCR for collecting text log files. In
         "logFiles": [
           {
             "streams": [
-              "Custom-Json-stream"
+              "Custom-JSONLog-stream"
             ],
             "filePatterns": [
               "[parameters('filePatterns')]"
```
```diff
@@ -238,23 +238,24 @@ Use the following ARM template to create a DCR for collecting text log files. In
         "destinations": {
           "logAnalytics": [
             {
-              "workspaceResourceId": "[parameters('workspaceResourceId')]",
+              "workspaceResourceId": "[parameters('workspaceResourceId')]",
               "name": "workspace"
             }
           ]
         },
         "dataFlows": [
           {
             "streams": [
-              "Custom-Json-dataSource"
+              "Custom-Json-stream"
             ],
             "destinations": [
               "workspace"
             ],
             "transformKql": "source",
             "outputStream": "[variables('tableOutputStream')]"
           }
-        ]
+        ],
+        "dataCollectionEndpointId": "[parameters('dataCollectionEndpointResourceId')]"
       }
     }
   ]
```
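A note on the `variables` fix in the hunks above: the old expression `"['Custom-',concat(parameters('tableName'))]"` placed the literal `'Custom-'` outside the `concat()` call, so it was not a valid ARM template expression; the commit moves it inside. A minimal sketch of the intended evaluation, where `concat` is a Python stand-in for the ARM template function and `MyTable_CL` is a hypothetical table name:

```python
# Stand-in for the ARM template concat() function, illustrating the
# corrected tableOutputStream expression:
#   "[concat('Custom-', parameters('tableName'))]"
def concat(*parts: str) -> str:
    return "".join(parts)

table_name = "MyTable_CL"  # hypothetical value of parameters('tableName')
output_stream = concat("Custom-", table_name)
print(output_stream)  # Custom-MyTable_CL
```

The output stream name must carry the `Custom-` prefix for the DCR to route data into a custom Log Analytics table.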

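Several hunks in the template only change a closing `},` to `}` or add a missing brace; these matter because the old sample left trailing commas and unbalanced braces that a strict JSON parser rejects. A contrived fragment (not copied from the template) showing the effect of a trailing comma:

```python
import json

# Contrived before/after fragments illustrating the comma fix.
before = '{"type": "string", "metadata": {"description": "Unique name"},}'
after = '{"type": "String", "metadata": {"description": "Unique name"}}'

def parses(text: str) -> bool:
    """Return True if the text is valid strict JSON."""
    try:
        json.loads(text)
        return True
    except json.JSONDecodeError:
        return False

print(parses(before), parses(after))  # False True
```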
articles/azure-monitor/toc.yml

Lines changed: 1 addition & 1 deletion
```diff
@@ -709,7 +709,7 @@ items:
 - name: SNMP traps
   displayName: data collection rule,Azure Monitor agent
   href: agents/data-collection-snmp-data.md
-- name: Collect Windows Firewall logs
+- name: Windows Firewall logs
   displayName: data collection rule,Azure Monitor Agent,firewall logs
   href: agents/data-sources-firewall-logs.md
 - name: Configure a syslog forwarder
```

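For readers of the article diff above: the stream declaration expects each line of the collected file to be a JSON object whose property names match the declared columns. A hedged sketch of one such record, using the sample columns from the template (`MyStringColumn`, `MyBooleanColumn`) with hypothetical values:

```python
import json
from datetime import datetime, timezone

# Hypothetical log record matching the stream declaration in the template
# diff above: one JSON object per line, property names matching columns.
record = {
    "TimeGenerated": datetime(2024, 1, 1, tzinfo=timezone.utc).isoformat(),
    "MyStringColumn": "sample value",
    "MyBooleanColumn": True,
}
line = json.dumps(record)  # one line of the JSON log file
print(line)
```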
0 commit comments
