
Commit d489131

Merge pull request #281867 from bwren/json-fix
AMA json log fixes
2 parents 92a21e2 + 8b298e2 commit d489131

2 files changed: +38 −42 lines changed

articles/azure-monitor/agents/data-collection-log-json.md

Lines changed: 37 additions & 41 deletions
@@ -52,8 +52,8 @@ Adhere to the following recommendations to ensure that you don't experience data
 ## Custom table
 Before you can collect log data from a JSON file, you must create a custom table in your Log Analytics workspace to receive the data. The table schema must match the columns in the incoming stream, or you must add a transformation to ensure that the output schema matches the table.
 
->
-> Warning: You shouldnt use an existing custom table used by MMA agents. Your MMA agents won't be able to write to the table once the first AMA agent writes to the table. You should create a new table for AMA to use to prevent MMA data loss.
+> [!WARNING]
+> You shouldn't use an existing custom table used by the Log Analytics agent. Legacy agents won't be able to write to the table once the first Azure Monitor agent writes to it. Create a new table for Azure Monitor agent to use to prevent Log Analytics agent data loss.
 >
 
 For example, you can use the following PowerShell script to create a custom table with multiple columns.
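
The PowerShell script that line refers to sits outside this hunk. A minimal sketch of the pattern, assuming a hypothetical table `MyJsonLog_CL` with illustrative columns (the column list must match the properties in your own JSON log); the `Invoke-AzRestMethod` call follows the Tables API path visible in the next hunk header:

```powershell
# Sketch only: table name and columns are illustrative, not part of this commit.
$tableParams = @'
{
    "properties": {
        "schema": {
            "name": "MyJsonLog_CL",
            "columns": [
                { "name": "TimeGenerated", "type": "datetime" },
                { "name": "MyStringColumn", "type": "string" },
                { "name": "MyNumericColumn", "type": "int" }
            ]
        }
    }
}
'@

# PUT against the workspace Tables API; replace the bracketed IDs with your own values.
Invoke-AzRestMethod -Method PUT -Payload $tableParams -Path "/subscriptions/{subscription}/resourcegroups/{resourcegroup}/providers/Microsoft.OperationalInsights/workspaces/{workspace}/tables/MyJsonLog_CL?api-version=2021-12-01-preview"
```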
@@ -102,40 +102,35 @@ Invoke-AzRestMethod -Path "/subscriptions/{subscription}/resourcegroups/{resourc
 ## Create a data collection rule for a JSON file
 
 > [!NOTE]
-> The agent based JSON custom file ingestion is currently in preview and does not have a complete UI experience in the portal yet. While you can create the DCR using the portal, you must modify it to define the columns in the incoming stream. See the **Resource Manager template** tab for details on creating the required DCR.
+> The agent-based JSON custom file ingestion is currently in preview and does not have a complete UI experience in the portal yet. While you can create the DCR using the portal, you must modify it to define the columns in the incoming stream. This section includes details on creating the DCR using an ARM template.
+
+### Incoming stream schema
+JSON files include a property name with each value, and the incoming stream in the DCR needs to include a column matching the name of each property. You need to modify the `columns` section of the ARM template with the columns from your log.
 
-### Incoming stream
-JSON files include a property name with each value, and the incoming stream in the DCR needs to include a column matching the name of each property. If you create the DCR using the Azure portal, the columns in the following table will be included in the incoming stream, and you must manually modify the DCR or create it using another method where you can explicitly define the incoming stream.
+The following table describes optional columns that you can include in addition to the columns defining the data in your log file.
 
 | Column | Type | Description |
 |:---|:---|:---|
-| `TimeGenerated` | datetime | The time the record was generated. |
-| `RawData` | string | This column will be empty for a JSON log. |
+| `TimeGenerated` | datetime | The time the record was generated. This value will be automatically populated with the time the record is added to the Log Analytics workspace if it's not included in the incoming stream. |
 | `FilePath` | string | If you add this column to the incoming stream in the DCR, it will be populated with the path to the log file. This column is not created automatically and can't be added using the portal. You must manually modify the DCR created by the portal or create the DCR using another method where you can explicitly define the incoming stream. |
 
-### [Portal](#tab/portal)
+### Transformation
+The [transformation](../essentials/data-collection-transformations.md) potentially modifies the incoming stream to filter records or to modify the schema to match the target table. If the schema of the incoming stream is the same as the target table, then you can use the default transformation of `source`. If not, then modify the `transformKql` section of the ARM template with a KQL query that returns the required schema.
 
-Create a data collection rule, as described in [Collect data with Azure Monitor Agent](./azure-monitor-agent-data-collection.md). In the **Collect and deliver** step, select **JSON Logs** from the **Data source type** dropdown.
+### ARM template
+
+Use the following ARM template to create a DCR for collecting JSON log files, making the changes described in the previous sections. The following table describes the parameters that require values when you deploy the template.
 
 | Setting | Description |
 |:---|:---|
-| File pattern | Identifies the location and name of log files on the local disk. Use a wildcard for filenames that vary, for example when a new file is created each day with a new name. You can enter multiple file patterns separated by commas.<br><br>Examples:<br>- C:\Logs\MyLog.json<br>- C:\Logs\MyLog*.json<br>- C:\App01\AppLog.json, C:\App02\AppLog.json<br>- /var/mylog.json<br>- /var/mylog*.json |
+| Data collection rule name | Unique name for the DCR. |
+| Location | Region for the DCR. Must be the same location as the Log Analytics workspace. |
+| File patterns | Identifies the location and name of log files on the local disk. Use a wildcard for filenames that vary, for example when a new file is created each day with a new name. You can enter multiple file patterns separated by commas (AMA version 1.26 or higher required for multiple file patterns on Linux).<br><br>Examples:<br>- C:\Logs\MyLog.json<br>- C:\Logs\MyLog*.json<br>- C:\App01\AppLog.json, C:\App02\AppLog.json<br>- /var/mylog.json<br>- /var/mylog*.json |
 | Table name | Name of the destination table in your Log Analytics Workspace. |
-| Record delimiter | Not currently used but reserved for future potential use allowing delimiters other than the currently supported end of line (`/r/n`). |
-| Transform | [Ingestion-time transformation](../essentials/data-collection-transformations.md) to filter records or to format the incoming data for the destination table. Use `source` to leave the incoming data unchanged. |
-
-
-
-### [Resource Manager template](#tab/arm)
-
-Use the following ARM template to create a DCR for collecting text log files. In addition to the parameter values, you may need to modify the following values in the template:
-
-- `columns`: Modify with the list of columns in the JSON log that you're collecting.
-- `transformKql`: Modify the default transformation if the schema of the incoming stream doesn't match the schema of the target table. The output schema of the transformation must match the schema of the target table.
+| Workspace resource ID | Resource ID of the Log Analytics workspace with the target table. |
 
 > [!IMPORTANT]
-> If you create the DCR using an ARM template, you still must associate the DCR with the agents that will use it. You can edit the DCR in the Azure portal and select the agents as described in [Add resources](./azure-monitor-agent-data-collection.md#add-resources)
-
+> When you create the DCR using an ARM template, you still must associate the DCR with the agents that will use it. You can edit the DCR in the Azure portal and select the agents as described in [Add resources](./azure-monitor-agent-data-collection.md#add-resources).
 
 ```json
 {
@@ -145,51 +140,52 @@ Use the following ARM template to create a DCR for collecting text log files. In
     "dataCollectionRuleName": {
       "type": "string",
       "metadata": {
-          "description": "Unique name for the DCR. "
+        "description": "Unique name for the DCR. "
       }
     },
     "location": {
       "type": "string",
       "metadata": {
-          "description": "Region for the DCR. Must be the same location as the Log Analytics workspace. "
+        "description": "Region for the DCR. Must be the same location as the Log Analytics workspace. "
       }
     },
     "filePatterns": {
       "type": "string",
       "metadata": {
-          "description": "Path on the local disk for the log file to collect. May include wildcards. Enter multiple file patterns separated by commas (AMA version 1.26 or higher required for multiple file patterns on Linux)."
+        "description": "Path on the local disk for the log file to collect. May include wildcards. Enter multiple file patterns separated by commas (AMA version 1.26 or higher required for multiple file patterns on Linux)."
       }
     },
     "tableName": {
       "type": "string",
       "metadata": {
-          "description": "Name of destination table in your Log Analytics workspace. "
+        "description": "Name of destination table in your Log Analytics workspace. "
      }
     },
     "workspaceResourceId": {
       "type": "string",
       "metadata": {
-          "description": "Resource ID of the Log Analytics workspace with the target table."
+        "description": "Resource ID of the Log Analytics workspace with the target table."
       }
     },
     "dataCollectionEndpointResourceId": {
-        "type": "string",
-        "metadata": { "description": "Resource ID of the Data Collection Endpoint to be used with this rule."
-        }
-      }
+      "type": "string",
+      "metadata": {
+        "description": "Resource ID of the Data Collection Endpoint to be used with this rule."
+      }
+    }
   },
   "variables": {
-      "tableOutputStream": "[concat('Custom-', parameters('tableName'))]]"
+    "tableOutputStream": "[concat('Custom-', parameters('tableName'))]"
   },
   "resources": [
     {
       "type": "Microsoft.Insights/dataCollectionRules",
-      "name": "[parameters('dataCollectionRuleName')]",
-      "location": "[parameters('location')]",
       "apiVersion": "2022-06-01",
+      "name": "[parameters('dataCollectionRuleName')]",
+      "location": "[parameters('location')]",
       "properties": {
         "streamDeclarations": {
-          "Custom-JSONLog-stream": {
+          "Custom-Json-stream": {
             "columns": [
               {
                 "name": "TimeGenerated",
@@ -228,31 +224,31 @@ Use the following ARM template to create a DCR for collecting text log files. In
                 "[parameters('filePatterns')]"
               ],
               "format": "json",
-              "name": "Custom-Json-dataSource"
+              "name": "Custom-Json-stream"
             }
           ]
         },
         "destinations": {
           "logAnalytics": [
             {
-                "workspaceResourceId": "[parameters('workspaceResourceId')]",
+              "workspaceResourceId": "[parameters('workspaceResourceId')]",
               "name": "workspace"
             }
           ]
         },
         "dataFlows": [
           {
             "streams": [
-              "Custom-JSONLog-stream"
+              "Custom-Json-stream"
             ],
             "destinations": [
               "workspace"
             ],
             "transformKql": "source",
             "outputStream": "[variables('tableOutputStream')]"
           }
-        ]
-        "dataCollectionEndpointId" : "[parameters('dataCollectionEndpointResourceId')]"
+        ],
+        "dataCollectionEndpointId": "[parameters('dataCollectionEndpointResourceId')]"
       }
     }
   ]
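
The template above keeps the default `transformKql` of `source`, which passes the incoming stream through unchanged. A minimal sketch of a non-default transformation, assuming a hypothetical incoming column `EventTime` that the target table expects as `TimeGenerated`; the KQL string is the value you would place in `transformKql`, shown here in a PowerShell here-string for convenience:

```powershell
# Sketch only: the EventTime column is hypothetical, not part of this commit.
# Renames EventTime to TimeGenerated so the output schema matches the target table.
$transformKql = @'
source
| extend TimeGenerated = todatetime(EventTime)
| project-away EventTime
'@
```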

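Deploying the template follows the standard ARM pattern. A minimal sketch, assuming a hypothetical local file `dcr-json-log.json` and illustrative parameter values; the placeholder resource IDs must be replaced with your own:

```powershell
# Sketch only: file name and parameter values are illustrative, not part of this commit.
New-AzResourceGroupDeployment -ResourceGroupName "my-resource-group" `
    -TemplateFile ".\dcr-json-log.json" `
    -TemplateParameterObject @{
        dataCollectionRuleName           = "dcr-json-log"
        location                         = "eastus"
        filePatterns                     = "/var/mylog*.json"
        tableName                        = "MyJsonLog_CL"
        workspaceResourceId              = "/subscriptions/{subscription}/resourcegroups/{resourcegroup}/providers/Microsoft.OperationalInsights/workspaces/{workspace}"
        dataCollectionEndpointResourceId = "/subscriptions/{subscription}/resourcegroups/{resourcegroup}/providers/Microsoft.Insights/dataCollectionEndpoints/{endpoint}"
    }

# As the [!IMPORTANT] note above points out, the DCR still has to be
# associated with the agents that will use it after deployment.
```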
articles/azure-monitor/toc.yml

Lines changed: 1 addition & 1 deletion
@@ -709,7 +709,7 @@ items:
       - name: SNMP traps
         displayName: data collection rule,Azure Monitor agent
         href: agents/data-collection-snmp-data.md
-      - name: Collect Windows Firewall logs
+      - name: Windows Firewall logs
         displayName: data collection rule,Azure Monitor Agent,firewall logs
         href: agents/data-sources-firewall-logs.md
       - name: Configure a syslog forwarder
