
Commit 4ac3234

fixes

1 parent a54a0bc

1 file changed (+32 -27)

articles/azure-monitor/agents/data-collection-log-json.md

Lines changed: 32 additions & 27 deletions
@@ -52,8 +52,8 @@ Adhere to the following recommendations to ensure that you don't experience data
 ## Custom table
 Before you can collect log data from a JSON file, you must create a custom table in your Log Analytics workspace to receive the data. The table schema must match the columns in the incoming stream, or you must add a transformation to ensure that the output schema matches the table.
 
-> [!WARNING]
-> You shouldn't use an existing custom table used by Log Analytics agent. The legacy agents won't be able to write to the table once the first Azure Monitor agent writes to it. Create a new table for Azure Monitor agent to use to prevent Log Analytics agent data loss.
+>
+> Warning: You shouldn't use an existing custom table used by MMA agents. Your MMA agents won't be able to write to the table once the first AMA agent writes to the table. You should create a new table for AMA to use to prevent MMA data loss.
 >
 
 For example, you can use the following PowerShell script to create a custom table with multiple columns.
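The PowerShell script referenced here falls outside the hunk's context lines. For reference, a minimal sketch of such a script, assuming the `Invoke-AzRestMethod` Tables API call that appears in the next hunk's header; the table name `MyJsonTable_CL` and the `Code`/`Message` columns are hypothetical placeholders, not the article's actual script.

```powershell
# Minimal sketch, assuming the Az module and an authenticated session (Connect-AzAccount).
# The table name and the Code/Message columns are hypothetical; match the column list
# to the properties in your own JSON log. TimeGenerated is required in every table.
$tableParams = @'
{
    "properties": {
        "schema": {
            "name": "MyJsonTable_CL",
            "columns": [
                { "name": "TimeGenerated", "type": "datetime" },
                { "name": "FilePath", "type": "string" },
                { "name": "Code", "type": "int" },
                { "name": "Message", "type": "string" }
            ]
        }
    }
}
'@

# PUT the schema against the Tables API of the target Log Analytics workspace.
Invoke-AzRestMethod -Path "/subscriptions/{subscription}/resourcegroups/{resourcegroup}/providers/microsoft.operationalinsights/workspaces/{workspace}/tables/MyJsonTable_CL?api-version=2021-12-01-preview" -Method PUT -Payload $tableParams
```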
@@ -102,35 +102,40 @@ Invoke-AzRestMethod -Path "/subscriptions/{subscription}/resourcegroups/{resourc
 ## Create a data collection rule for a JSON file
 
 > [!NOTE]
-> The agent based JSON custom file ingestion is currently in preview and does not have a complete UI experience in the portal yet. While you can create the DCR using the portal, you must modify it to define the columns in the incoming stream. This section includes details on creating the DCR using an ARM template.
-
-### Incoming stream schema
-JSON files include a property name with each value, and the incoming stream in the DCR needs to include a column matching the name of each property. You need to modify the `columns` section of the ARM template with the columns from your log.
+> The agent based JSON custom file ingestion is currently in preview and does not have a complete UI experience in the portal yet. While you can create the DCR using the portal, you must modify it to define the columns in the incoming stream. See the **Resource Manager template** tab for details on creating the required DCR.
 
-The following table describes optional columns that you can include in addition to the columns defining the data in your log file.
+### Incoming stream
+JSON files include a property name with each value, and the incoming stream in the DCR needs to include a column matching the name of each property. If you create the DCR using the Azure portal, the columns in the following table will be included in the incoming stream, and you must manually modify the DCR or create it using another method where you can explicitly define the incoming stream.
 
 | Column | Type | Description |
 |:---|:---|:---|
-| `TimeGenerated` | datetime | The time the record was generated. This value will be automatically populated with the time the record is added to the Log Analytics workspace if it's not included in the incoming stream. |
+| `TimeGenerated` | datetime | The time the record was generated. |
+| `RawData` | string | This column will be empty for a JSON log. |
 | `FilePath` | string | If you add this column to the incoming stream in the DCR, it will be populated with the path to the log file. This column is not created automatically and can't be added using the portal. You must manually modify the DCR created by the portal or create the DCR using another method where you can explicitly define the incoming stream. |
 
-### Transformation
-The [transformation](../essentials/data-collection-transformations.md) potentially modifies the incoming stream to filter records or to modify the schema to match the target table. If the schema of the incoming stream is the same as the target table, then you can use the default transformation of `source`. If not, then modify the `transformKql` section of tee ARM template with a KQL query that returns the required schema.
+### [Portal](#tab/portal)
 
-### ARM template
-
-Use the following ARM template to create a DCR for collecting text log files, making the changes described in the previous sections. The following table describes the parameters that require values when you deploy the template.
+Create a data collection rule, as described in [Collect data with Azure Monitor Agent](./azure-monitor-agent-data-collection.md). In the **Collect and deliver** step, select **JSON Logs** from the **Data source type** dropdown.
 
 | Setting | Description |
 |:---|:---|
-| Data collection rule name | Unique name for the DCR. |
-| Location | Region for the DCR. Must be the same location as the Log Analytics workspace. |
-| File patterns | Identifies the location and name of log files on the local disk. Use a wildcard for filenames that vary, for example when a new file is created each day with a new name. You can enter multiple file patterns separated by commas (AMA version 1.26 or higher required for multiple file patterns on Linux).<br><br>Examples:<br>- C:\Logs\MyLog.json<br>- C:\Logs\MyLog*.json<br>- C:\App01\AppLog.json, C:\App02\AppLog.json<br>- /var/mylog.json<br>- /var/mylog*.json |
+| File pattern | Identifies the location and name of log files on the local disk. Use a wildcard for filenames that vary, for example when a new file is created each day with a new name. You can enter multiple file patterns separated by commas.<br><br>Examples:<br>- C:\Logs\MyLog.json<br>- C:\Logs\MyLog*.json<br>- C:\App01\AppLog.json, C:\App02\AppLog.json<br>- /var/mylog.json<br>- /var/mylog*.json |
 | Table name | Name of the destination table in your Log Analytics Workspace. |
-| Workspace resource ID | Resource ID of the Log Analytics workspace with the target table. |
+| Record delimiter | Not currently used but reserved for future potential use allowing delimiters other than the currently supported end of line (`\r\n`). |
+| Transform | [Ingestion-time transformation](../essentials/data-collection-transformations.md) to filter records or to format the incoming data for the destination table. Use `source` to leave the incoming data unchanged. |
+
+
+
+### [Resource Manager template](#tab/arm)
+
+Use the following ARM template to create a DCR for collecting text log files. In addition to the parameter values, you may need to modify the following values in the template:
+
+- `columns`: Modify with the list of columns in the JSON log that you're collecting.
+- `transformKql`: Modify the default transformation if the schema of the incoming stream doesn't match the schema of the target table. The output schema of the transformation must match the schema of the target table.
 
 > [!IMPORTANT]
-> When you create the DCR using an ARM template, you still must associate the DCR with the agents that will use it. You can edit the DCR in the Azure portal and select the agents as described in [Add resources](./azure-monitor-agent-data-collection.md#add-resources)
+> If you create the DCR using an ARM template, you still must associate the DCR with the agents that will use it. You can edit the DCR in the Azure portal and select the agents as described in [Add resources](./azure-monitor-agent-data-collection.md#add-resources)
+
 
 ```json
 {
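To make the `columns` and `transformKql` guidance in this hunk concrete: if each line of the collected file were a JSON record such as `{"Time": "2024-10-01T12:00:00Z", "Code": 1234, "Message": "Something happened"}`, the corresponding fragments of the template might look like the sketch below. The property names are hypothetical, not taken from the article; ARM templates tolerate `//` comments.

```json
"streamDeclarations": {
  "Custom-Json-stream": {
    // One column per property in the JSON log (hypothetical names).
    "columns": [
      { "name": "Time", "type": "string" },
      { "name": "Code", "type": "int" },
      { "name": "Message", "type": "string" }
    ]
  }
},
// ...later, in the dataFlows section of the same resource...
"dataFlows": [
  {
    "streams": [ "Custom-Json-stream" ],
    "destinations": [ "workspace" ],
    // The incoming schema differs from the table, so the transformation
    // maps Time onto the table's TimeGenerated column.
    "transformKql": "source | extend TimeGenerated = todatetime(Time) | project-away Time",
    "outputStream": "[variables('tableOutputStream')]"
  }
]
```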
@@ -168,21 +173,21 @@ Use the following ARM template to create a DCR for collecting text log files, ma
       }
     },
     "dataCollectionEndpointResourceId": {
-        "type": "string",
-        "metadata": {
+      "type": "string",
+      "metadata": {
        "description": "Resource ID of the Data Collection Endpoint to be used with this rule."
-        }
-      }
+      }
+    }
   },
   "variables": {
-      "tableOutputStream": "[concat('Custom-', parameters('tableName'))]"
+    "tableOutputStream": "[concat('Custom-', parameters('tableName'))]"
   },
   "resources": [
     {
       "type": "Microsoft.Insights/dataCollectionRules",
-      "apiVersion": "2022-06-01",
       "name": "[parameters('dataCollectionRuleName')]",
-      "location": "[parameters('location')]",
+      "location": "[parameters('location')]",
+      "apiVersion": "2022-06-01",
       "properties": {
         "streamDeclarations": {
           "Custom-Json-stream": {
@@ -218,7 +223,7 @@ Use the following ARM template to create a DCR for collecting text log files, ma
         "logFiles": [
           {
             "streams": [
-              "Custom-JSONLog-stream"
+              "Custom-Json-stream"
             ],
             "filePatterns": [
               "[parameters('filePatterns')]"
@@ -231,7 +236,7 @@ Use the following ARM template to create a DCR for collecting text log files, ma
       "destinations": {
         "logAnalytics": [
           {
-             "workspaceResourceId": "[parameters('workspaceResourceId')]",
+            "workspaceResourceId": "[parameters('workspaceResourceId')]",
             "name": "workspace"
           }
         ]
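Once the DCR is associated with agents and records reach the workspace destination above, ingestion can be spot-checked from PowerShell; the workspace GUID and table name below are placeholders.

```powershell
# Query the destination table to confirm records are arriving.
# Requires the Az.OperationalInsights module; WorkspaceId is the workspace's customer ID (GUID).
$results = Invoke-AzOperationalInsightsQuery `
    -WorkspaceId "{workspace-guid}" `
    -Query "MyJsonTable_CL | take 10"
$results.Results
```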
