articles/azure-monitor/agents/data-collection-log-json.md
## Custom table
Before you can collect log data from a JSON file, you must create a custom table in your Log Analytics workspace to receive the data. The table schema must match the columns in the incoming stream, or you must add a transformation to ensure that the output schema matches the table.
> [!WARNING]
> You shouldn't use an existing custom table used by the Log Analytics agent. The legacy agents won't be able to write to the table once the first Azure Monitor agent writes to it. Create a new table for Azure Monitor agent to use to prevent Log Analytics agent data loss.
For example, you can use the following PowerShell script to create a custom table with multiple columns.
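As a rough sketch, the schema payload that such a script sends to the Log Analytics Tables API might look like the following, assuming a hypothetical table named `MyJsonLog_CL` whose log records carry `Application` and `Message` values (`TimeGenerated` is required for every custom table):

```json
{
    "properties": {
        "schema": {
            "name": "MyJsonLog_CL",
            "columns": [
                { "name": "TimeGenerated", "type": "datetime" },
                { "name": "Application", "type": "string" },
                { "name": "Message", "type": "string" }
            ]
        }
    }
}
```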
> The agent-based JSON custom file ingestion is currently in preview and does not have a complete UI experience in the portal yet. While you can create the DCR using the portal, you must modify it to define the columns in the incoming stream. This section includes details on creating the DCR using an ARM template.
### Incoming stream schema
JSON files include a property name with each value, and the incoming stream in the DCR needs to include a column matching the name of each property. You need to modify the `columns` section of the ARM template with the columns from your log.
The following table describes optional columns that you can include in addition to the columns defining the data in your log file.
| Column | Type | Description |
|:---|:---|:---|
|`TimeGenerated`| datetime | The time the record was generated. This value will be automatically populated with the time the record is added to the Log Analytics workspace if it's not included in the incoming stream. |
|`FilePath`| string | If you add this column to the incoming stream in the DCR, it will be populated with the path to the log file. This column is not created automatically and can't be added using the portal. You must manually modify the DCR created by the portal or create the DCR using another method where you can explicitly define the incoming stream. |
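For instance, for a log whose JSON records contain `Application` and `Message` properties, the stream declaration in the DCR might look like the following sketch. The stream name `Custom-Json-MyLog` and the column names are hypothetical, and `TimeGenerated` and `FilePath` are the optional columns from the preceding table:

```json
"streamDeclarations": {
    "Custom-Json-MyLog": {
        "columns": [
            { "name": "TimeGenerated", "type": "datetime" },
            { "name": "FilePath", "type": "string" },
            { "name": "Application", "type": "string" },
            { "name": "Message", "type": "string" }
        ]
    }
}
```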
### Transformation
The [transformation](../essentials/data-collection-transformations.md) potentially modifies the incoming stream to filter records or to modify the schema to match the target table. If the schema of the incoming stream is the same as the target table, then you can use the default transformation of `source`. If not, modify the `transformKql` section of the ARM template with a KQL query that returns the required schema.
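For example, if the target table names a column `AppName` while the incoming stream declares `Application` (both hypothetical names), the transformation entry in the DCR might look like this:

```json
"transformKql": "source | extend AppName = Application | project-away Application"
```

A `where` clause in the same query can also drop records that you don't want to ingest.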
### ARM template
Use the following ARM template to create a DCR for collecting JSON log files, making the changes described in the previous sections. The following table describes the parameters that require values when you deploy the template.
| Setting | Description |
|:---|:---|
| Data collection rule name | Unique name for the DCR. |
| Location | Region for the DCR. Must be the same location as the Log Analytics workspace. |
| File patterns | Identifies the location and name of log files on the local disk. Use a wildcard for filenames that vary, for example when a new file is created each day with a new name. You can enter multiple file patterns separated by commas (AMA version 1.26 or higher required for multiple file patterns on Linux).<br><br>Examples:<br>- C:\Logs\MyLog.json<br>- C:\Logs\MyLog*.json<br>- C:\App01\AppLog.json, C:\App02\AppLog.json<br>- /var/mylog.json<br>- /var/mylog*.json |
| Table name | Name of the destination table in your Log Analytics workspace. |
| Workspace resource ID | Resource ID of the Log Analytics workspace with the target table. |
> [!IMPORTANT]
> When you create the DCR using an ARM template, you still must associate the DCR with the agents that will use it. You can edit the DCR in the Azure portal and select the agents as described in [Add resources](./azure-monitor-agent-data-collection.md#add-resources).
```json
{
    "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
    "contentVersion": "1.0.0.0",
    "parameters": {
        "dataCollectionRuleName": {
            "type": "string",
            "metadata": {
                "description": "Unique name for the DCR."
            }
        },
        "location": {
            "type": "string",
            "metadata": {
                "description": "Region for the DCR. Must be the same location as the Log Analytics workspace."
            }
        },
        "filePatterns": {
            "type": "string",
            "metadata": {
                "description": "Path on the local disk for the log file to collect. May include wildcards. Enter multiple file patterns separated by commas (AMA version 1.26 or higher required for multiple file patterns on Linux)."
            }
        },
        "tableName": {
            "type": "string",
            "metadata": {
                "description": "Name of destination table in your Log Analytics workspace."
            }
        },
        "workspaceResourceId": {
            "type": "string",
            "metadata": {
                "description": "Resource ID of the Log Analytics workspace with the target table."
            }
        },
        "dataCollectionEndpointResourceId": {
            "type": "string",
            "metadata": {
                "description": "Resource ID of the Data Collection Endpoint to be used with this rule."
0 commit comments