## Custom table
Before you can collect log data from a JSON file, you must create a custom table in your Log Analytics workspace to receive the data. The table schema must match the columns in the incoming stream, or you must add a transformation to ensure that the output schema matches the table.

>[!WARNING]
> You shouldn't use an existing custom table used by Log Analytics agents (MMA). MMA agents won't be able to write to the table once the first Azure Monitor Agent (AMA) writes to it. Create a new table for AMA to use to prevent MMA data loss.

For example, you can use the following PowerShell script to create a custom table with multiple columns.
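
The script submits a JSON payload to the workspace Tables API that defines the table schema. A minimal sketch of such a payload, assuming a hypothetical table named `MyJsonTable_CL` and placeholder columns:

```json
{
  "properties": {
    "schema": {
      "name": "MyJsonTable_CL",
      "columns": [
        { "name": "TimeGenerated", "type": "datetime" },
        { "name": "FilePath", "type": "string" },
        { "name": "MyStringProperty", "type": "string" },
        { "name": "MyNumberProperty", "type": "real" }
      ]
    }
  }
}
```

Custom table names must end in the `_CL` suffix.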

> [!NOTE]
> The agent-based JSON custom file ingestion is currently in preview and does not have a complete UI experience in the portal yet. While you can create the DCR using the portal, you must modify it to define the columns in the incoming stream. See the **Resource Manager template** tab for details on creating the required DCR.

### Incoming stream

JSON files include a property name with each value, and the incoming stream in the DCR needs to include a column matching the name of each property. If you create the DCR using the Azure portal, only the columns in the following table are included in the incoming stream; to add columns matching your log's properties, you must manually modify the DCR or create it using another method where you can explicitly define the incoming stream.
| Column | Type | Description |
|:---|:---|:---|
|`TimeGenerated`| datetime | The time the record was generated. |
|`RawData`| string | This column will be empty for a JSON log. |
|`FilePath`| string | If you add this column to the incoming stream in the DCR, it will be populated with the path to the log file. This column is not created automatically and can't be added using the portal. You must manually modify the DCR created by the portal or create the DCR using another method where you can explicitly define the incoming stream. |
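
For example, if each entry in your log file resembles this hypothetical record:

```json
{ "Time": "2025-03-07T15:32:40Z", "Code": 1302, "Message": "Unable to connect to server." }
```

then the incoming stream needs columns named `Time`, `Code`, and `Message` (typed `datetime`, `int`, and `string`, for example), in addition to any of the optional columns described above.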
### [Portal](#tab/portal)
Create a data collection rule, as described in [Collect data with Azure Monitor Agent](./azure-monitor-agent-data-collection.md). In the **Collect and deliver** step, select **JSON Logs** from the **Data source type** dropdown.
| Setting | Description |
|:---|:---|
| File pattern | Identifies the location and name of log files on the local disk. Use a wildcard for filenames that vary, for example when a new file is created each day with a new name. You can enter multiple file patterns separated by commas.<br><br>Examples:<br>- C:\Logs\MyLog.json<br>- C:\Logs\MyLog*.json<br>- C:\App01\AppLog.json, C:\App02\AppLog.json<br>- /var/mylog.json<br>- /var/mylog*.json |
| Table name | Name of the destination table in your Log Analytics Workspace. |
| Record delimiter | Not currently used, but reserved for potential future use allowing delimiters other than the currently supported end of line (`\r\n`). |
| Transform | [Ingestion-time transformation](../essentials/data-collection-transformations.md) to filter records or to format the incoming data for the destination table. Use `source` to leave the incoming data unchanged. |
### [Resource Manager template](#tab/arm)
Use the following ARM template to create a DCR for collecting JSON log files. In addition to the parameter values, you may need to modify the following values in the template:
- `columns`: Modify with the list of columns in the JSON log that you're collecting.
- `transformKql`: Modify the default transformation if the schema of the incoming stream doesn't match the schema of the target table (see the sketch after this list). The output schema of the transformation must match the schema of the target table.
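
For example, a sketch of a modified transformation, assuming a hypothetical `Time` property in the log that should populate `TimeGenerated`:

```json
"transformKql": "source | extend TimeGenerated = todatetime(Time) | project-away Time"
```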
> [!IMPORTANT]
> If you create the DCR using an ARM template, you still must associate the DCR with the agents that will use it. You can edit the DCR in the Azure portal and select the agents as described in [Add resources](./azure-monitor-agent-data-collection.md#add-resources).
```json
{
      ...
      "dataCollectionEndpointResourceId": {
        "type": "string",
        "metadata": {
          "description": "Resource ID of the Data Collection Endpoint to be used with this rule."
        }
      }
      ...
```