> The agent-based JSON custom file ingestion is currently in preview and doesn't yet have a complete UI experience in the portal. While you can create the DCR using the portal, you must modify it to define the columns in the incoming stream. This section includes details on creating the DCR using an ARM template.
### Incoming stream schema
JSON files include a property name with each value, and the incoming stream in the DCR needs to include a column matching the name of each property. You need to modify the `columns` section of the ARM template with the columns from your log.
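For example, if each entry in your JSON log had the properties `Time`, `Code`, and `Message` (illustrative names, not from any particular log), the stream declaration in the DCR would include a matching column for each. A sketch of what that `columns` section could look like (the stream name `Custom-Json-stream` is also hypothetical):

```json
{
  "streamDeclarations": {
    "Custom-Json-stream": {
      "columns": [
        { "name": "Time", "type": "datetime" },
        { "name": "Code", "type": "int" },
        { "name": "Message", "type": "string" }
      ]
    }
  }
}
```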
The following table describes optional columns that you can include in addition to the columns defining the data in your log file.
| Column | Type | Description |
|:---|:---|:---|
|`TimeGenerated`| datetime | The time the record was generated. This value will be automatically populated with the time the record is added to the Log Analytics workspace if it's not included in the incoming stream. |
|`FilePath`| string | If you add this column to the incoming stream in the DCR, it will be populated with the path to the log file. This column is not created automatically and can't be added using the portal. You must manually modify the DCR created by the portal or create the DCR using another method where you can explicitly define the incoming stream. |
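To collect either of these optional values, add the corresponding column to the same `columns` array as your data columns. A sketch (the `Message` column is an illustrative placeholder for your own log's columns):

```json
"columns": [
  { "name": "TimeGenerated", "type": "datetime" },
  { "name": "FilePath", "type": "string" },
  { "name": "Message", "type": "string" }
]
```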
### Transformation
The [transformation](../essentials/data-collection-transformations.md) potentially modifies the incoming stream to filter records or to modify the schema to match the target table. If the schema of the incoming stream is the same as the target table, then you can use the default transformation of `source`. If not, then modify the `transformKql` section of the ARM template with a KQL query that returns the required schema.
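As a hedged sketch: suppose the incoming stream has a `Time` property but the target table expects `TimeGenerated` (column names here are illustrative). A `transformKql` value like the following could rename the column and drop the original:

```kusto
source
| extend TimeGenerated = todatetime(Time)
| project-away Time
```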
### ARM template
Use the following ARM template to create a DCR for collecting JSON log files, making the changes described in the previous sections. The following table describes the parameters that require values when you deploy the template.
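When deploying with Azure CLI or PowerShell, one way to supply those parameter values is a deployment parameters file. A minimal sketch, assuming the template declares parameters named `dataCollectionRuleName` and `location` (match the names to your template's own `parameters` section; the values shown are illustrative):

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentParameters.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "dataCollectionRuleName": { "value": "my-json-dcr" },
    "location": { "value": "eastus" }
  }
}
```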