articles/azure-monitor/logs/tutorial-logs-ingestion-portal.md (18 additions, 15 deletions)
@@ -11,7 +11,7 @@ ms.service: azure-monitor
 ---
 
 # Tutorial: Send data to Azure Monitor Logs with Logs ingestion API (Azure portal)
-The [Logs Ingestion API](logs-ingestion-api-overview.md) in Azure Monitor allows you to send external data to a Log Analytics workspace with a REST API. This tutorial uses the Azure portal to walk through configuration of a new table and a sample application to send log data to Azure Monitor. The sample application collects entries from a text file and
+The [Logs Ingestion API](logs-ingestion-api-overview.md) in Azure Monitor allows you to send external data to a Log Analytics workspace with a REST API. This tutorial uses the Azure portal to walk through configuration of a new table and a sample application to send log data to Azure Monitor. The sample application collects entries from a text file and either converts the plain log to JSON format generating a resulting .json file, or sends the content to the data collection endpoint.
 
 > [!NOTE]
 > This tutorial uses the Azure portal to configure the components to support the Logs ingestion API. See [Tutorial: Send data to Azure Monitor using Logs ingestion API (Resource Manager templates)](tutorial-logs-ingestion-api.md) for a similar tutorial that uses Azure Resource Manager templates to configure these components and that has sample code for client libraries for [.NET](/dotnet/api/overview/azure/Monitor.Ingestion-readme), [Java](/java/api/overview/azure/monitor-ingestion-readme), [JavaScript](/javascript/api/overview/azure/monitor-ingestion-readme), and [Python](/python/api/overview/azure/monitor-ingestion-readme).
@@ -95,7 +95,9 @@ Before you can send data to the workspace, you need to create the custom table w
 :::image type="content" source="media/tutorial-logs-ingestion-portal/custom-log-table-name.png" lightbox="media/tutorial-logs-ingestion-portal/custom-log-table-name.png" alt-text="Screenshot that shows the custom log table name.":::
 
 ## Parse and filter sample data
-Instead of directly configuring the schema of the table, you can upload a file with a sample JSON array of data through the portal, and Azure Monitor will set the schema automatically. The sample JSON file must contain one or more log records structured as an array, in the same way they data is sent in the body of an HTTP request of the logs ingestion API call.
+Instead of directly configuring the schema of the table, you can upload a file with a sample JSON array of data through the portal, and Azure Monitor will set the schema automatically. The sample JSON file must contain one or more log records structured as an array, in the same way the data is sent in the body of an HTTP request of the logs ingestion API call.
+
+1. Follow the instructions in [generate sample data](#generate-sample-data) to create the *data_sample.json* file.
 
 1. Select **Browse for files** and locate the *data_sample.json* file that you previously created.
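
The sample records this tutorial works with carry two properties, `Time` and `RawData`, which the transformation in the later hunks references. As an editor-added sketch (not part of the changed file; the log line value is invented), the following KQL models one such record as a table row, the same shape each JSON object in *data_sample.json* takes when it's uploaded or sent in the request body:

```kusto
// Sketch only: one hypothetical record with the Time and RawData properties
// referenced by this tutorial. Each JSON object in data_sample.json becomes one row.
datatable(Time: string, RawData: string)
[
    "2024-05-01T13:55:36Z",
    '10.1.1.1 - - [01/May/2024:13:55:36 +0000] "GET /index.html HTTP/1.1" 200 1027'
]
| extend TimeGenerated = todatetime(Time)
```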
@@ -128,10 +130,10 @@ Instead of directly configuring the schema of the table, you can upload a file w
 ' ' *
 ' ' *
 ' [' * '] "' RequestType:string
-" " Resource:string
-" " *
+' ' Resource:string
+' ' *
 '" ' ResponseCode:int
-" " *
+' ' *
 ```
 
 1. Select **Run** to view the results. This action extracts the contents of `RawData` into the separate columns `ClientIP`, `RequestType`, `Resource`, and `ResponseCode`.
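
A quick way to check the corrected delimiters (single-quoted `' '` literals instead of `" "`) is to run the pattern over a single test line. This is an editor-added sketch; the Apache-style log line is an assumption, not taken from the tutorial:

```kusto
// Sketch only: exercise the corrected parse pattern on one invented log line.
print RawData = '10.1.1.1 - - [01/May/2024:13:55:36 +0000] "GET /missing.html HTTP/1.1" 404 153'
| parse RawData with ClientIP:string
    ' ' *
    ' ' *
    ' [' * '] "' RequestType:string
    ' ' Resource:string
    ' ' *
    '" ' ResponseCode:int
    ' ' *
| project ClientIP, RequestType, Resource, ResponseCode
// Expected result: 10.1.1.1, GET, /missing.html, 404
```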
@@ -143,16 +145,17 @@ Instead of directly configuring the schema of the table, you can upload a file w
 ```kusto
 source
 | extend TimeGenerated = todatetime(Time)
-| parse kind = regex RawData with *
-':"'
+| parse RawData with
 ClientIP:string
-" - -" * '"'
-RequestType:string
-' '
-Resource:string
-" " *
+' ' *
+' ' *
+' [' * '] "' RequestType:string
+' ' Resource:string
+' ' *
 '" ' ResponseCode:int
-" " *
+' ' *
+| project-away Time, RawData
+| where ResponseCode != 200
 ```
 
 1. Select **Run** to view the results.
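
The two appended lines drop the raw columns and keep only failed requests. As an editor-added sketch (assumed data; the DCR's `source` is stood in for by an inline `datatable`), the full transformation can be tried end to end; only the non-200 row survives:

```kusto
// Sketch only: run the full transformation over two invented records.
datatable(Time: string, RawData: string)
[
    "2024-05-01T13:55:36Z", '10.1.1.1 - - [01/May/2024:13:55:36 +0000] "GET /index.html HTTP/1.1" 200 1027',
    "2024-05-01T13:55:37Z", '10.1.1.2 - - [01/May/2024:13:55:37 +0000] "GET /missing.html HTTP/1.1" 404 153'
]
| extend TimeGenerated = todatetime(Time)
| parse RawData with ClientIP:string
    ' ' *
    ' ' *
    ' [' * '] "' RequestType:string
    ' ' Resource:string
    ' ' *
    '" ' ResponseCode:int
    ' ' *
| project-away Time, RawData
| where ResponseCode != 200
// Only the 404 record remains: TimeGenerated, ClientIP, RequestType, Resource, ResponseCode.
```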
@@ -255,7 +258,7 @@ The following PowerShell script generates sample data to configure the custom ta