## Overview
This procedure uses the [Azure Monitor Logs connector](/connectors/azuremonitorlogs/), which lets you run a log query from a Logic App and use its output in other actions in the workflow, and the [Azure Blob Storage connector](/connectors/azureblob/) to send the query output to Azure storage.
[](media/logs-export-logic-app/logic-app-overview.png#lightbox)

When you export data from a Log Analytics workspace, filter and aggregate your log data and optimize your query to limit the amount of data processed by your Logic App workflow to the data you actually need. For example, if you need to archive sign-in events, filter for the required events and project only the required fields.

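A minimal sketch of such a query, assuming sign-in events are collected in the **SecurityEvent** table (the table, event ID, and projected columns here are illustrative):

```Kusto
// Illustrative only: keep successful sign-in events and project just the required fields
SecurityEvent
| where EventID == 4624
| project TimeGenerated, Account, Computer, Activity
```
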
1. **Create Logic App**

    1. Go to **Logic Apps** in the Azure portal and click **Add**. Select a **Subscription**, **Resource group**, and **Region** to store the new Logic App, and then give it a unique name. You can turn on the **Log Analytics** setting to collect information about runtime data and events as described in [Set up Azure Monitor logs and collect diagnostics data for Azure Logic Apps](../../logic-apps/monitor-logic-apps-log-analytics.md). This setting isn't required for using the Azure Monitor Logs connector.
       [](media/logs-export-logic-app/create-logic-app.png#lightbox)
    2. Click **Review + create** and then **Create**. When the deployment is complete, click **Go to resource** to open the **Logic Apps Designer**.

2. **Create a trigger for the Logic App**

    1. Under **Start with a common trigger**, select **Recurrence**. This creates a Logic App that automatically runs at a regular interval. In the **Frequency** box of the action, select **Day** and in the **Interval** box, enter **1** to run the workflow once per day.
       [](media/logs-export-logic-app/recurrence-action.png#lightbox)

3. **Add Azure Monitor Logs action**

    The Azure Monitor Logs action lets you specify the query to run. The log query used in this example is optimized for hourly recurrence and collects the data ingested during the hour before the execution time. For example, if the workflow runs at 4:35, the time range is 3:00 to 4:00. If you change the Logic App to run at a different frequency, you need to change the query as well. For example, if you set the recurrence to run daily, you would set `startTime` in the query to `startofday(make_datetime(year,month,day,0,0))`.

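    As a minimal sketch, the time-window portion of the query for a daily recurrence might look like the following; the `endTime` line here is an assumption for illustration, since only the `startTime` change is called out above.

    ```Kusto
    // Illustrative daily window (endTime is assumed): start of the current day, spanning one day
    let dt = now();
    let year = datetime_part('year', dt);
    let month = datetime_part('month', dt);
    let day = datetime_part('day', dt);
    let startTime = startofday(make_datetime(year, month, day, 0, 0));
    let endTime = startTime + 1d - 1tick;
    ```
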
    You will be prompted to select a tenant to grant access to the Log Analytics workspace with the account that the workflow will use to run the query.

    1. Click **+ New step** to add an action that runs after the recurrence action. Under **Choose an action**, type **azure monitor** and then select **Azure Monitor Logs**.
       [](media/logs-export-logic-app/select-azure-monitor-connector.png#lightbox)
    2. Click **Azure Log Analytics – Run query and list results**.
       [](media/logs-export-logic-app/select-query-action-list.png#lightbox)
    3. Select the **Subscription** and **Resource Group** for your Log Analytics workspace. Select *Log Analytics Workspace* for the **Resource Type** and then select the workspace's name under **Resource Name**.
    4. Add the following log query to the **Query** window.

       ```Kusto
       let dt = now();
       ...
       ResourceId = _ResourceId
       ```

    5. The **Time Range** specifies the records that will be included in the query based on the **TimeGenerated** column. It should be set to a value greater than the time range selected in the query. Because this query doesn't use the **TimeGenerated** column, the **Set in query** option isn't available. See [Query scope](./scope.md) for more details about the time range. Select **Last 4 hours** for the **Time Range**. This ensures that any records with an ingestion time later than **TimeGenerated** are included in the results.
       [](media/logs-export-logic-app/run-query-list-action.png#lightbox)

4. **Add Parse JSON activity (optional)**

    The output from the **Run query and list results** action is formatted in JSON. You can parse this data and manipulate it as part of the preparation for the **Compose** action.

    You can provide a JSON schema that describes the payload you expect to receive. The designer parses JSON content by using this schema and generates user-friendly tokens that represent the properties in your JSON content. You can then easily reference and use those properties throughout your Logic App's workflow.

    You can use a sample output from the **Run query and list results** step. Click **Run Trigger** in the Logic App ribbon, then **Run**, and download and save an output record. For the sample query in the previous step, you can use the following sample output:

    ```json
    {
        "TimeGenerated": "2020-09-29T23:11:02.578Z",
        "BlobTime": "2020-09-29T23:00:00Z",
        "OperationName": "Returns Storage Account SAS Token",
        ...
    }
    ```

    1. Click **+ New step**, and then click **+ Add an action**. Under **Choose an action**, type **json** and then select **Parse JSON**.
       [](media/logs-export-logic-app/select-parse-json.png#lightbox)
    2. Click in the **Content** box to display a list of values from previous activities. Select **Body** from the **Run query and list results** action. This is the output from the log query.
       [](media/logs-export-logic-app/select-body.png#lightbox)
    3. Copy the sample record saved earlier, click **Use sample payload to generate schema**, and paste it in. A sketch of the schema this generates is shown after these steps.
       [](media/logs-export-logic-app/parse-json-payload.png#lightbox)

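    For reference, here is a minimal sketch of the kind of schema the designer generates from a single record like the sample above; a schema generated from your own record will include all of its fields.

    ```json
    {
        "type": "object",
        "properties": {
            "TimeGenerated": { "type": "string" },
            "BlobTime": { "type": "string" },
            "OperationName": { "type": "string" }
        }
    }
    ```
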
5. **Add the Compose action**

    The **Compose** action takes the parsed JSON output and creates the object that you need to store in the blob.

    1. Click **+ New step**, and then click **+ Add an action**. Under **Choose an action**, type **compose** and then select the **Compose** action.
       [](media/logs-export-logic-app/select-compose.png#lightbox)
    2. Click the **Inputs** box to display a list of values from previous activities. Select **Body** from the **Parse JSON** action. This is the parsed output from the log query.
       [](media/logs-export-logic-app/select-body-compose.png#lightbox)

6. **Add the Create Blob action**

    The Create Blob action writes the composed JSON to storage.

    1. Click **+ New step**, and then click **+ Add an action**. Under **Choose an action**, type **blob** and then select the **Create Blob** action.
       [](media/logs-export-logic-app/select-create-blob.png#lightbox)
    2. Type a name for the connection to your Storage Account in **Connection Name** and then click the folder icon in the **Folder path** box to select the container in your Storage Account. Click the **Blob name** box to see a list of values from previous activities. Click **Expression** and enter an expression that matches your time interval; for this query, which runs hourly, the expression should set the blob name to the previous hour.
    3. Click the **Blob content** box to display a list of values from previous activities and then select **Outputs** in the **Compose** section.
       [](media/logs-export-logic-app/create-blob.png#lightbox)

7. **Test the Logic App**

    Test the workflow by clicking **Run**. If the workflow has errors, they're indicated on the step with the problem. You can view the executions and drill into each step to view its input and output to investigate failures. See [Troubleshoot and diagnose workflow failures in Azure Logic Apps](../../logic-apps/logic-apps-diagnosing-failures.md) if necessary.
       [](media/logs-export-logic-app/runs-history.png#lightbox)

8. **View logs in Storage**

    Go to the **Storage accounts** menu in the Azure portal and select your Storage Account. Click the **Blobs** tile and select the container you specified in the **Create Blob** action. Select one of the blobs and then **Edit blob**.
       [](media/logs-export-logic-app/blob-data.png#lightbox)