Commit a13f502

Merge pull request #227395 from ecfan/diagnostics
Azure Logic Apps: [Standard] Diagnostic settings (preview)
2 parents: 3633a76 + c586461

File tree: 62 files changed (+458, −352 lines)


.openpublishing.redirection.json

Lines changed: 7 additions & 1 deletion
@@ -13241,7 +13241,7 @@
   },
   {
     "source_path_from_root": "/articles/logic-apps/logic-apps-monitor-your-logic-apps-oms.md",
-    "redirect_url": "/azure/logic-apps/monitor-logic-apps-log-analytics",
+    "redirect_url": "/azure/logic-apps/monitor-workflows-collect-diagnostic-data",
     "redirect_document_id": false
   },
   {
@@ -13339,6 +13339,12 @@
     "redirect_url": "/connectors/custom-connectors/submit-certification",
     "redirect_document_id": false
   },
+  {
+    "source_path_from_root": "/articles/logic-apps/monitor-logic-apps-log-analytics.md",
+    "redirect_url": "/azure/logic-apps/monitor-workflows-collect-diagnostic-data",
+    "redirect_document_id": true
+  },
   {
     "source_path_from_root": "/articles/connectors/connectors-create-api-sharepointonline.md",
     "redirect_url": "/azure/connectors/connectors-create-api-sharepoint",
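Redirection entries like the ones above follow a simple, checkable shape: the source path keeps its `.md` extension, the redirect URL drops it, and no source path should be mapped twice. As an illustration (not part of the commit), a small script could sanity-check entries before merge:

```python
import json

# Hypothetical helper, not part of the commit: sanity-check
# .openpublishing.redirection.json entries like the ones above.
def check_redirects(entries):
    problems = []
    seen = set()
    for entry in entries:
        src = entry["source_path_from_root"]
        url = entry["redirect_url"]
        if src in seen:
            problems.append(f"duplicate source_path_from_root: {src}")
        seen.add(src)
        if url.endswith(".md"):
            problems.append(f"redirect_url should not keep .md: {url}")
        if not src.endswith(".md"):
            problems.append(f"source path should be a .md file: {src}")
    return problems

entries = json.loads("""[
  {"source_path_from_root": "/articles/logic-apps/monitor-logic-apps-log-analytics.md",
   "redirect_url": "/azure/logic-apps/monitor-workflows-collect-diagnostic-data",
   "redirect_document_id": true}
]""")
print(check_redirects(entries))  # → []
```

A check like this catches the most common redirection mistake — pointing a redirect at a raw `.md` path instead of the published URL.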

articles/azure-functions/functions-compare-logic-apps-ms-flow-webjobs.md

Lines changed: 1 addition & 1 deletion
@@ -53,7 +53,7 @@ You can mix and match services when you build an orchestration, such as calling
 | **Development** | Code-first (imperative) | Designer-first (declarative) |
 | **Connectivity** | [About a dozen built-in binding types](functions-triggers-bindings.md#supported-bindings), write code for custom bindings | [Large collection of connectors](/connectors/connector-reference/connector-reference-logicapps-connectors), [Enterprise Integration Pack for B2B scenarios](../logic-apps/logic-apps-enterprise-integration-overview.md), [build custom connectors](/connectors/custom-connectors/) |
 | **Actions** | Each activity is an Azure function; write code for activity functions | [Large collection of ready-made actions](/connectors/connector-reference/connector-reference-logicapps-connectors) |
-| **Monitoring** | [Azure Application Insights](../azure-monitor/app/app-insights-overview.md) | [Azure portal](../logic-apps/quickstart-create-first-logic-app-workflow.md), [Azure Monitor logs](../logic-apps/monitor-logic-apps-log-analytics.md), [Microsoft Defender for Cloud](../logic-apps/healthy-unhealthy-resource.md) |
+| **Monitoring** | [Azure Application Insights](../azure-monitor/app/app-insights-overview.md) | [Azure portal](../logic-apps/quickstart-create-first-logic-app-workflow.md), [Azure Monitor Logs](../logic-apps/monitor-workflows-collect-diagnostic-data.md), [Microsoft Defender for Cloud](../logic-apps/healthy-unhealthy-resource.md) |
 | **Management** | [REST API](durable/durable-functions-http-api.md), [Visual Studio](/visualstudio/azure/vs-azure-tools-resources-managing-with-cloud-explorer) | [Azure portal](../logic-apps/quickstart-create-first-logic-app-workflow.md), [REST API](/rest/api/logic/), [PowerShell](/powershell/module/az.logicapp), [Visual Studio](../logic-apps/manage-logic-apps-with-visual-studio.md) |
 | **Execution context** | Can run [locally](./functions-kubernetes-keda.md) or in the cloud | Runs in Azure, locally, or on premises. For more information, see [What is Azure Logic Apps](../logic-apps/logic-apps-overview.md#resource-environment-differences). |

articles/azure-monitor/logs/create-pipeline-datacollector-api.md

Lines changed: 16 additions & 16 deletions
@@ -28,30 +28,30 @@ We are using a classic ETL-type logic to design our pipeline. The architecture w

 ![Data collection pipeline architecture](./media/create-pipeline-datacollector-api/data-pipeline-dataflow-architecture.png)

-This article will not cover how to create data or [upload it to an Azure Blob Storage account](../../storage/blobs/storage-upload-process-images.md). Rather, we pick the flow up as soon as a new file is uploaded to the blob. From here:
+This article will not cover how to create data or [upload it to an Azure Blob Storage account](../../storage/blobs/blob-upload-function-trigger.md). Rather, we pick the flow up as soon as a new file is uploaded to the blob. From here:

-1. A process will detect that new data has been uploaded. Our example uses an [Azure Logic App](../../logic-apps/logic-apps-overview.md), which has available a trigger to detect new data being uploaded to a blob.
+1. A process will detect that new data has been uploaded. Our example uses a [logic app workflow](../../logic-apps/logic-apps-overview.md), which provides a trigger that detects new data being uploaded to a blob.

-2. A processor reads this new data and converts it to JSON, the format required by Azure Monitor In this example, we use an [Azure Function](../../azure-functions/functions-overview.md) as a lightweight, cost-efficient way of executing our processing code. The function is kicked off by the same Logic App that we used to detect the new data.
+2. A processor reads this new data and converts it to JSON, the format required by Azure Monitor. In this example, we use an [Azure Function](../../azure-functions/functions-overview.md) as a lightweight, cost-efficient way of executing our processing code. The function is kicked off by the same logic app workflow that we used to detect the new data.

-3. Finally, once the JSON object is available, it is sent to Azure Monitor. The same Logic App sends the data to Azure Monitor using the built in Log Analytics Data Collector activity.
+3. Finally, once the JSON object is available, it is sent to Azure Monitor. The same logic app workflow sends the data to Azure Monitor using the built-in Log Analytics Data Collector activity.

-While the detailed setup of the blob storage, Logic App, or Azure Function is not outlined in this article, detailed instructions are available on the specific products' pages.
+While the detailed setup of the blob storage, logic app workflow, or Azure Function is not outlined in this article, detailed instructions are available on the specific products' pages.

-To monitor this pipeline, we use Application Insights to monitor our Azure Function [details here](../../azure-functions/functions-monitoring.md), and Azure Monitor to monitor our Logic App [details here](../../logic-apps/monitor-logic-apps-log-analytics.md).
+To monitor this pipeline, we use Application Insights to [monitor our Azure Function](../../azure-functions/functions-monitoring.md), and Azure Monitor to [monitor our logic app workflow](../../logic-apps/monitor-workflows-collect-diagnostic-data.md).

 ## Setting up the pipeline
 To set the pipeline, first make sure you have your blob container created and configured. Likewise, make sure that the Log Analytics workspace where you'd like to send the data to is created.

 ## Ingesting JSON data
-Ingesting JSON data is trivial with Logic Apps, and since no transformation needs to take place, we can encase the entire pipeline in a single Logic App. Once both the blob container and the Log Analytics workspace have been configured, create a new Logic App and configure it as follows:
+Ingesting JSON data is trivial with Azure Logic Apps, and since no transformation needs to take place, we can encase the entire pipeline in a single logic app workflow. Once both the blob container and the Log Analytics workspace have been configured, create a new logic app workflow and configure it as follows:

 ![Logic apps workflow example](./media/create-pipeline-datacollector-api/logic-apps-workflow-example-01.png)

-Save your Logic App and proceed to test it.
+Save your logic app workflow and proceed to test it.

 ## Ingesting XML, CSV, or other formats of data
-Logic Apps today does not have built-in capabilities to easily transform XML, CSV, or other types into JSON format. Therefore, we need to use another means to complete this transformation. For this article, we use the serverless compute capabilities of Azure Functions as a very lightweight and cost-friendly way of doing so.
+Azure Logic Apps today does not have built-in capabilities to easily transform XML, CSV, or other types into JSON format. Therefore, we need to use another means to complete this transformation. For this article, we use the serverless compute capabilities of Azure Functions as a very lightweight and cost-friendly way of doing so.

 In this example, we parse a CSV file, but any other file type can be similarly processed. Simply modify the deserializing portion of the Azure Function to reflect the correct logic for your specific data type.
@@ -93,7 +93,7 @@ In this example, we parse a CSV file, but any other file type can be similarly p

     public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log)
     {
-        string filePath = await req.Content.ReadAsStringAsync(); //get the CSV URI being passed from Logic App
+        string filePath = await req.Content.ReadAsStringAsync(); //get the CSV URI being passed from the logic app workflow
         string response = "";

         //get a stream from blob
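The C# fragment above shows only the function's entry point; the article's full deserializing code is elsewhere in the file. As a rough, hypothetical sketch of what that step does (the article's actual function is C#, and the field names here are invented), the CSV-to-JSON conversion looks like this in Python:

```python
import csv
import io
import json

# Illustrative only: turn CSV text into the JSON array-of-records
# shape that the Log Analytics Data Collector step expects.
def csv_to_json_records(csv_text):
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    return json.dumps(rows)

sample = "name,status\njob1,ok\njob2,failed\n"
print(csv_to_json_records(sample))
# → [{"name": "job1", "status": "ok"}, {"name": "job2", "status": "failed"}]
```

The same pattern extends to XML or other formats: swap the deserializer, keep the JSON serialization at the end.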
@@ -121,12 +121,12 @@ In this example, we parse a CSV file, but any other file type can be similarly p

 ![Function Apps test code](./media/create-pipeline-datacollector-api/functions-test-01.png)

-Now we need to go back and modify the Logic App we started building earlier to include the data ingested and converted to JSON format. Using View Designer, configure as follows and then save your Logic App:
+Now we need to go back and modify the logic app we started building earlier to include the data ingested and converted to JSON format. Using View Designer, configure as follows and then save your logic app:

-![Logic Apps workflow complete example](./media/create-pipeline-datacollector-api/logic-apps-workflow-example-02.png)
+![Azure Logic Apps workflow complete example](./media/create-pipeline-datacollector-api/logic-apps-workflow-example-02.png)

 ## Testing the pipeline
-Now you can upload a new file to the blob configured earlier and have it monitored by your Logic App. Soon, you should see a new instance of the Logic App kick off, call out to your Azure Function, and then successfully send the data to Azure Monitor.
+Now you can upload a new file to the blob configured earlier and have it monitored by your logic app workflow. Soon, you should see a new instance of the logic app workflow kick off, call out to your Azure Function, and then successfully send the data to Azure Monitor.

 >[!NOTE]
 >It can take up to 30 minutes for the data to appear in Azure Monitor the first time you send a new data type.
@@ -151,12 +151,12 @@ The output should show the two data sources now joined.
 ## Suggested improvements for a production pipeline
 This article presented a working prototype, the logic behind which can be applied towards a true production-quality solution. For such a production-quality solution, the following improvements are recommended:

-* Add error handling and retry logic in your Logic App and Function.
+* Add error handling and retry logic in your logic app workflow and function.
 * Add logic to ensure that the 30MB/single Log Analytics Ingestion API call limit is not exceeded. Split the data into smaller segments if needed.
 * Set up a clean-up policy on your blob storage. Once successfully sent to the Log Analytics workspace, unless you'd like to keep the raw data available for archival purposes, there is no reason to continue storing it.
 * Verify monitoring is enabled across the full pipeline, adding trace points and alerts as appropriate.
-* Leverage source control to manage the code for your function and Logic App.
-* Ensure that a proper change management policy is followed, such that if the schema changes, the function and Logic Apps are modified accordingly.
+* Leverage source control to manage the code for your function and logic app workflow.
+* Ensure that a proper change management policy is followed, such that if the schema changes, the function and logic app workflow are modified accordingly.
 * If you are uploading multiple different data types, segregate them into individual folders within your blob container, and create logic to fan the logic out based on the data type.
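For context on the Log Analytics Data Collector step and the 30 MB-per-call limit mentioned above: the classic HTTP Data Collector API authenticates each POST with an HMAC-SHA256 "SharedKey" authorization header computed over a fixed string-to-sign. A minimal sketch of that signing scheme (the workspace ID and key below are placeholders, not real credentials):

```python
import base64
import hashlib
import hmac

# Sketch of the signing scheme documented for the classic Log Analytics
# HTTP Data Collector API; credentials below are placeholders.
def data_collector_auth_header(workspace_id, shared_key_b64, body, rfc1123_date):
    string_to_sign = (
        f"POST\n{len(body)}\napplication/json\n"
        f"x-ms-date:{rfc1123_date}\n/api/logs"
    )
    digest = hmac.new(
        base64.b64decode(shared_key_b64),
        string_to_sign.encode("utf-8"),
        hashlib.sha256,
    ).digest()
    return f"SharedKey {workspace_id}:{base64.b64encode(digest).decode()}"

header = data_collector_auth_header(
    workspace_id="00000000-0000-0000-0000-000000000000",  # placeholder
    shared_key_b64=base64.b64encode(b"not-a-real-key").decode(),  # placeholder
    body=b'[{"name":"job1","status":"ok"}]',
    rfc1123_date="Mon, 01 Jan 2024 00:00:00 GMT",
)
print(header)
```

The Data Collector activity in the workflow computes this header for you; the sketch is only to show why the key and workspace ID must both be configured on the connection.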

articles/azure-monitor/logs/logicapp-flow-connector.md

Lines changed: 12 additions & 13 deletions
@@ -13,7 +13,7 @@ ms.date: 03/22/2022
 # Azure Monitor Logs connector for Logic Apps and Power Automate
 [Azure Logic Apps](../../logic-apps/index.yml) and [Power Automate](https://make.powerautomate.com) allow you to create automated workflows using hundreds of actions for various services. The Azure Monitor Logs connector allows you to build workflows that retrieve data from a Log Analytics workspace or an Application Insights application in Azure Monitor. This article describes the actions included with the connector and provides a walkthrough to build a workflow using this data.

-For example, you can create a logic app to use Azure Monitor log data in an email notification from Office 365, create a bug in Azure DevOps, or post a Slack message. You can trigger a workflow by a simple schedule or from some action in a connected service such as when a mail or a tweet is received.
+For example, you can create a logic app workflow to use Azure Monitor log data in an email notification from Office 365, create a bug in Azure DevOps, or post a Slack message. You can trigger a workflow by a simple schedule or from some action in a connected service such as when a mail or a tweet is received.

 ## Connector limits
 The Azure Monitor Logs connector has these limits:
@@ -44,25 +44,24 @@ The following tutorial illustrates the use of the Azure Monitor Logs connector i
 ### Create a Logic App

 1. Go to **Logic Apps** in the Azure portal and select **Add**.
-1. Select a **Subscription**, **Resource group**, and **Region** to store the new logic app and then give it a unique name. You can turn on the **Log Analytics** setting to collect information about runtime data and events as described in [Set up Azure Monitor logs and collect diagnostics data for Azure Logic Apps](../../logic-apps/monitor-logic-apps-log-analytics.md). This setting isn't required for using the Azure Monitor Logs connector.
-
-   ![Screenshot that shows the Basics tab on the Logic App creation screen.](media/logicapp-flow-connector/create-logic-app.png)
+1. Select a **Subscription**, **Resource group**, and **Region** to store the new logic app and then give it a unique name. You can turn on the **Log Analytics** setting to collect information about runtime data and events as described in [Set up Azure Monitor logs and collect diagnostics data for Azure Logic Apps](../../logic-apps/monitor-workflows-collect-diagnostic-data.md). This setting isn't required for using the Azure Monitor Logs connector.

+   ![Screenshot that shows the Basics tab on the logic app creation screen.](media/logicapp-flow-connector/create-logic-app.png)

 1. Select **Review + create** > **Create**.
 1. When the deployment is complete, select **Go to resource** to open the **Logic Apps Designer**.

-### Create a trigger for the logic app
+### Create a trigger for the logic app workflow
 1. Under **Start with a common trigger**, select **Recurrence**.

-   This creates a logic app that automatically runs at a regular interval.
+   This creates a logic app workflow that automatically runs at a regular interval.

 1. In the **Frequency** box of the action, select **Day** and in the **Interval** box, enter **1** to run the workflow once per day.

 ![Screenshot that shows the Logic Apps Designer "Recurrence" window on which you can set the interval and frequency at which the logic app runs.](media/logicapp-flow-connector/recurrence-action.png)

 ## Walkthrough: Mail visualized results
-This tutorial shows how to create a logic app that sends the results of an Azure Monitor log query by email.
+This tutorial shows how to create a logic app workflow that sends the results of an Azure Monitor log query by email.

 ### Add Azure Monitor Logs action
 1. Select **+ New step** to add an action that runs after the recurrence action.
@@ -112,23 +111,23 @@ This tutorial shows how to create a logic app that sends the results of an Azure
 1. Specify the email address of a recipient in the **To** window and a subject for the email in **Subject**.

-   ![Screenshot of the settings for the new Send an email (V2) action, showing the subject line and email recepients being defined.](media/logicapp-flow-connector/mail-action.png)
+   ![Screenshot of the settings for the new Send an email (V2) action, showing the subject line and email recipients being defined.](media/logicapp-flow-connector/mail-action.png)

-### Save and test your logic app
-1. Select **Save** and then **Run** to perform a test run of the logic app.
+### Save and test your workflow
+1. Select **Save** and then **Run** to perform a test run of the workflow.

 ![Save and run](media/logicapp-flow-connector/save-run.png)

-When the logic app completes, check the mail of the recipient that you specified. You should receive a mail with a body similar to the following:
+When the workflow completes, check the mail of the recipient that you specified. You should receive a mail with a body similar to the following:

 ![An image of a sample email.](media/logicapp-flow-connector/sample-mail.png)

 > [!NOTE]
-> The log app generates an email with a JPG file that depicts the query result set. If your query doesn't return results, the logic app won't create a JPG file.
+> The workflow generates an email with a JPG file that depicts the query result set. If your query doesn't return results, the workflow won't create a JPG file.

 ## Next steps

 - Learn more about [log queries in Azure Monitor](./log-query-overview.md).
-- Learn more about [Logic Apps](../../logic-apps/index.yml)
+- Learn more about [Azure Logic Apps](../../logic-apps/index.yml)
 - Learn more about [Power Automate](https://make.powerautomate.com).
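As context for the Recurrence trigger the tutorial above configures in the designer (**Frequency** = Day, **Interval** = 1): in a workflow definition, that same trigger corresponds to a JSON fragment along these lines — a sketch of the Workflow Definition Language shape, not code taken from the article:

```json
{
  "triggers": {
    "Recurrence": {
      "type": "Recurrence",
      "recurrence": {
        "frequency": "Day",
        "interval": 1
      }
    }
  }
}
```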
