**articles/synapse-analytics/spark/apache-spark-azure-log-analytics.md** (13 additions, 13 deletions)

```diff
@@ -1,5 +1,5 @@
 ---
-title: Monitor Apache Spark applications with Azure Log Analytics (preview)
+title: Monitor Apache Spark applications with Azure Log Analytics
 description: Learn how to enable the Synapse Studio connector for collecting and sending the Apache Spark application metrics and logs to your Log Analytics workspace.
 services: synapse-analytics
 author: jejiang
@@ -11,7 +11,7 @@ ms.subservice: spark
 ms.date: 03/25/2021
 ms.custom: references_regions
 ---
-# Monitor Apache Spark applications with Azure Log Analytics (preview)
+# Monitor Apache Spark applications with Azure Log Analytics

 In this tutorial, you learn how to enable the Synapse Studio connector that's built in to Log Analytics. You can then collect and send Apache Spark application metrics and logs to your [Log Analytics workspace](../../azure-monitor/logs/quick-create-workspace.md). Finally, you can use an Azure Monitor workbook to visualize the metrics and logs.
@@ -26,7 +26,7 @@ Consult one of the following resources to create this workspace:
 - [Create a workspace with Azure CLI](../../azure-monitor/logs/resource-manager-workspace.md)
 - [Create and configure a workspace in Azure Monitor by using PowerShell](../../azure-monitor/logs/powershell-workspace-configuration.md)

-### Step 2: Prepare a Apache Spark configuration file
+### Step 2: Prepare an Apache Spark configuration file

 Use any of the following options to prepare the file.
@@ … @@
-### Step 3: Upload your Apache Spark configuration to a Apache Spark pool
+### Step 3: Upload your Apache Spark configuration to an Apache Spark pool

 You can upload the configuration file to your Azure Synapse Analytics Apache Spark pool. In Synapse Studio:

 1. Select **Manage** > **Apache Spark pools**.
@@ -140,13 +140,13 @@ You can upload the configuration file to your Azure Synapse Analytics Apache Spa
 >
 > All the Apache Spark applications submitted to the Apache Spark pool will use the configuration setting to push the Apache Spark application metrics and logs to your specified workspace.

-## Submit a Apache Spark application and view the logs and metrics
+## Submit an Apache Spark application and view the logs and metrics

 Here's how:

-1. Submit a Apache Spark application to the Apache Spark pool configured in the previous step. You can use any of the following ways to do so:
+1. Submit an Apache Spark application to the Apache Spark pool configured in the previous step. You can use any of the following ways to do so:

    - Run a notebook in Synapse Studio.
-   - In Synapse Studio, submit an Apache Spark batch job through a Apache Spark job definition.
+   - In Synapse Studio, submit an Apache Spark batch job through an Apache Spark job definition.
   - Run a pipeline that contains Apache Spark activity.

 1. Go to the specified Log Analytics workspace, and then view the application metrics and logs when the Apache Spark application starts to run.
@@ -240,20 +240,20 @@ Users can query to evaluate metrics and logs at a set frequency, and fire an alert

 After the Synapse workspace is created with [data exfiltration protection](../security/workspace-data-exfiltration-protection.md) enabled.
-when you want to enabled this feature, you need to create managed private endpoint connection requests to [Azure Monitor private link scopes (AMPLS)](../../azure-monitor/logs/private-link-security.md) in the workspace’s approved Azure AD tenants.
+When you want to enable this feature, you need to create managed private endpoint connection requests to [Azure Monitor private link scopes (AMPLS)](../../azure-monitor/logs/private-link-security.md) in the workspace’s approved Azure AD tenants.

 You can follow the steps below to create a managed private endpoint connection to Azure Monitor private link scopes (AMPLS):

-1. If there is no existing AMPLS, please follow [Azure Monitor Private Link connection setup](../../azure-monitor/logs/private-link-security.md) to create one.
+1. If there is no existing AMPLS, you can follow [Azure Monitor Private Link connection setup](../../azure-monitor/logs/private-link-security.md) to create one.
 2. Navigate to your AMPLS in the Azure portal, on the **Azure Monitor Resources** page, click **Add** to add a connection to your Azure Log Analytics workspace.
-3. Navigate to **Synapse Studio > Manage > Managed private endpoints**, click **New** button, select **Azure Monitor Private Link Scopes** and **continue**.
+3. Navigate to **Synapse Studio > Manage > Managed private endpoints**, click the **New** button, select **Azure Monitor Private Link Scopes**, and **continue**.
@@ … @@
 5. Wait a few minutes for private endpoint provisioning.
-6. Navigate to your AMPLS in Azure portal again, on the **Private Endpoint connections** page, select the connection just provisioned and **Approve**.
+6. Navigate to your AMPLS in the Azure portal again, on the **Private Endpoint connections** page, select the connection provisioned and **Approve**.

 > [!NOTE]
 > - The AMPLS object has a number of limits you should consider when planning your Private Link setup. See [AMPLS limits](../../azure-monitor/logs/private-link-security.md) for a deeper review of these limits.
```
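For context, the configuration file this diff's Step 2 and Step 3 refer to is a small Spark properties file. A minimal sketch, assuming the documented `spark.synapse.logAnalytics.*` connector settings; the workspace ID and key placeholders are values you must supply from your own Log Analytics workspace:

```properties
# Enable the built-in Log Analytics connector for the Apache Spark pool
spark.synapse.logAnalytics.enabled true

# Target workspace ID and key (placeholders; copy them from your
# Log Analytics workspace in the Azure portal)
spark.synapse.logAnalytics.workspaceId <LOG_ANALYTICS_WORKSPACE_ID>
spark.synapse.logAnalytics.secret <LOG_ANALYTICS_WORKSPACE_KEY>
```

Uploading this file to the pool (Step 3) applies it to every Apache Spark application subsequently submitted to that pool; the connector also supports referencing the workspace key from Azure Key Vault instead of embedding it in plain text.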
**articles/synapse-analytics/spark/azure-synapse-diagnostic-emitters-azure-eventhub.md** (6 additions, 6 deletions)

```diff
@@ -1,5 +1,5 @@
 ---
-title: Collect your Apache Spark applications logs and metrics using Azure Event Hubs (preview)
+title: Collect your Apache Spark applications logs and metrics using Azure Event Hubs
 description: In this tutorial, you learn how to use the Synapse Apache Spark diagnostic emitter extension to emit Apache Spark applications’ logs, event logs and metrics to your Azure Event Hubs.
 services: synapse-analytics
 author: hrasheed-msft
@@ -11,11 +11,11 @@ ms.subservice: spark
 ms.date: 08/31/2021
 ---

-# Collect your Apache Spark applications logs and metrics using Azure Event Hubs (preview)
+# Collect your Apache Spark applications logs and metrics using Azure Event Hubs

 The Synapse Apache Spark diagnostic emitter extension is a library that enables the Apache Spark application to emit the logs, event logs, and metrics to one or more destinations, including Azure Log Analytics, Azure Storage, and Azure Event Hubs.

-In this tutorial, you learn how to use the Synapse Apache Spark diagnostic emitter extension to emit Apache Spark applications’ logs, event logs and metrics to your Azure Event Hubs.
+In this tutorial, you learn how to use the Synapse Apache Spark diagnostic emitter extension to emit Apache Spark applications’ logs, event logs, and metrics to your Azure Event Hubs.

 ## Collect logs and metrics to Azure Event Hubs
@@ -24,7 +24,7 @@ In this tutorial, you learn how to use the Synapse Apache Spark diagnostic emitt
 To collect diagnostic logs and metrics to Azure Event Hubs, you can use an existing Azure Event Hubs instance.
 Or if you don't have one, you can [create an event hub](../../event-hubs/event-hubs-create.md).

-### Step 2: Create a Apache Spark configuration file
+### Step 2: Create an Apache Spark configuration file

 Create a `diagnostic-emitter-azure-event-hub-conf.txt` and copy the following contents to the file. Or download a [sample template file](https://go.microsoft.com/fwlink/?linkid=2169375) for Apache Spark pool configuration.
@@ … @@
 Fill in the following parameters in the configuration file: `<connection-string>`.
-For more description of the parameters, please refer to [Azure EventHub configurations](#available-configurations)
+For more description of the parameters, you can refer to [Azure EventHub configurations](#available-configurations).

 ### Step 3: Upload the Apache Spark configuration file to the Apache Spark pool
@@ -91,7 +91,7 @@ Here is a sample log record in JSON format:

 ## Synapse workspace with data exfiltration protection enabled

-Azure Synapse Analytics workspaces support enabling data exfiltration protection for workspaces. With exfiltration protection, the logs and metrics can not be sent out to the destination endpoints directly. You can create corresponding [managed private endpoints](../../synapse-analytics/security/synapse-workspace-managed-private-endpoints.md) for different destination endpoints or [create IP firewall rules](../../synapse-analytics/security/synapse-workspace-ip-firewall.md) in this scenario.
+Azure Synapse Analytics workspaces support enabling data exfiltration protection for workspaces. With exfiltration protection, the logs and metrics cannot be sent out to the destination endpoints directly. You can create corresponding [managed private endpoints](../../synapse-analytics/security/synapse-workspace-managed-private-endpoints.md) for different destination endpoints or [create IP firewall rules](../../synapse-analytics/security/synapse-workspace-ip-firewall.md) in this scenario.
```
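The `diagnostic-emitter-azure-event-hub-conf.txt` file from this diff's Step 2 uses the emitter extension's named-destination scheme. A hedged sketch, assuming the documented `spark.synapse.diagnostic.emitter.*` keys; the destination name `MyDestination1` is an arbitrary illustrative label, and `<connection-string>` is the placeholder the article tells you to fill in:

```properties
# Declare one named emitter destination for the Spark pool
spark.synapse.diagnostic.emitters MyDestination1

# Route diagnostics to Azure Event Hubs and select which categories to emit
spark.synapse.diagnostic.emitter.MyDestination1.type AzureEventHub
spark.synapse.diagnostic.emitter.MyDestination1.categories Log,EventLog,Metrics

# Event Hubs connection string, including the EntityPath of the target event hub
spark.synapse.diagnostic.emitter.MyDestination1.secret <connection-string>
```

Because `spark.synapse.diagnostic.emitters` takes a comma-separated list, the same file can declare additional destinations (for example, a second emitter of type `AzureStorage`) side by side.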
**articles/synapse-analytics/spark/azure-synapse-diagnostic-emitters-azure-storage.md** (6 additions, 6 deletions)

```diff
@@ -1,5 +1,5 @@
 ---
-title: Collect your Apache Spark applications logs and metrics using Azure Storage account(preview)
+title: Collect your Apache Spark applications logs and metrics using Azure Storage account
 description: This article shows how to use the Synapse Spark diagnostic emitter extension to collect logs, event logs and metrics.cluster and learn how to integrate the Grafana dashboards.
 services: synapse-analytics
 author: hrasheed-msft
@@ -11,11 +11,11 @@ ms.subservice: spark
 ms.date: 08/31/2021
 ---

-# Collect your Apache Spark applications logs and metrics using Azure Storage account(preview)
+# Collect your Apache Spark applications logs and metrics using Azure Storage account

 The Synapse Apache Spark diagnostic emitter extension is a library that enables the Apache Spark application to emit the logs, event logs, and metrics to one or more destinations, including Azure Log Analytics, Azure Storage, and Azure Event Hubs.

-In this tutorial, you learn how to use the Synapse Apache Spark diagnostic emitter extension to emit Apache Spark applications’ logs, event logs and metrics to your Azure storage account.
+In this tutorial, you learn how to use the Synapse Apache Spark diagnostic emitter extension to emit Apache Spark applications’ logs, event logs, and metrics to your Azure storage account.
@@ … @@
 3. Wait a few minutes for private endpoint provisioning.
-4. Navigate to your storage account in Azure portal, on the **Networking** > **Private Endpoint connections** page, select the connection just provisioned and **Approve**.
+4. Navigate to your storage account in the Azure portal, on the **Networking** > **Private Endpoint connections** page, select the connection provisioned and **Approve**.
```
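The storage-account variant of the emitter configuration follows the same named-destination scheme as the Event Hubs one. A hedged sketch, assuming the documented `AzureStorage` emitter type and `AccessKey` authentication; the destination name and all angle-bracket values are illustrative placeholders:

```properties
# Declare one named emitter destination for the Spark pool
spark.synapse.diagnostic.emitters MyDestination1

# Emit diagnostics to Azure Storage and select which categories to write
spark.synapse.diagnostic.emitter.MyDestination1.type AzureStorage
spark.synapse.diagnostic.emitter.MyDestination1.categories Log,EventLog,Metrics

# Destination container/folder URI plus the storage access key
spark.synapse.diagnostic.emitter.MyDestination1.uri https://<my-storage-account>.blob.core.windows.net/<container-name>/<folder-name>
spark.synapse.diagnostic.emitter.MyDestination1.auth AccessKey
spark.synapse.diagnostic.emitter.MyDestination1.secret <storage-access-key>
```

In a workspace with data exfiltration protection enabled (the scenario the last hunk above touches), this destination is only reachable after the managed private endpoint to the storage account is approved.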