articles/synapse-analytics/spark/apache-spark-azure-log-analytics.md (1 addition, 1 deletion)
@@ -288,7 +288,7 @@ You can follow below steps to create a managed private endpoint connection to Az
|`spark.synapse.diagnostic.emitter.<destination>.type`| Required. Built-in destination type. To enable Azure Log Analytics destination, AzureLogAnalytics needs to be included in this field.|
|`spark.synapse.diagnostic.emitter.<destination>.categories`| Optional. The comma-separated selected log categories. Available values include `DriverLog`, `ExecutorLog`, `EventLog`, `Metrics`. If not set, the default value is **all** categories. |
|`spark.synapse.diagnostic.emitter.<destination>.workspaceId`| Required. To enable Azure Log Analytics destination, workspaceId needs to be included in this field. |
- |`spark.synapse.diagnostic.emitter.<destination>.secret`| Optional. The secret (Log Aanalytics key) content. To find this, in the Azure portal, go to Azure Log Analytics workspace > Agents > Primary key. |
+ |`spark.synapse.diagnostic.emitter.<destination>.secret`| Optional. The secret (Log Analytics key) content. To find this, in the Azure portal, go to Azure Log Analytics workspace > Agents > Primary key. |
|`spark.synapse.diagnostic.emitter.<destination>.secret.keyVault`| Required if `.secret` is not specified. The [Azure Key vault](/azure/key-vault/general/overview) name where the secret (AccessKey or SAS) is stored. |
|`spark.synapse.diagnostic.emitter.<destination>.secret.keyVault.secretName`| Required if `.secret.keyVault` is specified. The Azure Key vault secret name where the secret is stored. |
|`spark.synapse.diagnostic.emitter.<destination>.secret.keyVault.linkedService`| Optional. The Azure Key vault linked service name. When enabled in Synapse pipeline, this is necessary to obtain the secret from Azure Key vault. (Make sure the MSI has read access to the Azure Key vault). |
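Taken together, these properties translate into a few lines in the Spark pool configuration file. A minimal sketch for the Azure Log Analytics destination, assuming a destination named `LogAnalyticsDest`, placeholder workspace and key values, and the `spark.synapse.diagnostic.emitters` property that lists destination names (it sits earlier in the full table, outside this hunk):

```properties
# Illustrative Spark pool configuration for the Azure Log Analytics emitter.
# LogAnalyticsDest is an example destination name; IDs and keys are placeholders.
spark.synapse.diagnostic.emitters LogAnalyticsDest
spark.synapse.diagnostic.emitter.LogAnalyticsDest.type AzureLogAnalytics
spark.synapse.diagnostic.emitter.LogAnalyticsDest.categories DriverLog,ExecutorLog,EventLog,Metrics
spark.synapse.diagnostic.emitter.LogAnalyticsDest.workspaceId <log-analytics-workspace-id>
spark.synapse.diagnostic.emitter.LogAnalyticsDest.secret <log-analytics-primary-key>
```

Per the table above, the `.secret` line can instead be replaced with the `.secret.keyVault` and `.secret.keyVault.secretName` properties when the key is stored in Azure Key Vault.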
articles/synapse-analytics/spark/azure-synapse-diagnostic-emitters-azure-eventhub.md (9 additions, 9 deletions)
@@ -1,6 +1,6 @@
---
title: Collect your Apache Spark applications logs and metrics using Azure Event Hubs
- description: In this tutorial, you learn how to use the Synapse Apache Spark diagnostic emitter extension to emit Apache Spark applications’ logs, event logs and metrics to your Azure Event Hubs.
+ description: In this tutorial, you learn how to use the Synapse Apache Spark diagnostic emitter extension to emit Apache Spark applications' logs, event logs, and metrics to your Azure Event Hubs.
author: hrasheed-msft
ms.author: jejiang
@@ -14,7 +14,7 @@ ms.date: 08/31/2021
The Synapse Apache Spark diagnostic emitter extension is a library that enables the Apache Spark application to emit the logs, event logs, and metrics to one or more destinations, including Azure Log Analytics, Azure Storage, and Azure Event Hubs.
- In this tutorial, you learn how to use the Synapse Apache Spark diagnostic emitter extension to emit Apache Spark applications’ logs, event logs, and metrics to your Azure Event Hubs.
+ In this tutorial, you learn how to use the Synapse Apache Spark diagnostic emitter extension to emit Apache Spark applications' logs, event logs, and metrics to your Azure Event Hubs.
Fill in the following parameters in the configuration file: `<connection-string>`.
- For more description of the parameters, you can refer to [Azure Event Hubs configurations](#available-configurations).
+ For more descriptions of the parameters, you can refer to [Azure Event Hubs configurations](#available-configurations).
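As a sketch of what that configuration file might contain, assuming a destination named `MyDestination1` and leaving `<connection-string>` as the placeholder to fill in:

```properties
# Illustrative Apache Spark configuration file for the Azure Event Hubs emitter.
# MyDestination1 is an example destination name; the connection string must include EntityPath.
spark.synapse.diagnostic.emitters MyDestination1
spark.synapse.diagnostic.emitter.MyDestination1.type AzureEventHub
spark.synapse.diagnostic.emitter.MyDestination1.categories DriverLog,ExecutorLog,EventLog,Metrics
spark.synapse.diagnostic.emitter.MyDestination1.secret <connection-string>
```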
### Step 3: Upload the Apache Spark configuration file to Apache Spark pool
@@ -51,21 +51,21 @@ For more description of the parameters, you can refer to [Azure Event Hubs confi
|`spark.synapse.diagnostic.emitter.<destination>.type`| Required. Built-in destination type. To enable Azure Event Hubs destination, the value should be `AzureEventHub`. |
|`spark.synapse.diagnostic.emitter.<destination>.categories`| Optional. The comma-separated selected log categories. Available values include `DriverLog`, `ExecutorLog`, `EventLog`, `Metrics`. If not set, the default value is **all** categories. |
|`spark.synapse.diagnostic.emitter.<destination>.secret`| Optional. The Azure Event Hubs instance connection string. This field should match this pattern `Endpoint=sb://<FQDN>/;SharedAccessKeyName=<KeyName>;SharedAccessKey=<KeyValue>;EntityPath=<PathName>`|
- |`spark.synapse.diagnostic.emitter.<destination>.secret.keyVault`| Required if `.secret`is not specified. The [Azure Key vault](/azure/key-vault/general/overview) name where the secret (connection string) is stored. |
+ |`spark.synapse.diagnostic.emitter.<destination>.secret.keyVault`| Required if `.secret`isn't specified. The [Azure Key vault](/azure/key-vault/general/overview) (AKV) name where the secret (connection string) is stored. |
|`spark.synapse.diagnostic.emitter.<destination>.secret.keyVault.secretName`| Required if `.secret.keyVault` is specified. The Azure Key vault secret name where the secret (connection string) is stored. |
- |`spark.synapse.diagnostic.emitter.<destination>.secret.keyVault.linkedService`| Optional. The Azure Key vault linked service name. When enabled in Synapse pipeline, this is necessary to obtain the secret from AKV. (Please make sure MSI has read permission on the AKV). |
+ |`spark.synapse.diagnostic.emitter.<destination>.secret.keyVault.linkedService`| Optional. The Azure Key vault linked service name. When enabled in Synapse pipeline, this is required to obtain the secret from AKV. (Make sure managed service identity (MSI) has read permission on the AKV). |
|`spark.synapse.diagnostic.emitter.<destination>.filter.eventName.match`| Optional. The comma-separated spark event names, you can specify which events to collect. For example: `SparkListenerApplicationStart,SparkListenerApplicationEnd`|
- |`spark.synapse.diagnostic.emitter.<destination>.filter.loggerName.match`| Optional. The comma-separated log4j logger names, you can specify which logs to collect. For example: `org.apache.spark.SparkContext,org.example.Logger`|
+ |`spark.synapse.diagnostic.emitter.<destination>.filter.loggerName.match`| Optional. The comma-separated Log4j logger names, you can specify which logs to collect. For example: `org.apache.spark.SparkContext,org.example.Logger`|
|`spark.synapse.diagnostic.emitter.<destination>.filter.metricName.match`| Optional. The comma-separated spark metric name suffixes, you can specify which metrics to collect. For example: `jvm.heap.used`|
> [!NOTE]
>
- > The Azure Eventhub instance connection string should always contains the`EntityPath`, which is the name of the Azure Event Hubs instance.
+ > The Azure Event hubs instance connection string should always contain`EntityPath`, which is the name of the Azure Event Hubs instance.
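When the connection string is stored in Azure Key Vault rather than inline, the `.secret.keyVault*` properties replace `.secret`, and the optional filter properties narrow what gets collected. A sketch combining them, with illustrative vault, secret, and linked-service names and the filter values reused from the table above:

```properties
# Key Vault-based secret instead of an inline connection string (names are illustrative).
spark.synapse.diagnostic.emitters MyDestination1
spark.synapse.diagnostic.emitter.MyDestination1.type AzureEventHub
spark.synapse.diagnostic.emitter.MyDestination1.secret.keyVault <key-vault-name>
spark.synapse.diagnostic.emitter.MyDestination1.secret.keyVault.secretName <secret-name>
# Needed when the configuration is used from a Synapse pipeline, per the table above:
spark.synapse.diagnostic.emitter.MyDestination1.secret.keyVault.linkedService <akv-linked-service-name>
# Optional filters: collect only selected events, loggers, and metric suffixes.
spark.synapse.diagnostic.emitter.MyDestination1.filter.eventName.match SparkListenerApplicationStart,SparkListenerApplicationEnd
spark.synapse.diagnostic.emitter.MyDestination1.filter.loggerName.match org.apache.spark.SparkContext,org.example.Logger
spark.synapse.diagnostic.emitter.MyDestination1.filter.metricName.match jvm.heap.used
```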
## Log data sample
- Here is a sample log record in JSON format:
+ Here's a sample log record in JSON format:
```json
{
@@ -91,4 +91,4 @@ Here is a sample log record in JSON format:
## Synapse workspace with data exfiltration protection enabled
- Azure Synapse Analytics workspaces support enabling data exfiltration protection for workspaces. With exfiltration protection, the logs and metrics cannot be sent out to the destination endpoints directly. You can create corresponding [managed private endpoints](../../synapse-analytics/security/synapse-workspace-managed-private-endpoints.md) for different destination endpoints or [create IP firewall rules](../../synapse-analytics/security/synapse-workspace-ip-firewall.md) in this scenario.
+ Azure Synapse Analytics workspaces support enabling data exfiltration protection for workspaces. With exfiltration protection, the logs and metrics can't be sent out to the destination endpoints directly. You can create corresponding [managed private endpoints](../../synapse-analytics/security/synapse-workspace-managed-private-endpoints.md) for different destination endpoints or [create IP firewall rules](../../synapse-analytics/security/synapse-workspace-ip-firewall.md) in this scenario.