
Commit f80ad6e

committed
update
1 parent a882b5c commit f80ad6e

File tree: 1 file changed, +35 -12 lines changed


articles/synapse-analytics/spark/apache-spark-azure-log-analytics.md

Lines changed: 35 additions & 12 deletions
@@ -39,9 +39,11 @@ Gather the following values for the spark configuration:
- `<LOG_ANALYTICS_WORKSPACE_KEY>`: Log Analytics key. To find this, in the Azure portal, go to **Azure Log Analytics workspace** > **Agents** > **Primary key**.

```properties
-spark.synapse.logAnalytics.enabled true
-spark.synapse.logAnalytics.workspaceId <LOG_ANALYTICS_WORKSPACE_ID>
-spark.synapse.logAnalytics.secret <LOG_ANALYTICS_WORKSPACE_KEY>
+spark.synapse.diagnostic.emitters: LA
+spark.synapse.diagnostic.emitter.LA.type: "AzureLogAnalytics"
+spark.synapse.diagnostic.emitter.LA.categories: "Log,EventLog,Metrics"
+spark.synapse.diagnostic.emitter.LA.workspaceId: <LOG_ANALYTICS_WORKSPACE_ID>
+spark.synapse.diagnostic.emitter.LA.secret: <LOG_ANALYTICS_WORKSPACE_KEY>
```

#### Option 2: Configure with Azure Key Vault
@@ -65,10 +67,12 @@ To configure Azure Key Vault to store the workspace key, follow these steps:
- `<AZURE_KEY_VAULT_SECRET_KEY_NAME>` (optional): The secret name in the key vault for the workspace key. The default is `SparkLogAnalyticsSecret`.

```properties
-spark.synapse.logAnalytics.enabled true
-spark.synapse.logAnalytics.workspaceId <LOG_ANALYTICS_WORKSPACE_ID>
-spark.synapse.logAnalytics.keyVault.name <AZURE_KEY_VAULT_NAME>
-spark.synapse.logAnalytics.keyVault.key.secret <AZURE_KEY_VAULT_SECRET_KEY_NAME>
+spark.synapse.diagnostic.emitters: LA
+spark.synapse.diagnostic.emitter.LA.type: "AzureLogAnalytics"
+spark.synapse.diagnostic.emitter.LA.categories: "Log,EventLog,Metrics"
+spark.synapse.diagnostic.emitter.LA.workspaceId: <LOG_ANALYTICS_WORKSPACE_ID>
+spark.synapse.diagnostic.emitter.LA.secret.keyVault: <AZURE_KEY_VAULT_NAME>
+spark.synapse.diagnostic.emitter.LA.secret.keyVault.secretName: <AZURE_KEY_VAULT_SECRET_KEY_NAME>
```

> [!NOTE]
@@ -92,13 +96,16 @@ To configure a Key Vault linked service in Synapse Studio to store the workspace

d. Choose your key vault, and select **Create**.

-1. Add a `spark.synapse.logAnalytics.keyVault.linkedServiceName` item to the Apache Spark configuration.
+1. Add a `spark.synapse.diagnostic.emitter.LA.secret.keyVault.linkedService` item to the Apache Spark configuration.

```properties
-spark.synapse.logAnalytics.enabled true
-spark.synapse.logAnalytics.workspaceId <LOG_ANALYTICS_WORKSPACE_ID>
-spark.synapse.logAnalytics.keyVault.key.secret <AZURE_KEY_VAULT_SECRET_KEY_NAME>
-spark.synapse.logAnalytics.keyVault.linkedServiceName <LINKED_SERVICE_NAME>
+spark.synapse.diagnostic.emitters: LA
+spark.synapse.diagnostic.emitter.LA.type: "AzureLogAnalytics"
+spark.synapse.diagnostic.emitter.LA.categories: "Log,EventLog,Metrics"
+spark.synapse.diagnostic.emitter.LA.workspaceId: <LOG_ANALYTICS_WORKSPACE_ID>
+spark.synapse.diagnostic.emitter.LA.secret.keyVault: <AZURE_KEY_VAULT_NAME>
+spark.synapse.diagnostic.emitter.LA.secret.keyVault.secretName: <AZURE_KEY_VAULT_SECRET_KEY_NAME>
+spark.synapse.diagnostic.emitter.LA.secret.keyVault.linkedService: <AZURE_KEY_VAULT_LINKED_SERVICE>
```

For a list of Apache Spark configurations, see [Available Apache Spark configurations](../monitor-synapse-analytics-reference.md#available-apache-spark-configurations)
@@ -245,6 +252,22 @@ You can follow below steps to create a managed private endpoint connection to Az
> - The AMPLS object has a number of limits you should consider when planning your Private Link setup. See [AMPLS limits](/azure/azure-monitor/logs/private-link-security) for a deeper review of these limits.
> - Check if you have [right permission](../security/synapse-workspace-access-control-overview.md) to create managed private endpoint.

+## Available configurations
+
+| Configuration | Description |
+| --- | --- |
+| `spark.synapse.diagnostic.emitters` | Required. The comma-separated destination names of diagnostic emitters. For example, `MyDest1,MyDest2`. |
+| `spark.synapse.diagnostic.emitter.<destination>.type` | Required. Built-in destination type. To enable the Azure Log Analytics destination, `AzureLogAnalytics` needs to be included in this field. |
+| `spark.synapse.diagnostic.emitter.<destination>.categories` | Optional. The comma-separated selected log categories. Available values include `DriverLog`, `ExecutorLog`, `EventLog`, `Metrics`. If not set, the default value is **all** categories. |
+| `spark.synapse.diagnostic.emitter.<destination>.workspaceId` | Required. To enable the Azure Log Analytics destination, `workspaceId` needs to be included in this field. |
+| `spark.synapse.diagnostic.emitter.<destination>.secret` | Optional. The secret (Log Analytics key) content. To find this, in the Azure portal, go to Azure Log Analytics workspace > Agents > Primary key. |
+| `spark.synapse.diagnostic.emitter.<destination>.secret.keyVault` | Required if `.secret` is not specified. The [Azure Key Vault](/azure/key-vault/general/overview) name where the secret (the workspace key) is stored. |
+| `spark.synapse.diagnostic.emitter.<destination>.secret.keyVault.secretName` | Required if `.secret.keyVault` is specified. The Azure Key Vault secret name where the secret is stored. |
+| `spark.synapse.diagnostic.emitter.<destination>.secret.keyVault.linkedService` | Optional. The Azure Key Vault linked service name. Required to obtain the secret from Key Vault when the application is submitted from a Synapse pipeline. (Make sure the MSI has read permission on the Key Vault.) |
+| `spark.synapse.diagnostic.emitter.<destination>.filter.eventName.match` | Optional. The comma-separated Spark event names; you can specify which events to collect. For example: `SparkListenerApplicationStart,SparkListenerApplicationEnd`. |
+| `spark.synapse.diagnostic.emitter.<destination>.filter.loggerName.match` | Optional. The comma-separated Log4j logger names; you can specify which logs to collect. For example: `org.apache.spark.SparkContext,org.example.Logger`. |
+| `spark.synapse.diagnostic.emitter.<destination>.filter.metricName.match` | Optional. The comma-separated Spark metric name suffixes; you can specify which metrics to collect. For example: `jvm.heap.used`. |
+
## Related content

- [Run a Spark application in notebook](./apache-spark-development-using-notebooks.md).
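For illustration, the emitter settings introduced in this change can be combined with the optional filter settings from the configurations table above. This is a hedged sketch: the destination name `LA` and the specific filter values are example choices, not required names.

```properties
# Illustrative sketch: destination name "LA" and the filter values below are example choices.
spark.synapse.diagnostic.emitters: LA
spark.synapse.diagnostic.emitter.LA.type: "AzureLogAnalytics"
spark.synapse.diagnostic.emitter.LA.categories: "Log,EventLog,Metrics"
spark.synapse.diagnostic.emitter.LA.workspaceId: <LOG_ANALYTICS_WORKSPACE_ID>
spark.synapse.diagnostic.emitter.LA.secret: <LOG_ANALYTICS_WORKSPACE_KEY>
# Optional filters: collect only the named events, loggers, and metric suffixes.
spark.synapse.diagnostic.emitter.LA.filter.eventName.match: SparkListenerApplicationStart,SparkListenerApplicationEnd
spark.synapse.diagnostic.emitter.LA.filter.loggerName.match: org.apache.spark.SparkContext
spark.synapse.diagnostic.emitter.LA.filter.metricName.match: jvm.heap.used
```

Per the table, the filter settings are optional, so omitting those lines collects everything in the selected categories.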
