> You can also store the workspace ID in Key Vault. Refer to the preceding steps, and store the workspace ID with the secret name `SparkLogAnalyticsWorkspaceId`. Alternatively, you can use the configuration `spark.synapse.logAnalytics.keyVault.key.workspaceId` to specify the workspace ID secret name in Key Vault.
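
As an illustration of the Key Vault-based setup, the workspace ID and workspace key can both be resolved from Key Vault secrets. This is a sketch only; apart from `spark.synapse.logAnalytics.keyVault.key.workspaceId`, the property names are assumed from the Apache Spark configuration reference linked below.

```properties
# Sketch only: enable the Log Analytics destination and resolve both the workspace ID
# and the workspace key from Azure Key Vault secrets. Property names other than
# spark.synapse.logAnalytics.keyVault.key.workspaceId are assumptions taken from the
# configuration reference; replace the <placeholders> with your own values.
spark.synapse.logAnalytics.enabled true
spark.synapse.logAnalytics.keyVault.name <key-vault-name>
spark.synapse.logAnalytics.keyVault.key.secret <log-analytics-key-secret-name>
spark.synapse.logAnalytics.keyVault.key.workspaceId SparkLogAnalyticsWorkspaceId
```
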
For a list of Apache Spark configurations, see [Available Apache Spark configurations](../monitor-synapse-analytics-reference.md#available-apache-spark-configurations).
### Step 3: Create an Apache Spark Configuration
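
A minimal sketch of the name/value pairs such a configuration might contain when the workspace ID and key are supplied directly (the property names are assumed from the Apache Spark configuration reference above):

```properties
# Sketch only: send Spark logs and metrics to Log Analytics using the workspace ID
# and key directly. Property names are assumptions; prefer the Key Vault-based
# variant shown earlier for anything beyond testing.
spark.synapse.logAnalytics.enabled true
spark.synapse.logAnalytics.workspaceId <log-analytics-workspace-id>
spark.synapse.logAnalytics.secret <log-analytics-workspace-key>
```
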
You can follow the steps below to create a managed private endpoint connection to an Azure Monitor Private Link Scope (AMPLS).
> - The AMPLS object has a number of limits you should consider when planning your Private Link setup. See [AMPLS limits](/azure/azure-monitor/logs/private-link-security) for a deeper review of these limits.
> - Check that you have the [right permissions](../security/synapse-workspace-access-control-overview.md) to create a managed private endpoint.
## Available configurations
| Configuration | Description |
| --- | --- |
|`spark.synapse.diagnostic.emitters`| Required. The comma-separated destination names of diagnostic emitters. For example, `MyDest1,MyDest2`|
|`spark.synapse.diagnostic.emitter.<destination>.type`| Required. Built-in destination type. To enable the Azure Log Analytics destination, set this field to `AzureLogAnalytics`.|
|`spark.synapse.diagnostic.emitter.<destination>.categories`| Optional. The comma-separated selected log categories. Available values include `DriverLog`, `ExecutorLog`, `EventLog`, `Metrics`. If not set, the default value is **all** categories. |
|`spark.synapse.diagnostic.emitter.<destination>.workspaceId`| Required. The ID of the Azure Log Analytics workspace that the diagnostics are sent to. |
|`spark.synapse.diagnostic.emitter.<destination>.secret`| Optional. The secret (Log Analytics key) content. To find this, in the Azure portal, go to Azure Log Analytics workspace > Agents > Primary key. |
|`spark.synapse.diagnostic.emitter.<destination>.secret.keyVault`| Required if `.secret` isn't specified. The [Azure Key Vault](/azure/key-vault/general/overview) name where the secret (Log Analytics key) is stored. |
|`spark.synapse.diagnostic.emitter.<destination>.secret.keyVault.secretName`| Required if `.secret.keyVault` is specified. The Azure Key Vault secret name where the secret is stored. |
|`spark.synapse.diagnostic.emitter.<destination>.secret.keyVault.linkedService`| Optional. The Azure Key Vault linked service name. When enabled in a Synapse pipeline, this is necessary to obtain the secret from Azure Key Vault. (Make sure the managed identity has read permission on the key vault.) |
|`spark.synapse.diagnostic.emitter.<destination>.filter.eventName.match`| Optional. The comma-separated Spark event names; you can specify which events to collect. For example: `SparkListenerApplicationStart,SparkListenerApplicationEnd`|
|`spark.synapse.diagnostic.emitter.<destination>.filter.loggerName.match`| Optional. The comma-separated Log4j logger names; you can specify which logs to collect. For example: `org.apache.spark.SparkContext,org.example.Logger`|
|`spark.synapse.diagnostic.emitter.<destination>.filter.metricName.match`| Optional. The comma-separated Spark metric name suffixes; you can specify which metrics to collect. For example: `jvm.heap.used`|
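
Putting these settings together, a single destination named `MyDest1` (an arbitrary name, as in the `spark.synapse.diagnostic.emitters` example above) that sends driver logs and metrics to a Log Analytics workspace with the key stored in Key Vault might be configured as in this sketch:

```properties
# Sketch only: one diagnostic emitter destination, MyDest1, targeting Azure Log Analytics.
# Replace the <placeholders> with your workspace ID, Key Vault name, and secret name.
spark.synapse.diagnostic.emitters MyDest1
spark.synapse.diagnostic.emitter.MyDest1.type AzureLogAnalytics
spark.synapse.diagnostic.emitter.MyDest1.categories DriverLog,Metrics
spark.synapse.diagnostic.emitter.MyDest1.workspaceId <log-analytics-workspace-id>
spark.synapse.diagnostic.emitter.MyDest1.secret.keyVault <key-vault-name>
spark.synapse.diagnostic.emitter.MyDest1.secret.keyVault.secretName <log-analytics-key-secret-name>
```
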
## Related content
- [Run a Spark application in a notebook](./apache-spark-development-using-notebooks.md)