
Commit 067517f

Merge pull request #294313 from v-lanjunli/updagediagnostic
update
2 parents 3b600ca + 2d2c825 commit 067517f

File tree

1 file changed: +50 −0 lines


articles/synapse-analytics/spark/apache-spark-azure-log-analytics.md

Lines changed: 50 additions & 0 deletions
@@ -44,6 +44,16 @@ spark.synapse.logAnalytics.workspaceId <LOG_ANALYTICS_WORKSPACE_ID>
spark.synapse.logAnalytics.secret <LOG_ANALYTICS_WORKSPACE_KEY>
```

Alternatively, use the following properties:

```properties
spark.synapse.diagnostic.emitters: LA
spark.synapse.diagnostic.emitter.LA.type: "AzureLogAnalytics"
spark.synapse.diagnostic.emitter.LA.categories: "Log,EventLog,Metrics"
spark.synapse.diagnostic.emitter.LA.workspaceId: <LOG_ANALYTICS_WORKSPACE_ID>
spark.synapse.diagnostic.emitter.LA.secret: <LOG_ANALYTICS_WORKSPACE_KEY>
```

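The emitter-style settings above all follow the naming pattern `spark.synapse.diagnostic.emitter.<destination>.*`. As an illustrative sketch only (the helper function is my own, not a Synapse API; the property keys come from the block above), the same settings can be generated programmatically:

```python
# Illustrative sketch: build the emitter-style Spark properties shown above
# for one destination. The function name is hypothetical; only the property
# keys and values are taken from the documented configuration.
def emitter_properties(dest: str, workspace_id: str, secret: str) -> dict:
    prefix = f"spark.synapse.diagnostic.emitter.{dest}"
    return {
        "spark.synapse.diagnostic.emitters": dest,
        f"{prefix}.type": "AzureLogAnalytics",
        f"{prefix}.categories": "Log,EventLog,Metrics",
        f"{prefix}.workspaceId": workspace_id,
        f"{prefix}.secret": secret,
    }

props = emitter_properties("LA", "<LOG_ANALYTICS_WORKSPACE_ID>",
                           "<LOG_ANALYTICS_WORKSPACE_KEY>")
for key, value in props.items():
    print(f"{key}: {value}")
```

This makes the pattern explicit: one shared `emitters` list, then a namespaced group of keys per destination name.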
#### Option 2: Configure with Azure Key Vault

> [!NOTE]
@@ -71,6 +81,17 @@ spark.synapse.logAnalytics.keyVault.name <AZURE_KEY_VAULT_NAME>
spark.synapse.logAnalytics.keyVault.key.secret <AZURE_KEY_VAULT_SECRET_KEY_NAME>
```

Alternatively, use the following properties:

```properties
spark.synapse.diagnostic.emitters: LA
spark.synapse.diagnostic.emitter.LA.type: "AzureLogAnalytics"
spark.synapse.diagnostic.emitter.LA.categories: "Log,EventLog,Metrics"
spark.synapse.diagnostic.emitter.LA.workspaceId: <LOG_ANALYTICS_WORKSPACE_ID>
spark.synapse.diagnostic.emitter.LA.secret.keyVault: <AZURE_KEY_VAULT_NAME>
spark.synapse.diagnostic.emitter.LA.secret.keyVault.secretName: <AZURE_KEY_VAULT_SECRET_KEY_NAME>
```

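These `properties` blocks are simple `key: value` lines. A minimal sketch of parsing such a block into a dictionary (illustrative only; this parser is not a Synapse API, and real property files may have edge cases it ignores):

```python
def parse_properties(text: str) -> dict:
    """Parse `key: value` lines into a dict, skipping blank lines.

    Splits on the first colon only and strips surrounding double quotes
    from values, so `type: "AzureLogAnalytics"` yields AzureLogAnalytics.
    """
    props = {}
    for line in text.splitlines():
        line = line.strip()
        if not line:
            continue
        key, _, value = line.partition(":")
        props[key.strip()] = value.strip().strip('"')
    return props

sample = """
spark.synapse.diagnostic.emitters: LA
spark.synapse.diagnostic.emitter.LA.type: "AzureLogAnalytics"
spark.synapse.diagnostic.emitter.LA.secret.keyVault: <AZURE_KEY_VAULT_NAME>
"""
conf = parse_properties(sample)
print(conf["spark.synapse.diagnostic.emitter.LA.type"])  # AzureLogAnalytics
```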
> [!NOTE]
> You can also store the workspace ID in Key Vault. Refer to the preceding steps, and store the workspace ID with the secret name `SparkLogAnalyticsWorkspaceId`. Alternatively, you can use the configuration `spark.synapse.logAnalytics.keyVault.key.workspaceId` to specify the workspace ID secret name in Key Vault.
@@ -97,10 +118,23 @@ To configure a Key Vault linked service in Synapse Studio to store the workspace
```properties
spark.synapse.logAnalytics.enabled true
spark.synapse.logAnalytics.workspaceId <LOG_ANALYTICS_WORKSPACE_ID>
spark.synapse.logAnalytics.keyVault.name <AZURE_KEY_VAULT_NAME>
spark.synapse.logAnalytics.keyVault.key.secret <AZURE_KEY_VAULT_SECRET_KEY_NAME>
spark.synapse.logAnalytics.keyVault.linkedServiceName <LINKED_SERVICE_NAME>
```

Alternatively, use the following properties:

```properties
spark.synapse.diagnostic.emitters: LA
spark.synapse.diagnostic.emitter.LA.type: "AzureLogAnalytics"
spark.synapse.diagnostic.emitter.LA.categories: "Log,EventLog,Metrics"
spark.synapse.diagnostic.emitter.LA.workspaceId: <LOG_ANALYTICS_WORKSPACE_ID>
spark.synapse.diagnostic.emitter.LA.secret.keyVault: <AZURE_KEY_VAULT_NAME>
spark.synapse.diagnostic.emitter.LA.secret.keyVault.secretName: <AZURE_KEY_VAULT_SECRET_KEY_NAME>
spark.synapse.diagnostic.emitter.LA.secret.keyVault.linkedService: <AZURE_KEY_VAULT_LINKED_SERVICE>
```

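Across the three options, a destination resolves its secret in exactly one way: either a direct `.secret` value, or a Key Vault reference (`.secret.keyVault` plus `.secret.keyVault.secretName`, optionally with a linked service for pipeline runs). A hedged sketch of that decision rule (the function is my own illustration, not shipped code):

```python
def secret_mode(props: dict, dest: str) -> str:
    """Return how a destination resolves its secret: 'direct' or 'keyvault'.

    Mirrors the documented rule: a direct `.secret` wins; otherwise
    `.secret.keyVault` requires a companion `.secret.keyVault.secretName`.
    """
    prefix = f"spark.synapse.diagnostic.emitter.{dest}.secret"
    if prefix in props:
        return "direct"
    if f"{prefix}.keyVault" in props:
        if f"{prefix}.keyVault.secretName" not in props:
            raise ValueError(".secret.keyVault requires .secret.keyVault.secretName")
        return "keyvault"
    raise ValueError(f"no secret configured for destination {dest}")

kv_props = {
    "spark.synapse.diagnostic.emitter.LA.secret.keyVault": "<AZURE_KEY_VAULT_NAME>",
    "spark.synapse.diagnostic.emitter.LA.secret.keyVault.secretName": "<AZURE_KEY_VAULT_SECRET_KEY_NAME>",
}
print(secret_mode(kv_props, "LA"))  # keyvault
```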
For a list of Apache Spark configurations, see [Available Apache Spark configurations](../monitor-synapse-analytics-reference.md#available-apache-spark-configurations).

### Step 3: Create an Apache Spark Configuration
@@ -245,6 +279,22 @@ You can follow below steps to create a managed private endpoint connection to Az
> - The AMPLS object has a number of limits you should consider when planning your Private Link setup. See [AMPLS limits](/azure/azure-monitor/logs/private-link-security) for a deeper review of these limits.
> - Check that you have the [right permissions](../security/synapse-workspace-access-control-overview.md) to create a managed private endpoint.

## Available configurations

| Configuration | Description |
| --- | --- |
| `spark.synapse.diagnostic.emitters` | Required. The comma-separated destination names of diagnostic emitters. For example, `MyDest1,MyDest2`. |
| `spark.synapse.diagnostic.emitter.<destination>.type` | Required. Built-in destination type. To enable the Azure Log Analytics destination, this field must include `AzureLogAnalytics`. |
| `spark.synapse.diagnostic.emitter.<destination>.categories` | Optional. The comma-separated list of selected log categories. Available values include `DriverLog`, `ExecutorLog`, `EventLog`, and `Metrics`. If not set, the default is **all** categories. |
| `spark.synapse.diagnostic.emitter.<destination>.workspaceId` | Required. To enable the Azure Log Analytics destination, this field must include the workspace ID. |
| `spark.synapse.diagnostic.emitter.<destination>.secret` | Optional. The secret (Log Analytics key) content. To find this, in the Azure portal, go to your Azure Log Analytics workspace > Agents > Primary key. |
| `spark.synapse.diagnostic.emitter.<destination>.secret.keyVault` | Required if `.secret` isn't specified. The name of the [Azure Key Vault](/azure/key-vault/general/overview) where the secret (AccessKey or SAS) is stored. |
| `spark.synapse.diagnostic.emitter.<destination>.secret.keyVault.secretName` | Required if `.secret.keyVault` is specified. The Azure Key Vault secret name where the secret is stored. |
| `spark.synapse.diagnostic.emitter.<destination>.secret.keyVault.linkedService` | Optional. The Azure Key Vault linked service name. When enabled in a Synapse pipeline, this is necessary to obtain the secret from Key Vault. (Make sure the managed identity has read permission on the Key Vault.) |
| `spark.synapse.diagnostic.emitter.<destination>.filter.eventName.match` | Optional. The comma-separated Spark event names; specifies which events to collect. For example: `SparkListenerApplicationStart,SparkListenerApplicationEnd`. |
| `spark.synapse.diagnostic.emitter.<destination>.filter.loggerName.match` | Optional. The comma-separated Log4j logger names; specifies which logs to collect. For example: `org.apache.spark.SparkContext,org.example.Logger`. |
| `spark.synapse.diagnostic.emitter.<destination>.filter.metricName.match` | Optional. The comma-separated Spark metric name suffixes; specifies which metrics to collect. For example: `jvm.heap.used`. |
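The Required/Optional rules in the table can be summarized as a small configuration check. This is an illustrative sketch under stated assumptions (a simplified reading of the table; the function and its error messages are my own, and it treats `.type` as exactly `AzureLogAnalytics`):

```python
def validate_emitter_conf(props: dict) -> list:
    """Return a list of problems per the Required rules in the table above.

    Simplified sketch: checks `emitters`, per-destination `.type`,
    `.workspaceId`, and that either `.secret` or `.secret.keyVault` is set.
    """
    errors = []
    emitters = props.get("spark.synapse.diagnostic.emitters")
    if not emitters:
        return ["spark.synapse.diagnostic.emitters is required"]
    for dest in emitters.split(","):
        p = f"spark.synapse.diagnostic.emitter.{dest.strip()}"
        if props.get(f"{p}.type") != "AzureLogAnalytics":
            errors.append(f"{p}.type must include AzureLogAnalytics")
        if f"{p}.workspaceId" not in props:
            errors.append(f"{p}.workspaceId is required")
        if f"{p}.secret" not in props and f"{p}.secret.keyVault" not in props:
            errors.append(f"{p}.secret or {p}.secret.keyVault is required")
    return errors
```

For example, a configuration missing `workspaceId` would come back with one error entry, while the complete blocks shown earlier in this article would validate cleanly.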
## Related content

- [Run a Spark application in notebook](./apache-spark-development-using-notebooks.md).
