
Commit fc00007

update
1 parent f3626a3 commit fc00007

2 files changed: +10 -10 lines changed


articles/synapse-analytics/spark/apache-spark-azure-log-analytics.md

Lines changed: 1 addition & 1 deletion
@@ -288,7 +288,7 @@ You can follow below steps to create a managed private endpoint connection to Az
  | `spark.synapse.diagnostic.emitter.<destination>.type` | Required. Built-in destination type. To enable Azure Log Analytics destination, AzureLogAnalytics needs to be included in this field.|
  | `spark.synapse.diagnostic.emitter.<destination>.categories` | Optional. The comma-separated selected log categories. Available values include `DriverLog`, `ExecutorLog`, `EventLog`, `Metrics`. If not set, the default value is **all** categories. |
  | `spark.synapse.diagnostic.emitter.<destination>.workspaceId` | Required. To enable Azure Log Analytics destination, workspaceId needs to be included in this field. |
- | `spark.synapse.diagnostic.emitter.<destination>.secret` | Optional. The secret (Log Aanalytics key) content. To find this, in the Azure portal, go to Azure Log Analytics workspace > Agents > Primary key. |
+ | `spark.synapse.diagnostic.emitter.<destination>.secret` | Optional. The secret (Log Analytics key) content. To find this, in the Azure portal, go to Azure Log Analytics workspace > Agents > Primary key. |
  | `spark.synapse.diagnostic.emitter.<destination>.secret.keyVault` | Required if `.secret` is not specified. The [Azure Key vault](/azure/key-vault/general/overview) name where the secret (AccessKey or SAS) is stored. |
  | `spark.synapse.diagnostic.emitter.<destination>.secret.keyVault.secretName` | Required if `.secret.keyVault` is specified. The Azure Key vault secret name where the secret is stored. |
  | `spark.synapse.diagnostic.emitter.<destination>.secret.keyVault.linkedService` | Optional. The Azure Key vault linked service name. When enabled in Synapse pipeline, this is necessary to obtain the secret from Azure Key vault. (Make sure the MSI has read access to the Azure Key vault). |
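For orientation, the properties in this table combine into an Apache Spark pool configuration along the following lines. This is a minimal sketch for illustration only, not part of this commit: the destination name `LogAnalytics1` and the placeholder values are made up, and the top-level `spark.synapse.diagnostic.emitters` list is assumed from the rest of the article.

```
spark.synapse.diagnostic.emitters LogAnalytics1
spark.synapse.diagnostic.emitter.LogAnalytics1.type AzureLogAnalytics
spark.synapse.diagnostic.emitter.LogAnalytics1.categories DriverLog,ExecutorLog,EventLog,Metrics
spark.synapse.diagnostic.emitter.LogAnalytics1.workspaceId <log-analytics-workspace-id>
spark.synapse.diagnostic.emitter.LogAnalytics1.secret <log-analytics-workspace-key>
```

Omitting `categories` emits all categories by default, and the `.secret.keyVault` plus `.secret.keyVault.secretName` pair can replace `.secret` when the key is kept in Azure Key Vault.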

articles/synapse-analytics/spark/azure-synapse-diagnostic-emitters-azure-eventhub.md

Lines changed: 9 additions & 9 deletions
@@ -1,6 +1,6 @@
  ---
  title: Collect your Apache Spark applications logs and metrics using Azure Event Hubs
- description: In this tutorial, you learn how to use the Synapse Apache Spark diagnostic emitter extension to emit Apache Spark applications logs, event logs and metrics to your Azure Event Hubs.
+ description: In this tutorial, you learn how to use the Synapse Apache Spark diagnostic emitter extension to emit Apache Spark applications' logs, event logs, and metrics to your Azure Event Hubs.
  author: hrasheed-msft
  ms.author: jejiang

@@ -14,7 +14,7 @@ ms.date: 08/31/2021

  The Synapse Apache Spark diagnostic emitter extension is a library that enables the Apache Spark application to emit the logs, event logs, and metrics to one or more destinations, including Azure Log Analytics, Azure Storage, and Azure Event Hubs.

- In this tutorial, you learn how to use the Synapse Apache Spark diagnostic emitter extension to emit Apache Spark applications logs, event logs, and metrics to your Azure Event Hubs.
+ In this tutorial, you learn how to use the Synapse Apache Spark diagnostic emitter extension to emit Apache Spark applications' logs, event logs, and metrics to your Azure Event Hubs.

  ## Collect logs and metrics to Azure Event Hubs

@@ -35,7 +35,7 @@ spark.synapse.diagnostic.emitter.MyDestination1.secret <connection-string>
  ```

  Fill in the following parameters in the configuration file: `<connection-string>`.
- For more description of the parameters, you can refer to [Azure Event Hubs configurations](#available-configurations).
+ For more descriptions of the parameters, you can refer to [Azure Event Hubs configurations](#available-configurations).

  ### Step 3: Upload the Apache Spark configuration file to Apache Spark pool

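For reference, a completed configuration file ready to upload might look like the following. This is a sketch only: the namespace, key name, key value, and Event Hubs entity name are placeholders rather than values from this commit, and the `spark.synapse.diagnostic.emitters` list is assumed from Step 2 of the article.

```
spark.synapse.diagnostic.emitters MyDestination1
spark.synapse.diagnostic.emitter.MyDestination1.type AzureEventHub
spark.synapse.diagnostic.emitter.MyDestination1.secret Endpoint=sb://<my-namespace>.servicebus.windows.net/;SharedAccessKeyName=<key-name>;SharedAccessKey=<key-value>;EntityPath=<my-event-hub>
```

The `categories` and `filter.*` properties are optional; when they are omitted, all log categories are collected.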
@@ -51,21 +51,21 @@ For more description of the parameters, you can refer to [Azure Event Hubs confi
  | `spark.synapse.diagnostic.emitter.<destination>.type` | Required. Built-in destination type. To enable Azure Event Hubs destination, the value should be `AzureEventHub`. |
  | `spark.synapse.diagnostic.emitter.<destination>.categories` | Optional. The comma-separated selected log categories. Available values include `DriverLog`, `ExecutorLog`, `EventLog`, `Metrics`. If not set, the default value is **all** categories. |
  | `spark.synapse.diagnostic.emitter.<destination>.secret` | Optional. The Azure Event Hubs instance connection string. This field should match this pattern `Endpoint=sb://<FQDN>/;SharedAccessKeyName=<KeyName>;SharedAccessKey=<KeyValue>;EntityPath=<PathName>` |
- | `spark.synapse.diagnostic.emitter.<destination>.secret.keyVault` | Required if `.secret` is not specified. The [Azure Key vault](/azure/key-vault/general/overview) name where the secret (connection string) is stored. |
+ | `spark.synapse.diagnostic.emitter.<destination>.secret.keyVault` | Required if `.secret` isn't specified. The [Azure Key vault](/azure/key-vault/general/overview) (AKV) name where the secret (connection string) is stored. |
  | `spark.synapse.diagnostic.emitter.<destination>.secret.keyVault.secretName` | Required if `.secret.keyVault` is specified. The Azure Key vault secret name where the secret (connection string) is stored. |
- | `spark.synapse.diagnostic.emitter.<destination>.secret.keyVault.linkedService` | Optional. The Azure Key vault linked service name. When enabled in Synapse pipeline, this is necessary to obtain the secret from AKV. (Please make sure MSI has read permission on the AKV). |
+ | `spark.synapse.diagnostic.emitter.<destination>.secret.keyVault.linkedService` | Optional. The Azure Key vault linked service name. When enabled in Synapse pipeline, this is required to obtain the secret from AKV. (Make sure managed service identity (MSI) has read permission on the AKV). |
  | `spark.synapse.diagnostic.emitter.<destination>.filter.eventName.match` | Optional. The comma-separated spark event names, you can specify which events to collect. For example: `SparkListenerApplicationStart,SparkListenerApplicationEnd` |
- | `spark.synapse.diagnostic.emitter.<destination>.filter.loggerName.match` | Optional. The comma-separated log4j logger names, you can specify which logs to collect. For example: `org.apache.spark.SparkContext,org.example.Logger` |
+ | `spark.synapse.diagnostic.emitter.<destination>.filter.loggerName.match` | Optional. The comma-separated Log4j logger names, you can specify which logs to collect. For example: `org.apache.spark.SparkContext,org.example.Logger` |
  | `spark.synapse.diagnostic.emitter.<destination>.filter.metricName.match` | Optional. The comma-separated spark metric name suffixes, you can specify which metrics to collect. For example: `jvm.heap.used` |
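As a sketch of the Key Vault route described in the rows above (illustrative only; the vault name `MyKeyVault`, secret name `EventHubConnectionString`, and linked service name `MyKeyVaultLinkedService` are hypothetical), a configuration that avoids an inline secret might look like this:

```
spark.synapse.diagnostic.emitters MyDestination1
spark.synapse.diagnostic.emitter.MyDestination1.type AzureEventHub
spark.synapse.diagnostic.emitter.MyDestination1.secret.keyVault MyKeyVault
spark.synapse.diagnostic.emitter.MyDestination1.secret.keyVault.secretName EventHubConnectionString
spark.synapse.diagnostic.emitter.MyDestination1.secret.keyVault.linkedService MyKeyVaultLinkedService
```

Because `.secret` is omitted here, `.secret.keyVault` and `.secret.keyVault.secretName` become required; the linked service entry matters only when the workload runs from a Synapse pipeline, and the MSI needs read permission on the vault.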


  > [!NOTE]
  >
- > The Azure Eventhub instance connection string should always contains the `EntityPath`, which is the name of the Azure Event Hubs instance.
+ > The Azure Event hubs instance connection string should always contain `EntityPath`, which is the name of the Azure Event Hubs instance.

  ## Log data sample

- Here is a sample log record in JSON format:
+ Here's a sample log record in JSON format:

  ```json
  {
@@ -91,4 +91,4 @@ Here is a sample log record in JSON format:

  ## Synapse workspace with data exfiltration protection enabled

- Azure Synapse Analytics workspaces support enabling data exfiltration protection for workspaces. With exfiltration protection, the logs and metrics cannot be sent out to the destination endpoints directly. You can create corresponding [managed private endpoints](../../synapse-analytics/security/synapse-workspace-managed-private-endpoints.md) for different destination endpoints or [create IP firewall rules](../../synapse-analytics/security/synapse-workspace-ip-firewall.md) in this scenario.
+ Azure Synapse Analytics workspaces support enabling data exfiltration protection for workspaces. With exfiltration protection, the logs and metrics can't be sent out to the destination endpoints directly. You can create corresponding [managed private endpoints](../../synapse-analytics/security/synapse-workspace-managed-private-endpoints.md) for different destination endpoints or [create IP firewall rules](../../synapse-analytics/security/synapse-workspace-ip-firewall.md) in this scenario.
