Commit 8ea96da

Merge pull request #176796 from v-lanjli/removepreview
remove preview
2 parents c3f937f + 66eefb4 commit 8ea96da

File tree: 3 files changed (+25, -25 lines)


articles/synapse-analytics/spark/apache-spark-azure-log-analytics.md

Lines changed: 13 additions & 13 deletions
````diff
@@ -1,5 +1,5 @@
 ---
-title: Monitor Apache Spark applications with Azure Log Analytics (preview)
+title: Monitor Apache Spark applications with Azure Log Analytics
 description: Learn how to enable the Synapse Studio connector for collecting and sending the Apache Spark application metrics and logs to your Log Analytics workspace.
 services: synapse-analytics
 author: jejiang
@@ -11,7 +11,7 @@ ms.subservice: spark
 ms.date: 03/25/2021
 ms.custom: references_regions
 ---
-# Monitor Apache Spark applications with Azure Log Analytics (preview)
+# Monitor Apache Spark applications with Azure Log Analytics
 
 In this tutorial, you learn how to enable the Synapse Studio connector that's built in to Log Analytics. You can then collect and send Apache Spark application metrics and logs to your [Log Analytics workspace](../../azure-monitor/logs/quick-create-workspace.md). Finally, you can use an Azure Monitor workbook to visualize the metrics and logs.
 
@@ -26,7 +26,7 @@ Consult one of the following resources to create this workspace:
 - [Create a workspace with Azure CLI](../../azure-monitor/logs/resource-manager-workspace.md)
 - [Create and configure a workspace in Azure Monitor by using PowerShell](../../azure-monitor/logs/powershell-workspace-configuration.md)
 
-### Step 2: Prepare a Apache Spark configuration file
+### Step 2: Prepare an Apache Spark configuration file
 
 Use any of the following options to prepare the file.
 
@@ -105,7 +105,7 @@ spark.synapse.logAnalytics.keyVault.linkedServiceName <LINKED_SERVICE_NAME>
 
 | Configuration name | Default value | Description |
 | ------------------ | ------------- | ----------- |
-| spark.synapse.logAnalytics.enabled | false | To enable the Log Analytics sink for the Spark applications, true. Otherwise, false. |
+| spark.synapse.logAnalytics.enabled | False | To enable the Log Analytics sink for the Spark applications, true. Otherwise, false. |
 | spark.synapse.logAnalytics.workspaceId | - | The destination Log Analytics workspace ID. |
 | spark.synapse.logAnalytics.secret | - | The destination Log Analytics workspace secret. |
 | spark.synapse.logAnalytics.keyVault.linkedServiceName | - | The Key Vault linked service name for the Log Analytics workspace ID and key. |
@@ -124,7 +124,7 @@ spark.synapse.logAnalytics.keyVault.linkedServiceName <LINKED_SERVICE_NAME>
 [uri_suffix]: ../../azure-monitor/logs/data-collector-api.md#request-uri
 
 
-### Step 3: Upload your Apache Spark configuration to a Apache Spark pool
+### Step 3: Upload your Apache Spark configuration to an Apache Spark pool
 You can upload the configuration file to your Azure Synapse Analytics Apache Spark pool. In Synapse Studio:
 
 1. Select **Manage** > **Apache Spark pools**.
@@ -140,13 +140,13 @@ You can upload the configuration file to your Azure Synapse Analytics Apache Spa
 >
 > All the Apache Spark applications submitted to the Apache Spark pool will use the configuration setting to push the Apache Spark application metrics and logs to your specified workspace.
 
-## Submit a Apache Spark application and view the logs and metrics
+## Submit an Apache Spark application and view the logs and metrics
 
 Here's how:
 
-1. Submit a Apache Spark application to the Apache Spark pool configured in the previous step. You can use any of the following ways to do so:
+1. Submit an Apache Spark application to the Apache Spark pool configured in the previous step. You can use any of the following ways to do so:
    - Run a notebook in Synapse Studio.
-   - In Synapse Studio, submit an Apache Spark batch job through a Apache Spark job definition.
+   - In Synapse Studio, submit an Apache Spark batch job through an Apache Spark job definition.
    - Run a pipeline that contains Apache Spark activity.
 
 1. Go to the specified Log Analytics workspace, and then view the application metrics and logs when the Apache Spark application starts to run.
@@ -240,20 +240,20 @@ Users can query to evaluate metrics and logs at a set frequency, and fire an ale
 
 After the Synapse workspace is created with [data exfiltration protection](../security/workspace-data-exfiltration-protection.md) enabled.
 
-when you want to enabled this feature, you need to create managed private endpoint connection requests to [Azure Monitor private link scopes (AMPLS)](../../azure-monitor/logs/private-link-security.md) in the workspace’s approved Azure AD tenants.
+When you want to enable this feature, you need to create managed private endpoint connection requests to [Azure Monitor private link scopes (AMPLS)](../../azure-monitor/logs/private-link-security.md) in the workspace’s approved Azure AD tenants.
 
 You can follow the steps below to create a managed private endpoint connection to Azure Monitor private link scopes (AMPLS):
 
-1. If there is no existing AMPLS, please follow [Azure Monitor Private Link connection setup](../../azure-monitor/logs/private-link-security.md) to create one.
+1. If there is no existing AMPLS, you can follow [Azure Monitor Private Link connection setup](../../azure-monitor/logs/private-link-security.md) to create one.
 2. Navigate to your AMPLS in the Azure portal, and on the **Azure Monitor Resources** page, click **Add** to add a connection to your Azure Log Analytics workspace.
-3. Navigate to **Synapse Studio > Manage > Managed private endpoints**, click **New** button, select **Azure Monitor Private Link Scopes** and **continue**.
+3. Navigate to **Synapse Studio > Manage > Managed private endpoints**, click the **New** button, select **Azure Monitor Private Link Scopes**, and then select **Continue**.
    > [!div class="mx-imgBorder"]
   > ![Create AMPLS managed private endpoint 1](./media/apache-spark-azure-log-analytics/create-ampls-private-endpoint-1.png)
-4. Choose your Azure Monitor Private Link Scope you just created, and click **Create** button.
+4. Choose the Azure Monitor Private Link Scope you created, and click the **Create** button.
   > [!div class="mx-imgBorder"]
  > ![Create AMPLS managed private endpoint 2](./media/apache-spark-azure-log-analytics/create-ampls-private-endpoint-2.png)
 5. Wait a few minutes for private endpoint provisioning.
-6. Navigate to your AMPLS in Azure portal again, on the **Private Endpoint connections** page, select the connection just provisioned and **Approve**.
+6. Navigate to your AMPLS in the Azure portal again, and on the **Private Endpoint connections** page, select the provisioned connection and select **Approve**.
 
 > [!NOTE]
 > - The AMPLS object has a number of limits you should consider when planning your Private Link setup. See [AMPLS limits](../../azure-monitor/logs/private-link-security.md) for a deeper review of these limits.
````
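Once the pool pushes telemetry, the collected records land in custom log tables in the workspace, which you can explore with a Kusto query. A minimal sketch, assuming the commonly documented custom table name `SparkLoggingEvent_CL` (verify the actual table names that appear in your own workspace):

```kusto
SparkLoggingEvent_CL
| where TimeGenerated > ago(24h)
| order by TimeGenerated desc
```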

articles/synapse-analytics/spark/azure-synapse-diagnostic-emitters-azure-eventhub.md

Lines changed: 6 additions & 6 deletions
````diff
@@ -1,5 +1,5 @@
 ---
-title: Collect your Apache Spark applications logs and metrics using Azure Event Hubs (preview)
+title: Collect your Apache Spark applications logs and metrics using Azure Event Hubs
 description: In this tutorial, you learn how to use the Synapse Apache Spark diagnostic emitter extension to emit Apache Spark applications’ logs, event logs and metrics to your Azure Event Hubs.
 services: synapse-analytics
 author: hrasheed-msft
@@ -11,11 +11,11 @@ ms.subservice: spark
 ms.date: 08/31/2021
 ---
 
-# Collect your Apache Spark applications logs and metrics using Azure Event Hubs (preview)
+# Collect your Apache Spark applications logs and metrics using Azure Event Hubs
 
 The Synapse Apache Spark diagnostic emitter extension is a library that enables the Apache Spark application to emit the logs, event logs, and metrics to one or more destinations, including Azure Log Analytics, Azure Storage, and Azure Event Hubs.
 
-In this tutorial, you learn how to use the Synapse Apache Spark diagnostic emitter extension to emit Apache Spark applications’ logs, event logs and metrics to your Azure Event Hubs.
+In this tutorial, you learn how to use the Synapse Apache Spark diagnostic emitter extension to emit Apache Spark applications’ logs, event logs, and metrics to your Azure Event Hubs.
 
 ## Collect logs and metrics to Azure Event Hubs
 
@@ -24,7 +24,7 @@ In this tutorial, you learn how to use the Synapse Apache Spark diagnostic emitt
 To collect diagnostic logs and metrics to Azure Event Hubs, you can use an existing Azure Event Hubs instance.
 Or if you don't have one, you can [create an event hub](../../event-hubs/event-hubs-create.md).
 
-### Step 2: Create a Apache Spark configuration file
+### Step 2: Create an Apache Spark configuration file
 
 Create a `diagnostic-emitter-azure-event-hub-conf.txt` and copy the following contents to the file. Or download a [sample template file](https://go.microsoft.com/fwlink/?linkid=2169375) for Apache Spark pool configuration.
 
@@ -36,7 +36,7 @@ spark.synapse.diagnostic.emitter.MyDestination1.secret <connection-string>
 ```
 
 Fill in the following parameters in the configuration file: `<connection-string>`.
-For more description of the parameters, please refer to [Azure EventHub configurations](#available-configurations)
+For more description of the parameters, you can refer to [Azure EventHub configurations](#available-configurations).
 
 ### Step 3: Upload the Apache Spark configuration file to the Apache Spark pool
 
@@ -91,7 +91,7 @@ Here is a sample log record in JSON format:
 
 ## Synapse workspace with data exfiltration protection enabled
 
-Azure Synapse Analytics workspaces support enabling data exfiltration protection for workspaces. With exfiltration protection, the logs and metrics can not be sent out to the destination endpoints directly. You can create corresponding [managed private endpoints](../../synapse-analytics/security/synapse-workspace-managed-private-endpoints.md) for different destination endpoints or [create IP firewall rules](../../synapse-analytics/security/synapse-workspace-ip-firewall.md) in this scenario.
+Azure Synapse Analytics workspaces support enabling data exfiltration protection for workspaces. With exfiltration protection, the logs and metrics cannot be sent out to the destination endpoints directly. You can create corresponding [managed private endpoints](../../synapse-analytics/security/synapse-workspace-managed-private-endpoints.md) for different destination endpoints or [create IP firewall rules](../../synapse-analytics/security/synapse-workspace-ip-firewall.md) in this scenario.
````
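For reference, a complete Event Hubs emitter configuration might look like the sketch below. The `MyDestination1` name and the `<connection-string>` placeholder come from the partial key visible in this diff (`spark.synapse.diagnostic.emitter.MyDestination1.secret`); the `.type` and `.categories` values are assumptions inferred from the emitter naming pattern and are not shown in this commit, so check them against the linked sample template file.

```
# Hypothetical full emitter config; verify key values against the sample template.
spark.synapse.diagnostic.emitters MyDestination1
spark.synapse.diagnostic.emitter.MyDestination1.type AzureEventHub
spark.synapse.diagnostic.emitter.MyDestination1.categories Log,EventLog,Metrics
spark.synapse.diagnostic.emitter.MyDestination1.secret <connection-string>
```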

articles/synapse-analytics/spark/azure-synapse-diagnostic-emitters-azure-storage.md

Lines changed: 6 additions & 6 deletions
````diff
@@ -1,5 +1,5 @@
 ---
-title: Collect your Apache Spark applications logs and metrics using Azure Storage account(preview)
+title: Collect your Apache Spark applications logs and metrics using Azure Storage account
 description: This article shows how to use the Synapse Spark diagnostic emitter extension to collect logs, event logs, and metrics, and learn how to integrate the Grafana dashboards.
 services: synapse-analytics
 author: hrasheed-msft
@@ -11,11 +11,11 @@ ms.subservice: spark
 ms.date: 08/31/2021
 ---
 
-# Collect your Apache Spark applications logs and metrics using Azure Storage account(preview)
+# Collect your Apache Spark applications logs and metrics using Azure Storage account
 
 The Synapse Apache Spark diagnostic emitter extension is a library that enables the Apache Spark application to emit the logs, event logs, and metrics to one or more destinations, including Azure Log Analytics, Azure Storage, and Azure Event Hubs.
 
-In this tutorial, you learn how to use the Synapse Apache Spark diagnostic emitter extension to emit Apache Spark applications’ logs, event logs and metrics to your Azure storage account.
+In this tutorial, you learn how to use the Synapse Apache Spark diagnostic emitter extension to emit Apache Spark applications’ logs, event logs, and metrics to your Azure storage account.
 
 ## Collect logs and metrics to storage account
 
@@ -37,7 +37,7 @@ spark.synapse.diagnostic.emitter.MyDestination1.secret <storage-access-key>
 ```
 
 Fill in the following parameters in the configuration file: `<my-blob-storage>`, `<container-name>`, `<folder-name>`, `<storage-access-key>`.
-For more description of the parameters, please refer to [Azure Storage configurations](#available-configurations)
+For more description of the parameters, you can refer to [Azure Storage configurations](#available-configurations).
 
 ### Step 3: Upload the Apache Spark configuration file to the Spark pool
 
@@ -50,7 +50,7 @@ For more description of the parameters, please refer to [Azure Storage configura
 
 After you submit a job to the configured Apache Spark pool, you should be able to see the logs and metrics files in the destination storage account.
 The logs will be placed in corresponding paths according to different applications by `<workspaceName>.<sparkPoolName>.<livySessionId>`.
-All the logs files will be in JSON lines format (also called newline-delimited JSON, ndjson), which is very convenient for data processing.
+All the log files will be in JSON lines format (also called newline-delimited JSON, ndjson), which is convenient for data processing.
 
 ## Available configurations
 
@@ -106,6 +106,6 @@ Azure Synapse Analytics workspaces support enabling data exfiltration protection
 > [!div class="mx-imgBorder"]
 > ![Create managed private endpoint 2](./media/azure-synapse-diagnostic-emitters-azure-storage/create-private-endpoint-2.png)
 3. Wait a few minutes for private endpoint provisioning.
-4. Navigate to your storage account in Azure portal, on the **Networking** > **Private Endpoint connections** page, select the connection just provisioned and **Approve**.
+4. Navigate to your storage account in the Azure portal, and on the **Networking** > **Private Endpoint connections** page, select the provisioned connection and select **Approve**.
````
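For reference, a complete storage emitter configuration might look like the sketch below. The `MyDestination1` name and the `<my-blob-storage>`, `<container-name>`, `<folder-name>`, and `<storage-access-key>` placeholders come from the diff itself; the `.type`, `.uri`, `.auth`, and `.categories` values are assumptions inferred from the emitter naming pattern and are not shown in this commit, so check them against the article's sample template.

```
# Hypothetical full emitter config; verify key values against the sample template.
spark.synapse.diagnostic.emitters MyDestination1
spark.synapse.diagnostic.emitter.MyDestination1.type AzureStorage
spark.synapse.diagnostic.emitter.MyDestination1.categories Log,EventLog,Metrics
spark.synapse.diagnostic.emitter.MyDestination1.uri https://<my-blob-storage>.blob.core.windows.net/<container-name>/<folder-name>
spark.synapse.diagnostic.emitter.MyDestination1.auth AccessKey
spark.synapse.diagnostic.emitter.MyDestination1.secret <storage-access-key>
```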

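Because each ndjson record is a standalone JSON object on its own line, the emitted files are easy to post-process without loading them whole. A small illustrative sketch (the field names below are hypothetical, not the emitter's actual schema):

```python
import json

# Hypothetical sample shaped like JSON-lines output: one JSON object per line.
ndjson = "\n".join([
    '{"timestamp": "2021-08-31T10:00:00Z", "category": "Log", "message": "job started"}',
    '{"timestamp": "2021-08-31T10:00:05Z", "category": "Metrics", "value": 42}',
])

# Each line parses independently, so large files can be streamed line by line.
records = [json.loads(line) for line in ndjson.splitlines() if line.strip()]
log_records = [r for r in records if r["category"] == "Log"]

print(len(records), log_records[0]["message"])  # → 2 job started
```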