`articles/synapse-analytics/quickstart-transform-data-using-spark-job-definition.md` (4 additions, 5 deletions)
```diff
@@ -1,12 +1,12 @@
 ---
-title: "Quickstart: Transform data using Apache Spark job definition"
+title: 'Quickstart: Transform data using Apache Spark job definition'
 description: This tutorial provides step-by-step instructions for using Azure Synapse Analytics to transform data with Apache Spark job definition.
 author: juluczni
 ms.author: juluczni
 ms.reviewer: makromer
 ms.service: azure-synapse-analytics
 ms.subservice: pipeline
-ms.topic: conceptual
+ms.topic: quickstart
 ms.date: 02/15/2022
 ---
@@ -20,7 +20,6 @@ In this quickstart, you'll use Azure Synapse Analytics to create a pipeline usin
 * **Azure Synapse workspace**: Create a Synapse workspace using the Azure portal following the instructions in [Quickstart: Create a Synapse workspace](quickstart-create-workspace.md).
 * **Apache Spark job definition**: Create an Apache Spark job definition in the Synapse workspace following the instructions in [Tutorial: Create Apache Spark job definition in Synapse Studio](spark/apache-spark-job-definitions.md).
 
-
 ### Navigate to the Synapse Studio
 
 After your Azure Synapse workspace is created, you have two ways to open Synapse Studio:
@@ -93,7 +92,7 @@ On this panel, you can reference to the Spark job definition to run.
 |Max executors| Max number of executors to be allocated in the specified Spark pool for the job.|
 |Driver size| Number of cores and memory to be used for driver given in the specified Apache Spark pool for the job.|
 |Spark configuration| Specify values for Spark configuration properties listed in the topic: Spark Configuration - Application properties. Users can use default configuration and customized configuration. |
 * You can add dynamic content by clicking the **Add Dynamic Content** button or by pressing the shortcut key <kbd>Alt</kbd>+<kbd>Shift</kbd>+<kbd>D</kbd>. In the **Add Dynamic Content** page, you can use any combination of expressions, functions, and system variables to add to dynamic content.
@@ -106,7 +105,7 @@ You can add properties for Apache Spark job definition activity in this panel.
```
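The **Spark configuration** setting mentioned in the diff refers to standard Apache Spark application properties. As an illustrative sketch (these keys are real Spark properties, but the values are hypothetical examples, not taken from the article), a customized configuration might set:

```properties
# Driver and executor memory for the job
spark.driver.memory       4g
spark.executor.memory     4g
# Number of shuffle partitions for SQL workloads
spark.sql.shuffle.partitions  200
```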
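The **Add Dynamic Content** panel mentioned above accepts the pipeline expression language shared by Azure Data Factory and Synapse pipelines. A couple of illustrative expressions (the parameter name `inputPath` is a hypothetical example):

```
@pipeline().parameters.inputPath
@concat('output/', utcnow('yyyy-MM-dd'))
```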