The Azure Databricks Job Activity in a [pipeline](concepts-pipelines-activities.md) runs serverless Databricks jobs in your Azure Databricks workspace. This article builds on the [data transformation activities](transform-data.md) article, which presents a general overview of data transformation and the supported transformation activities. Azure Databricks is a managed platform for running Apache Spark.
1. Search for _Job_ in the pipeline Activities pane, and drag a Job activity to the pipeline canvas.
1. Select the new Job activity on the canvas if it isn't already selected.
1. Select the **Azure Databricks** tab to select or create a new Azure Databricks linked service that executes the Job activity.
1. Select the **Settings** tab and specify the job to be executed on Azure Databricks, optional base parameters to be passed to the job, and any other libraries to be installed on the cluster to execute the job.
## Databricks Job activity definition
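A minimal sketch of a Databricks Job activity definition is shown below. The activity name, linked service reference name, job ID, and parameter names are placeholders, and the exact shape of `jobParameters` should be verified against your workspace's job definition:

```json
{
    "activity": {
        "name": "MyDatabricksJobActivity",
        "description": "Runs a serverless Databricks job",
        "type": "DatabricksJob",
        "linkedServiceName": {
            "referenceName": "MyAzureDatabricksLinkedService",
            "type": "LinkedServiceReference"
        },
        "typeProperties": {
            "jobId": "1122334455",
            "jobParameters": [
                {
                    "name": "inputPath",
                    "value": "/data/input"
                }
            ]
        }
    }
}
```

The table below describes the properties used in this definition.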
|Property|Description|Required|
|---|---|---|
|description|Text describing what the activity does.|No|
|type|For Databricks Job Activity, the activity type is DatabricksJob.|Yes|
|linkedServiceName|Name of the Databricks linked service on which the Databricks job runs. To learn about this linked service, see the [Compute linked services](compute-linked-services.md) article.|Yes|
|jobId|The ID of the job to be run in the Databricks Workspace.|Yes|
|jobParameters|An array of key-value pairs. Job parameters can be set for each activity run. If the job takes a parameter that isn't specified, the default value defined in the job is used. Find more about parameters in [Databricks Jobs](https://docs.databricks.com/api/latest/jobs.html#jobsparampair).|No|
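For illustration, a `jobParameters` array that overrides one parameter might look like the fragment below. The parameter name and value are placeholders; the parameter name must match one defined on the Databricks job itself:

```json
"jobParameters": [
    {
        "name": "run_date",
        "value": "2024-01-01"
    }
]
```

Any job parameter left out of this array falls back to the default value defined on the job.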