Commit eb4fe2a

Added source parameter for spark_python_task in databricks_job (#2157)
1 parent 5fc4be9

2 files changed (+6, -2 lines)

docs/resources/job.md

Lines changed: 5 additions & 2 deletions
@@ -143,6 +143,7 @@ Each entry in `webhook_notification` block takes a list `webhook` blocks. The fi
 Note that the `id` is not to be confused with the name of the alert destination. The `id` can be retrieved through the API or the URL of Databricks UI `https://<workspace host>/sql/destinations/<notification id>?o=<workspace id>`
 
 Example
+
 ```hcl
 webhook_notifications {
   on_failure {
@@ -170,13 +171,14 @@ You can invoke Spark submit tasks only on new clusters. **In the `new_cluster` s
 
 ### spark_python_task Configuration Block
 
-* `python_file` - (Required) The URI of the Python file to be executed. [databricks_dbfs_file](dbfs_file.md#path), cloud file URIs (e.g. `s3:/`, `abfss:/`, `gs:/`) and workspace paths are supported. For python files stored in the Databricks workspace, the path must be absolute and begin with `/Repos`. This field is required.
+* `python_file` - (Required) The URI of the Python file to be executed. [databricks_dbfs_file](dbfs_file.md#path), cloud file URIs (e.g. `s3:/`, `abfss:/`, `gs:/`), workspace paths and remote repository are supported. For Python files stored in the Databricks workspace, the path must be absolute and begin with `/Repos`. For files stored in a remote repository, the path must be relative. This field is required.
+* `source` - (Optional) Location type of the Python file, can only be `GIT`. When set to `GIT`, the Python file will be retrieved from a Git repository defined in `git_source`.
 * `parameters` - (Optional) (List) Command line parameters passed to the Python file.
 
 ### notebook_task Configuration Block
 
 * `notebook_path` - (Required) The path of the [databricks_notebook](notebook.md#path) to be run in the Databricks workspace or remote repository. For notebooks stored in the Databricks workspace, the path must be absolute and begin with a slash. For notebooks stored in a remote repository, the path must be relative. This field is required.
-* `source` - (Optional) Location type of the notebook, can only be `WORKSPACE` or `GIT`. When set to `WORKSPACE`, the notebook will be retrieved from the local Databricks workspace. When set to `GIT`, the notebook will be retrieved from a Git repository defined in git_source. If the value is empty, the task will use `GIT` if `git_source` is defined and `WORKSPACE` otherwise.
+* `source` - (Optional) Location type of the notebook, can only be `WORKSPACE` or `GIT`. When set to `WORKSPACE`, the notebook will be retrieved from the local Databricks workspace. When set to `GIT`, the notebook will be retrieved from a Git repository defined in `git_source`. If the value is empty, the task will use `GIT` if `git_source` is defined and `WORKSPACE` otherwise.
 * `base_parameters` - (Optional) (Map) Base parameters to be used for each run of this job. If the run is initiated by a call to run-now with parameters specified, the two parameters maps will be merged. If the same key is specified in base_parameters and in run-now, the value from run-now will be used. If the notebook takes a parameter that is not specified in the job’s base_parameters or the run-now override parameters, the default value from the notebook will be used. Retrieve these parameters in a notebook using `dbutils.widgets.get`.
 
 ### pipeline_task Configuration Block
@@ -214,6 +216,7 @@ One of the `query`, `dashboard` or `alert` needs to be provided.
 * `alert` - (Optional) block consisting of single string field: `alert_id` - identifier of the Databricks SQL Alert.
 
 Example
+
 ```hcl
 resource "databricks_job" "sql_aggregation_job" {
   name = "Example SQL Job"

jobs/resource_job.go

Lines changed: 1 addition & 0 deletions
@@ -31,6 +31,7 @@ type NotebookTask struct {
 // SparkPythonTask contains the information for python jobs
 type SparkPythonTask struct {
 	PythonFile string   `json:"python_file"`
+	Source     string   `json:"source,omitempty" tf:"suppress_diff"`
 	Parameters []string `json:"parameters,omitempty"`
 }
 
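
The new `Source` field maps to the `source` attribute documented above. The `tf:"suppress_diff"` struct tag is, as used elsewhere in this provider's schema generation, presumably there so that Terraform does not report a spurious plan change when the Jobs API returns a server-filled default for an unset `source`.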
