Commit 8ae24ac

[Feature] Improve support for new fields in databricks_pipeline (#4744)
## Changes

The new Go SDK brought the new fields `root_path` and `library.glob`, so documentation was added for them, and `library.file.path`, `library.glob.include`, and `library.notebook.path` were made required.

## Tests

- [x] `make test` run locally
- [x] relevant change in `docs/` folder
- [ ] covered with integration tests in `internal/acceptance`
- [ ] using Go SDK
- [ ] using TF Plugin Framework
1 parent e2fad79 commit 8ae24ac

File tree

3 files changed: +22 −3 lines changed

NEXT_CHANGELOG.md

Lines changed: 1 addition & 0 deletions
@@ -7,6 +7,7 @@
 ### New Features and Improvements
 
 * Support configuration of file events in `databricks_external_location` [#4749](https://github.com/databricks/terraform-provider-databricks/pull/4749).
+* Improve support for new fields in `databricks_pipeline` [#4744](https://github.com/databricks/terraform-provider-databricks/pull/4744).
 
 ### Bug Fixes

docs/resources/pipeline.md

Lines changed: 16 additions & 3 deletions
@@ -54,6 +54,12 @@ resource "databricks_pipeline" "this" {
     }
   }
 
+  library {
+    glob {
+      include = "${databricks_repo.dlt_demo.path}/subfolder/**"
+    }
+  }
+
   continuous = false
 
   notification {

@@ -75,7 +81,8 @@ The following arguments are supported:
 * `name` - A user-friendly name for this pipeline. The name can be used to identify pipeline jobs in the UI.
 * `storage` - A location on DBFS or cloud storage where output data and metadata required for pipeline execution are stored. By default, tables are stored in a subdirectory of this location. *Change of this parameter forces recreation of the pipeline.* (Conflicts with `catalog`).
 * `configuration` - An optional list of values to apply to the entire pipeline. Elements must be formatted as key:value pairs.
-* `library` blocks - Specifies pipeline code and required artifacts. Syntax resembles [library](cluster.md#library-configuration-block) configuration block with the addition of a special `notebook` & `file` library types that should have the `path` attribute. *Right now only the `notebook` & `file` types are supported.*
+* `library` blocks - Specifies pipeline code.
+* `root_path` - An optional string specifying the root path for this pipeline. This is used as the root directory when editing the pipeline in the Databricks user interface and it is added to `sys.path` when executing Python sources during pipeline execution.
 * `cluster` blocks - [Clusters](cluster.md) to run the pipeline. If none is specified, pipelines will automatically select a default cluster configuration for the pipeline. *Please note that DLT pipeline clusters are supporting only subset of attributes as described in [documentation](https://docs.databricks.com/api/workspace/pipelines/create#clusters).* Also, note that `autoscale` block is extended with the `mode` parameter that controls the autoscaling algorithm (possible values are `ENHANCED` for new, enhanced autoscaling algorithm, or `LEGACY` for old algorithm).
 * `continuous` - A flag indicating whether to run the pipeline continuously. The default value is `false`.
 * `development` - A flag indicating whether to run the pipeline in development mode. The default value is `false`.

@@ -104,6 +111,14 @@ The following arguments are supported:
 * `catalog` - (Optional, default to `catalog` defined on pipeline level) The UC catalog the event log is published under.
 * `schema` - (Optional, default to `schema` defined on pipeline level) The UC schema the event log is published under.
 
+### library block
+
+Contains one of the blocks:
+
+* `notebook` - specifies path to a Databricks Notebook to include as source. Actual path is specified as `path` attribute inside the block.
+* `file` - specifies path to a file in Databricks Workspace to include as source. Actual path is specified as `path` attribute inside the block.
+* `glob` - The unified field to include source code. Each entry should have the `include` attribute that can specify a notebook path, a file path, or a folder path that ends `/**` (to include everything from that folder). This field cannot be used together with `notebook` or `file`.
+
 ### notification block
 
 DLT allows to specify one or more notification blocks to get notifications about pipeline's execution. This block consists of following attributes:

@@ -124,8 +139,6 @@ The configuration for a managed ingestion pipeline. These settings cannot be use
 * `objects` - Required. Settings specifying tables to replicate and the destination for the replicated tables.
 * `table_configuration` - Configuration settings to control the ingestion of tables. These settings are applied to all tables in the pipeline.
 
-
-
 ## Attribute Reference
 
 In addition to all arguments above, the following attributes are exported:
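The documented `root_path` and `library.glob` fields combine naturally in a single resource. A minimal sketch of their intended use (the resource name and workspace paths below are illustrative, not taken from this commit):

```hcl
resource "databricks_pipeline" "glob_demo" {
  name = "glob-demo" # illustrative name

  # root_path is used as the root directory when editing the pipeline in
  # the UI, and is added to sys.path when executing Python sources.
  root_path = "/Workspace/Repos/dlt_demo"

  # glob is the unified way to include source code; `include` is required
  # and cannot be combined with `notebook` or `file` libraries.
  library {
    glob {
      include = "/Workspace/Repos/dlt_demo/subfolder/**"
    }
  }

  continuous = false
}
```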

pipelines/resource_pipeline.go

Lines changed: 5 additions & 0 deletions
@@ -212,6 +212,11 @@ func (Pipeline) CustomizeSchema(s *common.CustomizableSchema) *common.CustomizableSchema {
 	s.SchemaPath("ingestion_definition", "connection_name").SetForceNew()
 	s.SchemaPath("ingestion_definition", "ingestion_gateway_id").SetForceNew()
 
+	// Required fields
+	s.SchemaPath("library", "glob", "include").SetRequired()
+	s.SchemaPath("library", "notebook", "path").SetRequired()
+	s.SchemaPath("library", "file", "path").SetRequired()
+
 	// Computed fields
 	s.SchemaPath("cluster", "node_type_id").SetComputed()
 	s.SchemaPath("cluster", "driver_node_type_id").SetComputed()
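With the nested `path` and `include` attributes marked required in the schema, omitting them is caught during `terraform validate`/`plan` rather than surfacing as an API error. A hypothetical configuration that the provider would now reject:

```hcl
resource "databricks_pipeline" "invalid" {
  name = "missing-path"

  library {
    # error: the argument "path" is required in a file block
    file {}
  }
}
```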
