# [Feature] Improve support for new fields in `databricks_pipeline` (#4744)
## Changes
The new Go SDK brought the new fields `root_path` and `library.glob`, so this PR adds documentation for them and also makes `library.file.path`, `library.glob.include`, and `library.notebook.path` required.
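
For context, a minimal sketch of how the new arguments fit together (the resource name and paths are illustrative; the repo reference follows the existing doc example):

```hcl
resource "databricks_pipeline" "this" {
  name = "DLT Demo Pipeline"

  # New field: root directory when editing the pipeline in the UI,
  # also added to sys.path when executing Python sources.
  root_path = "/Workspace/Repos/dlt-demo"

  library {
    # New unified way to include source code; `include` is now required.
    glob {
      include = "${databricks_repo.dlt_demo.path}/subfolder/**"
    }
  }
}
```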
## Tests
- [x] `make test` run locally
- [x] relevant change in `docs/` folder
- [ ] covered with integration tests in `internal/acceptance`
- [ ] using Go SDK
- [ ] using TF Plugin Framework
**`NEXT_CHANGELOG.md`**
### New Features and Improvements

* Support configuration of file events in `databricks_external_location` ([#4749](https://github.com/databricks/terraform-provider-databricks/pull/4749)).
* Improve support for new fields in `databricks_pipeline` ([#4744](https://github.com/databricks/terraform-provider-databricks/pull/4744)).
**`docs/resources/pipeline.md`**

From the updated example usage (the surrounding resource block is truncated in the diff):

```hcl
  library {
    glob {
      include = "${databricks_repo.dlt_demo.path}/subfolder/**"
    }
  }

  continuous = false

  notification {
    # …
  }
```
The following arguments are supported:
* `name` - A user-friendly name for this pipeline. The name can be used to identify pipeline jobs in the UI.
* `storage` - A location on DBFS or cloud storage where output data and metadata required for pipeline execution are stored. By default, tables are stored in a subdirectory of this location. *Changing this parameter forces recreation of the pipeline.* (Conflicts with `catalog`.)
* `configuration` - An optional list of values to apply to the entire pipeline. Elements must be formatted as key:value pairs.
* `library` blocks - Specifies pipeline code.
* `root_path` - An optional string specifying the root path for this pipeline. It is used as the root directory when editing the pipeline in the Databricks user interface, and it is added to `sys.path` when executing Python sources during pipeline execution.
* `cluster` blocks - [Clusters](cluster.md) to run the pipeline. If none is specified, pipelines automatically select a default cluster configuration. *Please note that DLT pipeline clusters support only a subset of attributes, as described in the [documentation](https://docs.databricks.com/api/workspace/pipelines/create#clusters).* Also note that the `autoscale` block is extended with a `mode` parameter that controls the autoscaling algorithm (possible values are `ENHANCED` for the new, enhanced autoscaling algorithm, or `LEGACY` for the old one); see the sketch after this list.
* `continuous` - A flag indicating whether to run the pipeline continuously. The default value is `false`.
* `development` - A flag indicating whether to run the pipeline in development mode. The default value is `false`.
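
A hedged sketch of a `cluster` block using the enhanced autoscaling mode described above (the label and worker counts are illustrative):

```hcl
  cluster {
    label = "default"

    autoscale {
      min_workers = 1
      max_workers = 5
      mode        = "ENHANCED" # "LEGACY" selects the old algorithm
    }
  }
```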
### event_log block

* `catalog` - (Optional, defaults to the `catalog` defined on the pipeline level) The UC catalog the event log is published under.
* `schema` - (Optional, defaults to the `schema` defined on the pipeline level) The UC schema the event log is published under.
### library block

Contains one of the following blocks (see the sketch after this list):

* `notebook` - Specifies the path to a Databricks notebook to include as source. The actual path is given as the `path` attribute inside the block.
* `file` - Specifies the path to a file in the Databricks workspace to include as source. The actual path is given as the `path` attribute inside the block.
* `glob` - The unified field to include source code. Each entry should have an `include` attribute that can specify a notebook path, a file path, or a folder path ending with `/**` (to include everything from that folder). This field cannot be used together with `notebook` or `file`.
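
A hedged sketch of the three variants (paths are illustrative; each `library` block contains exactly one of them):

```hcl
  library {
    notebook {
      path = "/Workspace/Users/someone@example.com/dlt_notebook"
    }
  }

  library {
    file {
      path = "/Workspace/Users/someone@example.com/sources/helpers.py"
    }
  }

  library {
    glob {
      include = "/Workspace/Users/someone@example.com/sources/**"
    }
  }
```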
### notification block

DLT allows specifying one or more `notification` blocks to get notifications about the pipeline's execution. This block consists of the following attributes:
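
The attribute list itself is truncated in this diff. A hedged sketch based on the provider's schema (the `email_recipients` and `alerts` attribute names and the alert values are assumptions, not shown in the diff):

```hcl
  notification {
    email_recipients = ["someone@example.com"]

    alerts = [
      "on-update-failure",
      "on-update-fatal-failure",
    ]
  }
```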
### ingestion_definition block

The configuration for a managed ingestion pipeline:

* `objects` - Required. Settings specifying tables to replicate and the destination for the replicated tables.
* `table_configuration` - Configuration settings to control the ingestion of tables. These settings are applied to all tables in the pipeline.
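
A minimal, hedged sketch of such a configuration; the `connection_name` attribute and the inner structure of `objects` and `table_configuration` are assumptions based on the managed ingestion schema rather than content shown in this diff:

```hcl
  ingestion_definition {
    connection_name = "sql_server_connection" # assumed attribute; illustrative value

    objects {
      table {
        # assumed nested attributes; illustrative values
        source_schema       = "dbo"
        source_table        = "customers"
        destination_catalog = "main"
        destination_schema  = "ingested"
      }
    }

    table_configuration {
      scd_type = "SCD_TYPE_1" # assumed setting; applied to all tables in the pipeline
    }
  }
```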
## Attribute Reference

In addition to all arguments above, the following attributes are exported:
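
The exported attribute list is truncated in this diff; assuming the resource's standard `id` and `url` exports, a downstream reference might look like:

```hcl
output "pipeline_url" {
  value = databricks_pipeline.this.url
}
```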