Commit df26499

[Feature] Handle schema attribute in databricks_pipeline (#4137)
## Changes

The new `schema` attribute was added to support direct publishing mode. Besides documentation, we needed to add a TF schema customization because the new attribute conflicts with the `target` attribute.

## Tests

- [x] `make test` run locally
- [x] relevant change in `docs/` folder
- [ ] covered with integration tests in `internal/acceptance`
- [x] relevant acceptance tests are passing
- [ ] using Go SDK
Parent: fa3c3de

3 files changed: +4 −2 lines

docs/resources/job.md

Lines changed: 0 additions & 1 deletion
```diff
@@ -372,7 +372,6 @@ This block describes the queue settings of the job:
 * `periodic` - (Optional) configuration block to define a trigger for Periodic Triggers consisting of the following attributes:
 * `interval` - (Required) Specifies the interval at which the job should run. This value is required.
 * `unit` - (Required) Options are {"DAYS", "HOURS", "WEEKS"}.
-
 * `file_arrival` - (Optional) configuration block to define a trigger for [File Arrival events](https://learn.microsoft.com/en-us/azure/databricks/workflows/jobs/file-arrival-triggers) consisting of following attributes:
 * `url` - (Required) URL to be monitored for file arrivals. The path must point to the root or a subpath of the external location. Please note that the URL must have a trailing slash character (`/`).
 * `min_time_between_triggers_seconds` - (Optional) If set, the trigger starts a run only after the specified amount of time passed since the last time the trigger fired. The minimum allowed value is 60 seconds.
```
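For context, here's a minimal sketch of how these trigger attributes look in a `databricks_job` configuration; the resource name, bucket URL, and surrounding job settings are illustrative, not part of this commit:

```hcl
resource "databricks_job" "example" {
  name = "file-arrival-job" # hypothetical job name

  trigger {
    # Fire a run when new files land under this external location path.
    file_arrival {
      url                               = "s3://my-bucket/landing/" # hypothetical URL; note the trailing slash
      min_time_between_triggers_seconds = 60                        # minimum allowed value
    }
  }
}
```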

docs/resources/pipeline.md

Lines changed: 2 additions & 1 deletion
```diff
@@ -80,7 +80,8 @@ The following arguments are supported:
 * `photon` - A flag indicating whether to use Photon engine. The default value is `false`.
 * `serverless` - An optional flag indicating if serverless compute should be used for this DLT pipeline. Requires `catalog` to be set, as it could be used only with Unity Catalog.
 * `catalog` - The name of catalog in Unity Catalog. *Change of this parameter forces recreation of the pipeline.* (Conflicts with `storage`).
-* `target` - The name of a database (in either the Hive metastore or in a UC catalog) for persisting pipeline output data. Configuring the target setting allows you to view and query the pipeline output data from the Databricks UI.
+* `target` - (Optional, String, Conflicts with `schema`) The name of a database (in either the Hive metastore or in a UC catalog) for persisting pipeline output data. Configuring the target setting allows you to view and query the pipeline output data from the Databricks UI.
+* `schema` - (Optional, String, Conflicts with `target`) The default schema (database) where tables are read from or published to. The presence of this attribute implies that the pipeline is in direct publishing mode.
 * `edition` - optional name of the [product edition](https://docs.databricks.com/data-engineering/delta-live-tables/delta-live-tables-concepts.html#editions). Supported values are: `CORE`, `PRO`, `ADVANCED` (default). Not required when `serverless` is set to `true`.
 * `channel` - optional name of the release channel for Spark version used by DLT pipeline. Supported values are: `CURRENT` (default) and `PREVIEW`.
 * `budget_policy_id` - optional string specifying ID of the budget policy for this DLT pipeline.
```
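To illustrate the new attribute, here's a minimal sketch of a pipeline in direct publishing mode; all names and paths are hypothetical. Note that `schema` is set instead of `target`:

```hcl
resource "databricks_pipeline" "example" {
  name    = "dpm-pipeline" # hypothetical pipeline name
  catalog = "main"         # hypothetical Unity Catalog name
  schema  = "sales"        # direct publishing mode; conflicts with `target`

  library {
    notebook {
      path = "/Pipelines/transform" # hypothetical notebook path
    }
  }
}
```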

pipelines/resource_pipeline.go

Lines changed: 2 additions & 0 deletions
```diff
@@ -246,6 +246,8 @@ func (Pipeline) CustomizeSchema(s *common.CustomizableSchema) *common.Customizab
 	s.SchemaPath("storage").SetConflictsWith([]string{"catalog"})
 	s.SchemaPath("catalog").SetConflictsWith([]string{"storage"})
 	s.SchemaPath("ingestion_definition", "connection_name").SetConflictsWith([]string{"ingestion_definition.0.ingestion_gateway_id"})
+	s.SchemaPath("target").SetConflictsWith([]string{"schema"})
+	s.SchemaPath("schema").SetConflictsWith([]string{"target"})
 
 	// MinItems fields
 	s.SchemaPath("library").SetMinItems(1)
```
