Commit c3ae6f3

[Doc] Document budget_policy_id in databricks_pipeline and databricks_job (#4110)
## Changes

`databricks_pipeline` already has this change; for `databricks_job` we need to merge Go SDK 0.49.0.

## Tests

- [ ] `make test` run locally
- [x] relevant change in `docs/` folder
- [ ] covered with integration tests in `internal/acceptance`
- [ ] relevant acceptance tests are passing
- [ ] using Go SDK
1 parent 2bbf251 commit c3ae6f3

File tree

2 files changed: +2 −0 lines changed


docs/resources/job.md

Lines changed: 1 addition & 0 deletions
@@ -107,6 +107,7 @@ The resource supports the following arguments:
 * `notification_settings` - (Optional) An optional block controlling the notification settings on the job level [documented below](#notification_settings-configuration-block).
 * `health` - (Optional) An optional block that specifies the health conditions for the job [documented below](#health-configuration-block).
 * `tags` - (Optional) An optional map of the tags associated with the job. See [tags Configuration Map](#tags-configuration-map)
+* `budget_policy_id` - (Optional) The ID of the user-specified budget policy to use for this job. If not specified, a default budget policy may be applied when creating or modifying the job.

 ### task Configuration Block

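As a sketch of how the new `databricks_job` argument could be used (the resource name, policy ID, notebook path, and cluster settings below are illustrative, not from this commit):

```hcl
resource "databricks_job" "this" {
  name = "example-job"

  # Hypothetical budget policy ID; if omitted, a default budget
  # policy may be applied when creating or modifying the job.
  budget_policy_id = "0123-456789-abcdef"

  task {
    task_key = "main"

    notebook_task {
      notebook_path = "/Shared/example"
    }

    new_cluster {
      num_workers   = 1
      spark_version = "14.3.x-scala2.12"
      node_type_id  = "i3.xlarge"
    }
  }
}
```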
docs/resources/pipeline.md

Lines changed: 1 addition & 0 deletions
@@ -83,6 +83,7 @@ The following arguments are supported:
 * `target` - The name of a database (in either the Hive metastore or in a UC catalog) for persisting pipeline output data. Configuring the target setting allows you to view and query the pipeline output data from the Databricks UI.
 * `edition` - optional name of the [product edition](https://docs.databricks.com/data-engineering/delta-live-tables/delta-live-tables-concepts.html#editions). Supported values are: `CORE`, `PRO`, `ADVANCED` (default). Not required when `serverless` is set to `true`.
 * `channel` - optional name of the release channel for Spark version used by DLT pipeline. Supported values are: `CURRENT` (default) and `PREVIEW`.
+* `budget_policy_id` - optional string specifying ID of the budget policy for this DLT pipeline.
 * `allow_duplicate_names` - Optional boolean flag. If false, deployment will fail if name conflicts with that of another pipeline. default is `false`.
 * `deployment` - Deployment type of this pipeline. Supports following attributes:
 * `kind` - The deployment method that manages the pipeline.
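Likewise, a minimal sketch of a `databricks_pipeline` using the argument (the names, policy ID, catalog, and notebook path are illustrative assumptions):

```hcl
resource "databricks_pipeline" "this" {
  name       = "example-pipeline"
  serverless = true
  catalog    = "main"
  target     = "example_schema"

  # Hypothetical budget policy ID for this DLT pipeline.
  budget_policy_id = "0123-456789-abcdef"

  library {
    notebook {
      path = "/Shared/dlt-example"
    }
  }
}
```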

0 commit comments
