
Commit 1fb3125

pietern and claude authored
Bump Terraform provider to v1.98.0 (#4082)
## Changes

See https://github.com/databricks/terraform-provider-databricks/releases/tag/v1.98.0

The main behavioral change in Terraform provider v1.98.0 is that **pipeline catalog updates no longer trigger recreation**. The provider relaxed the `force_new` constraint on the `catalog` attribute, allowing in-place updates via `PUT` instead of requiring `DELETE` + `POST`.

This affects how bundles handle pipeline catalog changes:

- Changing a pipeline's catalog now performs an update operation instead of a recreate
- The `prevent_destroy` lifecycle no longer blocks catalog changes (since no recreation occurs)
- Tests were updated to use `storage` location changes (which still trigger recreation) to properly exercise `prevent_destroy` behavior

## Why

Nominal upgrade to match the SDK version.

---------

Co-authored-by: Claude <[email protected]>
1 parent 1557af9 commit 1fb3125
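To make the behavioral change above concrete, here is a minimal, hypothetical bundle configuration (not part of this commit; the resource and catalog names are made up) showing which pipeline attribute edits are now applied in place and which still force recreation under provider v1.98.0:

```yaml
# Hypothetical example for illustration only.
resources:
  pipelines:
    my_pipeline:
      name: my pipeline
      # With Terraform provider v1.98.0+, editing this value is planned as an
      # in-place update (PUT) rather than a recreate (DELETE + POST), so a
      # prevent_destroy lifecycle setting no longer blocks it.
      catalog: main
      # Switching to (or changing) a storage location, by contrast, still
      # forces recreation and is therefore still blocked by prevent_destroy.
      # storage: "dbfs:/my-storage"
```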

File tree

53 files changed (+398 additions, -361 deletions)


NEXT_CHANGELOG.md

Lines changed: 4 additions & 2 deletions
@@ -8,10 +8,12 @@
 ### CLI
 * Introduce `databricks apps logs` command to tail app logs from the CLI ([#3908](https://github.com/databricks/cli/pull/3908))
 
-### Dependency updates
-
 ### Bundles
 * Add support for alerts to DABs ([#4004](https://github.com/databricks/cli/pull/4004))
 * Allow `file://` URIs in job libraries to reference runtime filesystem paths (e.g., JARs pre-installed on clusters via init scripts). These paths are no longer treated as local files to upload. ([#3884](https://github.com/databricks/cli/pull/3884))
+* Pipeline catalog changes now trigger in-place updates instead of recreation (Terraform provider v1.98.0 behavior change) ([#4082](https://github.com/databricks/cli/pull/4082))
+
+### Dependency updates
+* Bump Terraform provider to v1.98.0 ([#4082](https://github.com/databricks/cli/pull/4082))
 
 ### API Changes

acceptance/bin/edit_resource.py

Lines changed: 1 addition & 1 deletion
@@ -50,7 +50,7 @@ def main():
     my_locals = {"r": data}
 
     try:
-        exec(script, locals=my_locals)
+        exec(script, my_locals)
     except Exception:
         pprint.pprint(my_locals)
         raise

acceptance/bundle/lifecycle/prevent-destroy/script

Lines changed: 4 additions & 2 deletions
@@ -5,8 +5,10 @@ trace $CLI bundle deploy
 trace errcode $CLI bundle plan
 trace musterr $CLI bundle destroy --auto-approve >>out.$DATABRICKS_BUNDLE_ENGINE.txt 2>&1
 
-# Changing the catalog name, deploy must fail because pipeline will be recreated
-update_file.py resources/pipeline.yml 'catalog: main' 'catalog: mainnew'
+# Note: In Terraform provider v1.98.0+, changing the catalog no longer triggers recreation.
+# We use storage location instead, which still requires recreation.
+# Changing from catalog to storage, deploy must fail because pipeline will be recreated
+update_file.py resources/pipeline.yml ' catalog: main' ' storage: "dbfs:/my-storage"'
 trace errcode $CLI bundle plan >>out.$DATABRICKS_BUNDLE_ENGINE.txt 2>&1
 trace musterr $CLI bundle deploy >>out.$DATABRICKS_BUNDLE_ENGINE.txt 2>&1
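For context, the fixture edited by `update_file.py` above, `resources/pipeline.yml`, is not shown in this diff; it presumably contains a pipeline definition roughly like the sketch below, where the script swaps the `catalog` line for a `storage` line so that the plan becomes a recreate and `prevent_destroy` is exercised:

```yaml
# Assumed shape of resources/pipeline.yml before the edit (not part of this diff).
resources:
  pipelines:
    my_pipeline:
      name: test pipeline
      catalog: main   # the script replaces this line with: storage: "dbfs:/my-storage"
```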

acceptance/bundle/migrate/default-python/out.state_original.json

Lines changed: 1 addition & 0 deletions
@@ -313,6 +313,7 @@
       "trigger": [
         {
           "file_arrival": [],
+          "model": [],
           "pause_status": "PAUSED",
           "periodic": [
             {

acceptance/bundle/migrate/runas/out.create_requests.json

Lines changed: 2 additions & 2 deletions
@@ -1,7 +1,7 @@
 {
   "headers": {
     "User-Agent": [
-      "databricks-tf-provider/1.97.0 databricks-sdk-go/[SDK_VERSION] go/1.24.0 os/[OS] cli/[DEV_VERSION] terraform/1.5.5 sdk/sdkv2 resource/pipeline auth/pat"
+      "databricks-tf-provider/1.98.0 databricks-sdk-go/[SDK_VERSION] go/1.24.0 os/[OS] cli/[DEV_VERSION] terraform/1.5.5 sdk/sdkv2 resource/pipeline auth/pat"
     ]
   },
   "method": "POST",
@@ -32,7 +32,7 @@
 {
   "headers": {
     "User-Agent": [
-      "databricks-tf-provider/1.97.0 databricks-sdk-go/[SDK_VERSION] go/1.24.0 os/[OS] cli/[DEV_VERSION] terraform/1.5.5 sdk/sdkv2 resource/permissions auth/pat"
+      "databricks-tf-provider/1.98.0 databricks-sdk-go/[SDK_VERSION] go/1.24.0 os/[OS] cli/[DEV_VERSION] terraform/1.5.5 sdk/sdkv2 resource/permissions auth/pat"
     ]
   },
   "method": "PUT",

acceptance/bundle/resource_deps/pipelines_recreate/databricks.yml

Lines changed: 1 addition & 1 deletion
@@ -2,7 +2,7 @@ resources:
   pipelines:
     foo:
       name: pipeline foo
-      catalog: mycatalog
+      storage: dbfs:/my-storage
   jobs:
     bar:
       name: job bar

acceptance/bundle/resource_deps/pipelines_recreate/out.create.requests.json

Lines changed: 2 additions & 2 deletions
@@ -2,14 +2,14 @@
   "method": "POST",
   "path": "/api/2.0/pipelines",
   "body": {
-    "catalog": "mycatalog",
     "channel": "CURRENT",
     "deployment": {
       "kind": "BUNDLE",
       "metadata_file_path": "/Workspace/Users/[USERNAME]/.bundle/test-bundle/default/state/metadata.json"
     },
     "edition": "ADVANCED",
-    "name": "pipeline foo"
+    "name": "pipeline foo",
+    "storage": "dbfs:/my-storage"
   }
 }
 {

acceptance/bundle/resource_deps/pipelines_recreate/out.plan_create.direct.json

Lines changed: 2 additions & 2 deletions
@@ -32,14 +32,14 @@
   "action": "create",
   "new_state": {
     "value": {
-      "catalog": "mycatalog",
       "channel": "CURRENT",
       "deployment": {
         "kind": "BUNDLE",
         "metadata_file_path": "/Workspace/Users/[USERNAME]/.bundle/test-bundle/default/state/metadata.json"
       },
       "edition": "ADVANCED",
-      "name": "pipeline foo"
+      "name": "pipeline foo",
+      "storage": "dbfs:/my-storage"
     }
   }
 }

acceptance/bundle/resource_deps/pipelines_recreate/out.plan_noop.direct.json

Lines changed: 2 additions & 2 deletions
@@ -57,15 +57,15 @@
   "pipeline_id": "[UUID]",
   "run_as_user_name": "[USERNAME]",
   "spec": {
-    "catalog": "mynewcatalog",
     "channel": "CURRENT",
     "deployment": {
       "kind": "BUNDLE",
       "metadata_file_path": "/Workspace/Users/[USERNAME]/.bundle/test-bundle/default/state/metadata.json"
     },
     "edition": "ADVANCED",
     "id": "[UUID]",
-    "name": "pipeline foo"
+    "name": "pipeline foo",
+    "storage": "dbfs:/my-new-storage"
   },
   "state": "IDLE"
 }

acceptance/bundle/resource_deps/pipelines_recreate/out.plan_update.direct.json

Lines changed: 5 additions & 5 deletions
@@ -78,22 +78,22 @@
   "action": "recreate",
   "new_state": {
     "value": {
-      "catalog": "mynewcatalog",
       "channel": "CURRENT",
       "deployment": {
         "kind": "BUNDLE",
         "metadata_file_path": "/Workspace/Users/[USERNAME]/.bundle/test-bundle/default/state/metadata.json"
       },
       "edition": "ADVANCED",
-      "name": "pipeline foo"
+      "name": "pipeline foo",
+      "storage": "dbfs:/my-new-storage"
     }
   },
   "changes": {
     "local": {
-      "catalog": {
+      "storage": {
         "action": "recreate",
-        "old": "mycatalog",
-        "new": "mynewcatalog"
+        "old": "dbfs:/my-storage",
+        "new": "dbfs:/my-new-storage"
       }
     }
   }
