
Commit e5a4da5

Added support for lifecycle prevent_destroy option (#3448)
## Changes

Added support for the lifecycle prevent_destroy option. In Terraform, if lifecycle.prevent_destroy is set to true, `terraform plan` fails with a corresponding error. Because of that, we can't reuse the same code logic for both the direct and Terraform deployment modes; the direct implementation mimics the Terraform behavior.

A. When `lifecycle.prevent_destroy` is set to true:
1. `bundle destroy` fails.
2. If any configuration change would recreate the resource, `bundle deploy` fails.

B. When `lifecycle.prevent_destroy` is switched to false, destroy and deploy succeed.

C. When the whole resource is removed from the bundle configuration, destroy and deploy succeed.

## Why

Allows marking resources as non-destructible, which Terraform already supports.

## Tests

Added acceptance tests.
Parent: 9429f2a · Commit: e5a4da5
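For quick reference, here is a minimal sketch of a bundle resource opting into the new option, modeled on the resources/schema.yml file added in this commit (the catalog and schema names are illustrative):

```yaml
# Minimal sketch modeled on resources/schema.yml from this commit;
# catalog and schema names are illustrative.
resources:
  schemas:
    my_schema:
      catalog_name: "some-catalog"
      name: some-schema
      lifecycle:
        # With prevent_destroy enabled, `bundle destroy` fails, and
        # `bundle deploy` fails if a change would recreate the resource.
        prevent_destroy: true
```

To make the resource destructible again, switch prevent_destroy to false or remove the resource block entirely — both paths are exercised by the acceptance test script below.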


69 files changed: +1013 −37 lines (only a subset of the changed files is shown below)
Lines changed: 5 additions & 0 deletions

@@ -0,0 +1,5 @@
bundle:
  name: prevent-destroy

include:
  - resources/*.yml
Lines changed: 67 additions & 0 deletions

@@ -0,0 +1,67 @@

>>> musterr [CLI] bundle destroy --auto-approve
Error: resource my_pipelines has lifecycle.prevent_destroy set, but the plan calls for this resource to be recreated or destroyed. To avoid this error, disable lifecycle.prevent_destroy for pipelines.my_pipelines
resource my_schema has lifecycle.prevent_destroy set, but the plan calls for this resource to be recreated or destroyed. To avoid this error, disable lifecycle.prevent_destroy for schemas.my_schema


Exit code (musterr): 1

>>> errcode [CLI] bundle plan
recreate pipelines.my_pipelines

>>> musterr [CLI] bundle deploy
Uploading bundle files to /Workspace/Users/[USERNAME]/.bundle/prevent-destroy/default/files...
Error: resource my_pipelines has lifecycle.prevent_destroy set, but the plan calls for this resource to be recreated or destroyed. To avoid this error, disable lifecycle.prevent_destroy for pipelines.my_pipelines


Exit code (musterr): 1

>>> errcode [CLI] bundle plan
recreate pipelines.my_pipelines
recreate schemas.my_schema

>>> musterr [CLI] bundle deploy
Uploading bundle files to /Workspace/Users/[USERNAME]/.bundle/prevent-destroy/default/files...
Error: resource my_pipelines has lifecycle.prevent_destroy set, but the plan calls for this resource to be recreated or destroyed. To avoid this error, disable lifecycle.prevent_destroy for pipelines.my_pipelines
resource my_schema has lifecycle.prevent_destroy set, but the plan calls for this resource to be recreated or destroyed. To avoid this error, disable lifecycle.prevent_destroy for schemas.my_schema


Exit code (musterr): 1

>>> errcode [CLI] bundle plan
recreate pipelines.my_pipelines
recreate schemas.my_schema

>>> [CLI] bundle deploy --auto-approve
Uploading bundle files to /Workspace/Users/[USERNAME]/.bundle/prevent-destroy/default/files...

This action will result in the deletion or recreation of the following UC schemas. Any underlying data may be lost:
  recreate schema my_schema

This action will result in the deletion or recreation of the following Lakeflow Declarative Pipelines along with the
Streaming Tables (STs) and Materialized Views (MVs) managed by them. Recreating the pipelines will
restore the defined STs and MVs through full refresh. Note that recreation is necessary when pipeline
properties such as the 'catalog' or 'storage' are changed:
  recreate pipeline my_pipelines
Deploying resources...
Updating deployment state...
Deployment complete!

>>> errcode [CLI] bundle plan
delete pipelines.my_pipelines
delete schemas.my_schema

>>> [CLI] bundle deploy --auto-approve
Uploading bundle files to /Workspace/Users/[USERNAME]/.bundle/prevent-destroy/default/files...

This action will result in the deletion or recreation of the following UC schemas. Any underlying data may be lost:
  delete schema my_schema

This action will result in the deletion or recreation of the following Lakeflow Declarative Pipelines along with the
Streaming Tables (STs) and Materialized Views (MVs) managed by them. Recreating the pipelines will
restore the defined STs and MVs through full refresh. Note that recreation is necessary when pipeline
properties such as the 'catalog' or 'storage' are changed:
  delete pipeline my_pipelines
Deploying resources...
Updating deployment state...
Deployment complete!
Lines changed: 155 additions & 0 deletions

@@ -0,0 +1,155 @@

>>> musterr [CLI] bundle destroy --auto-approve
Error: exit status 1

Error: Instance cannot be destroyed

  on bundle.tf.json line 15, in resource.databricks_pipeline:
  15: "my_pipelines": {

Resource databricks_pipeline.my_pipelines has lifecycle.prevent_destroy set,
but the plan calls for this resource to be destroyed. To avoid this error and
continue with the plan, either disable lifecycle.prevent_destroy or reduce
the scope of the plan using the -target flag.

Error: Instance cannot be destroyed

  on bundle.tf.json line 38, in resource.databricks_schema:
  38: "my_schema": {

Resource databricks_schema.my_schema has lifecycle.prevent_destroy set, but
the plan calls for this resource to be destroyed. To avoid this error and
continue with the plan, either disable lifecycle.prevent_destroy or reduce
the scope of the plan using the -target flag.



Exit code (musterr): 1

>>> errcode [CLI] bundle plan
Error: exit status 1

Error: Instance cannot be destroyed

  on bundle.tf.json line 15, in resource.databricks_pipeline:
  15: "my_pipelines": {

Resource databricks_pipeline.my_pipelines has lifecycle.prevent_destroy set,
but the plan calls for this resource to be destroyed. To avoid this error and
continue with the plan, either disable lifecycle.prevent_destroy or reduce
the scope of the plan using the -target flag.



Exit code: 1

>>> musterr [CLI] bundle deploy
Uploading bundle files to /Workspace/Users/[USERNAME]/.bundle/prevent-destroy/default/files...
Error: exit status 1

Error: Instance cannot be destroyed

  on bundle.tf.json line 15, in resource.databricks_pipeline:
  15: "my_pipelines": {

Resource databricks_pipeline.my_pipelines has lifecycle.prevent_destroy set,
but the plan calls for this resource to be destroyed. To avoid this error and
continue with the plan, either disable lifecycle.prevent_destroy or reduce
the scope of the plan using the -target flag.



Exit code (musterr): 1

>>> errcode [CLI] bundle plan
Error: exit status 1

Error: Instance cannot be destroyed

  on bundle.tf.json line 15, in resource.databricks_pipeline:
  15: "my_pipelines": {

Resource databricks_pipeline.my_pipelines has lifecycle.prevent_destroy set,
but the plan calls for this resource to be destroyed. To avoid this error and
continue with the plan, either disable lifecycle.prevent_destroy or reduce
the scope of the plan using the -target flag.

Error: Instance cannot be destroyed

  on bundle.tf.json line 38, in resource.databricks_schema:
  38: "my_schema": {

Resource databricks_schema.my_schema has lifecycle.prevent_destroy set, but
the plan calls for this resource to be destroyed. To avoid this error and
continue with the plan, either disable lifecycle.prevent_destroy or reduce
the scope of the plan using the -target flag.



Exit code: 1

>>> musterr [CLI] bundle deploy
Uploading bundle files to /Workspace/Users/[USERNAME]/.bundle/prevent-destroy/default/files...
Error: exit status 1

Error: Instance cannot be destroyed

  on bundle.tf.json line 15, in resource.databricks_pipeline:
  15: "my_pipelines": {

Resource databricks_pipeline.my_pipelines has lifecycle.prevent_destroy set,
but the plan calls for this resource to be destroyed. To avoid this error and
continue with the plan, either disable lifecycle.prevent_destroy or reduce
the scope of the plan using the -target flag.

Error: Instance cannot be destroyed

  on bundle.tf.json line 38, in resource.databricks_schema:
  38: "my_schema": {

Resource databricks_schema.my_schema has lifecycle.prevent_destroy set, but
the plan calls for this resource to be destroyed. To avoid this error and
continue with the plan, either disable lifecycle.prevent_destroy or reduce
the scope of the plan using the -target flag.



Exit code (musterr): 1

>>> errcode [CLI] bundle plan
recreate pipelines.my_pipelines
recreate schemas.my_schema

>>> [CLI] bundle deploy --auto-approve
Uploading bundle files to /Workspace/Users/[USERNAME]/.bundle/prevent-destroy/default/files...

This action will result in the deletion or recreation of the following UC schemas. Any underlying data may be lost:
  recreate schema my_schema

This action will result in the deletion or recreation of the following Lakeflow Declarative Pipelines along with the
Streaming Tables (STs) and Materialized Views (MVs) managed by them. Recreating the pipelines will
restore the defined STs and MVs through full refresh. Note that recreation is necessary when pipeline
properties such as the 'catalog' or 'storage' are changed:
  recreate pipeline my_pipelines
Deploying resources...
Updating deployment state...
Deployment complete!

>>> errcode [CLI] bundle plan
delete pipelines.my_pipelines
delete schemas.my_schema

>>> [CLI] bundle deploy --auto-approve
Uploading bundle files to /Workspace/Users/[USERNAME]/.bundle/prevent-destroy/default/files...

This action will result in the deletion or recreation of the following UC schemas. Any underlying data may be lost:
  delete schema my_schema

This action will result in the deletion or recreation of the following Lakeflow Declarative Pipelines along with the
Streaming Tables (STs) and Materialized Views (MVs) managed by them. Recreating the pipelines will
restore the defined STs and MVs through full refresh. Note that recreation is necessary when pipeline
properties such as the 'catalog' or 'storage' are changed:
  delete pipeline my_pipelines
Deploying resources...
Updating deployment state...
Deployment complete!
Lines changed: 5 additions & 0 deletions

@@ -0,0 +1,5 @@
Local = true
Cloud = false

[EnvMatrix]
DATABRICKS_CLI_DEPLOYMENT = ["terraform", "direct-exp"]
Lines changed: 17 additions & 0 deletions

@@ -0,0 +1,17 @@

>>> [CLI] bundle validate
Name: prevent-destroy
Target: default
Workspace:
  User: [USERNAME]
  Path: /Workspace/Users/[USERNAME]/.bundle/prevent-destroy/default

Validation OK!

>>> [CLI] bundle deploy
Uploading bundle files to /Workspace/Users/[USERNAME]/.bundle/prevent-destroy/default/files...
Deploying resources...
Updating deployment state...
Deployment complete!

>>> errcode [CLI] bundle plan
Lines changed: 11 additions & 0 deletions

@@ -0,0 +1,11 @@
resources:
  pipelines:
    my_pipelines:
      name: "test-pipeline"
      libraries:
        - notebook:
            path: "../test-notebook.py"
      lifecycle:
        prevent_destroy: true
      schema: test-schema
      catalog: main
Lines changed: 7 additions & 0 deletions

@@ -0,0 +1,7 @@
resources:
  schemas:
    my_schema:
      catalog_name: "test-catalog"
      name: test-schema
      lifecycle:
        prevent_destroy: true
Lines changed: 30 additions & 0 deletions

@@ -0,0 +1,30 @@
trace $CLI bundle validate

trace $CLI bundle deploy

trace errcode $CLI bundle plan
trace musterr $CLI bundle destroy --auto-approve >>out.$DATABRICKS_CLI_DEPLOYMENT.txt 2>&1

# Changing the catalog name: deploy must fail because the pipeline would be recreated
update_file.py resources/pipeline.yml 'catalog: main' 'catalog: mainnew'
trace errcode $CLI bundle plan >>out.$DATABRICKS_CLI_DEPLOYMENT.txt 2>&1
trace musterr $CLI bundle deploy >>out.$DATABRICKS_CLI_DEPLOYMENT.txt 2>&1

# Changing the schema name: deploy must fail because the schema would be recreated
update_file.py resources/schema.yml 'name: test-schema' 'name: test-schema-new'
trace errcode $CLI bundle plan >>out.$DATABRICKS_CLI_DEPLOYMENT.txt 2>&1
trace musterr $CLI bundle deploy >>out.$DATABRICKS_CLI_DEPLOYMENT.txt 2>&1

# Switching prevent_destroy to false: deploy must succeed
update_file.py resources/pipeline.yml 'prevent_destroy: true' 'prevent_destroy: false'
update_file.py resources/schema.yml 'prevent_destroy: true' 'prevent_destroy: false'
trace errcode $CLI bundle plan >>out.$DATABRICKS_CLI_DEPLOYMENT.txt 2>&1
trace $CLI bundle deploy --auto-approve >>out.$DATABRICKS_CLI_DEPLOYMENT.txt 2>&1
update_file.py resources/pipeline.yml 'prevent_destroy: false' 'prevent_destroy: true'
update_file.py resources/schema.yml 'prevent_destroy: false' 'prevent_destroy: true'


# Removing the pipeline and schema: deploy must succeed
rm resources/pipeline.yml resources/schema.yml
trace errcode $CLI bundle plan >>out.$DATABRICKS_CLI_DEPLOYMENT.txt 2>&1
trace $CLI bundle deploy --auto-approve >>out.$DATABRICKS_CLI_DEPLOYMENT.txt 2>&1
Lines changed: 3 additions & 0 deletions

@@ -0,0 +1,3 @@
# Databricks notebook source

print("Hello, World!")
Lines changed: 32 additions & 0 deletions

@@ -0,0 +1,32 @@
EnvVaryOutput = "DATABRICKS_CLI_DEPLOYMENT"

Ignore = [
  ".databricks"
]

[[Server]]
Pattern = "POST /api/2.0/serving-endpoints"
Response.Body = '''
{
  "id": "test-endpoint-6260d50f-e8ff-4905-8f28-812345678903",
  "name": "test-endpoint-6260d50f-e8ff-4905-8f28-812345678903"
}
'''

[[Server]]
Pattern = "GET /api/2.0/serving-endpoints/"


[[Server]]
Pattern = "GET /api/2.0/serving-endpoints/test-endpoint-6260d50f-e8ff-4905-8f28-812345678903"
Response.Body = '''
{
  "id": "test-endpoint-6260d50f-e8ff-4905-8f28-812345678903",
  "permission_level": "CAN_MANAGE",
  "route_optimized": false,
  "state": {
    "config_update": "NOT_UPDATING",
    "ready": "NOT_READY"
  }
}
'''
