
Commit 65bc5dd

add schemas to test
1 parent fe565f5 commit 65bc5dd

8 files changed: +153 -25 lines
acceptance/bundle/lifecycle/prevent-destroy/databricks.yml

Lines changed: 2 additions & 17 deletions

@@ -1,20 +1,5 @@
 bundle:
   name: prevent-destroy
 
-lifecycle: &lifecycle_base
-  lifecycle:
-    prevent_destroy: true
-
-pipeline: &pipeline_base
-  resources:
-    pipelines:
-      my_pipelines:
-        name: "test-pipeline"
-        libraries:
-          - notebook:
-              path: "./test-notebook.py"
-        <<: *lifecycle_base
-        schema: test-schema
-        catalog: main
-
-<<: *pipeline_base
+include:
+  - resources/*.yml
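
For reference, the bundle configuration after this change reduces to the bundle name plus an include glob; the pipeline and schema definitions move into the resources/ files added below:

bundle:
  name: prevent-destroy

include:
  - resources/*.yml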

acceptance/bundle/lifecycle/prevent-destroy/out.direct-exp.txt

Lines changed: 18 additions & 1 deletion

@@ -1,14 +1,23 @@
 
+>>> musterr [CLI] bundle destroy --auto-approve
+Error: resource my_pipelines has lifecycle.prevent_destroy set, but the plan calls for this resource to be recreated or destroyed. To avoid this error, disable lifecycle.prevent_destroy for pipelines.my_pipelines
+
+
+Exit code (musterr): 1
+
 >>> errcode [CLI] bundle plan
+recreate pipelines.my_pipelines
 
->>> musterr [CLI] bundle destroy --auto-approve
+>>> musterr [CLI] bundle deploy
+Uploading bundle files to /Workspace/Users/[USERNAME]/.bundle/prevent-destroy/default/files...
 Error: resource my_pipelines has lifecycle.prevent_destroy set, but the plan calls for this resource to be recreated or destroyed. To avoid this error, disable lifecycle.prevent_destroy for pipelines.my_pipelines
 
 
 Exit code (musterr): 1
 
 >>> errcode [CLI] bundle plan
 recreate pipelines.my_pipelines
+recreate schemas.my_schema
 
 >>> musterr [CLI] bundle deploy
 Uploading bundle files to /Workspace/Users/[USERNAME]/.bundle/prevent-destroy/default/files...
@@ -19,10 +28,14 @@ Exit code (musterr): 1
 
 >>> errcode [CLI] bundle plan
 recreate pipelines.my_pipelines
+recreate schemas.my_schema
 
 >>> [CLI] bundle deploy --auto-approve
 Uploading bundle files to /Workspace/Users/[USERNAME]/.bundle/prevent-destroy/default/files...
 
+This action will result in the deletion or recreation of the following UC schemas. Any underlying data may be lost:
+recreate schema my_schema
+
 This action will result in the deletion or recreation of the following Lakeflow Declarative Pipelines along with the
 Streaming Tables (STs) and Materialized Views (MVs) managed by them. Recreating the pipelines will
 restore the defined STs and MVs through full refresh. Note that recreation is necessary when pipeline
@@ -34,10 +47,14 @@ Deployment complete!
 
 >>> errcode [CLI] bundle plan
 delete pipelines.my_pipelines
+delete schemas.my_schema
 
 >>> [CLI] bundle deploy --auto-approve
 Uploading bundle files to /Workspace/Users/[USERNAME]/.bundle/prevent-destroy/default/files...
 
+This action will result in the deletion or recreation of the following UC schemas. Any underlying data may be lost:
+delete schema my_schema
+
 This action will result in the deletion or recreation of the following Lakeflow Declarative Pipelines along with the
 Streaming Tables (STs) and Materialized Views (MVs) managed by them. Recreating the pipelines will
 restore the defined STs and MVs through full refresh. Note that recreation is necessary when pipeline

acceptance/bundle/lifecycle/prevent-destroy/out.terraform.txt

Lines changed: 72 additions & 1 deletion

@@ -1,7 +1,50 @@
 
+>>> musterr [CLI] bundle destroy --auto-approve
+Error: exit status 1
+
+Error: Instance cannot be destroyed
+
+on bundle.tf.json line 15, in resource.databricks_pipeline:
+15: "my_pipelines": {
+
+Resource databricks_pipeline.my_pipelines has lifecycle.prevent_destroy set,
+but the plan calls for this resource to be destroyed. To avoid this error and
+continue with the plan, either disable lifecycle.prevent_destroy or reduce
+the scope of the plan using the -target flag.
+
+Error: Instance cannot be destroyed
+
+on bundle.tf.json line 38, in resource.databricks_schema:
+38: "my_schema": {
+
+Resource databricks_schema.my_schema has lifecycle.prevent_destroy set, but
+the plan calls for this resource to be destroyed. To avoid this error and
+continue with the plan, either disable lifecycle.prevent_destroy or reduce
+the scope of the plan using the -target flag.
+
+
+
+Exit code (musterr): 1
+
 >>> errcode [CLI] bundle plan
+Error: exit status 1
 
->>> musterr [CLI] bundle destroy --auto-approve
+Error: Instance cannot be destroyed
+
+on bundle.tf.json line 15, in resource.databricks_pipeline:
+15: "my_pipelines": {
+
+Resource databricks_pipeline.my_pipelines has lifecycle.prevent_destroy set,
+but the plan calls for this resource to be destroyed. To avoid this error and
+continue with the plan, either disable lifecycle.prevent_destroy or reduce
+the scope of the plan using the -target flag.
+
+
+
+Exit code: 1
+
+>>> musterr [CLI] bundle deploy
+Uploading bundle files to /Workspace/Users/[USERNAME]/.bundle/prevent-destroy/default/files...
 Error: exit status 1
 
 Error: Instance cannot be destroyed
@@ -31,6 +74,16 @@ but the plan calls for this resource to be destroyed. To avoid this error and
 continue with the plan, either disable lifecycle.prevent_destroy or reduce
 the scope of the plan using the -target flag.
 
+Error: Instance cannot be destroyed
+
+on bundle.tf.json line 38, in resource.databricks_schema:
+38: "my_schema": {
+
+Resource databricks_schema.my_schema has lifecycle.prevent_destroy set, but
+the plan calls for this resource to be destroyed. To avoid this error and
+continue with the plan, either disable lifecycle.prevent_destroy or reduce
+the scope of the plan using the -target flag.
+
 
 
 Exit code: 1
@@ -49,16 +102,30 @@ but the plan calls for this resource to be destroyed. To avoid this error and
 continue with the plan, either disable lifecycle.prevent_destroy or reduce
 the scope of the plan using the -target flag.
 
+Error: Instance cannot be destroyed
+
+on bundle.tf.json line 38, in resource.databricks_schema:
+38: "my_schema": {
+
+Resource databricks_schema.my_schema has lifecycle.prevent_destroy set, but
+the plan calls for this resource to be destroyed. To avoid this error and
+continue with the plan, either disable lifecycle.prevent_destroy or reduce
+the scope of the plan using the -target flag.
+
 
 
 Exit code (musterr): 1
 
 >>> errcode [CLI] bundle plan
 recreate pipelines.my_pipelines
+recreate schemas.my_schema
 
 >>> [CLI] bundle deploy --auto-approve
 Uploading bundle files to /Workspace/Users/[USERNAME]/.bundle/prevent-destroy/default/files...
 
+This action will result in the deletion or recreation of the following UC schemas. Any underlying data may be lost:
+recreate schema my_schema
+
 This action will result in the deletion or recreation of the following Lakeflow Declarative Pipelines along with the
 Streaming Tables (STs) and Materialized Views (MVs) managed by them. Recreating the pipelines will
 restore the defined STs and MVs through full refresh. Note that recreation is necessary when pipeline
@@ -70,10 +137,14 @@ Deployment complete!
 
 >>> errcode [CLI] bundle plan
 delete pipelines.my_pipelines
+delete schemas.my_schema
 
 >>> [CLI] bundle deploy --auto-approve
 Uploading bundle files to /Workspace/Users/[USERNAME]/.bundle/prevent-destroy/default/files...
 
+This action will result in the deletion or recreation of the following UC schemas. Any underlying data may be lost:
+delete schema my_schema
+
 This action will result in the deletion or recreation of the following Lakeflow Declarative Pipelines along with the
 Streaming Tables (STs) and Materialized Views (MVs) managed by them. Recreating the pipelines will
 restore the defined STs and MVs through full refresh. Note that recreation is necessary when pipeline
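
The Terraform-backend errors above point at the generated bundle.tf.json. A minimal sketch of the fragment they refer to, assuming the standard Terraform JSON syntax for lifecycle blocks (the exact field set and line numbers in the real generated file will differ):

{
  "resource": {
    "databricks_pipeline": {
      "my_pipelines": {
        "name": "test-pipeline",
        "lifecycle": {
          "prevent_destroy": true
        }
      }
    },
    "databricks_schema": {
      "my_schema": {
        "catalog_name": "test-catalog",
        "name": "test-schema",
        "lifecycle": {
          "prevent_destroy": true
        }
      }
    }
  }
}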

acceptance/bundle/lifecycle/prevent-destroy/output.txt

Lines changed: 2 additions & 0 deletions

@@ -13,3 +13,5 @@ Uploading bundle files to /Workspace/Users/[USERNAME]/.bundle/prevent-destroy/de
 Deploying resources...
 Updating deployment state...
 Deployment complete!
+
+>>> errcode [CLI] bundle plan
acceptance/bundle/lifecycle/prevent-destroy/resources/pipeline.yml

Lines changed: 11 additions & 0 deletions

@@ -0,0 +1,11 @@
+resources:
+  pipelines:
+    my_pipelines:
+      name: "test-pipeline"
+      libraries:
+        - notebook:
+            path: "../test-notebook.py"
+      lifecycle:
+        prevent_destroy: true
+      schema: test-schema
+      catalog: main
acceptance/bundle/lifecycle/prevent-destroy/resources/schema.yml

Lines changed: 7 additions & 0 deletions

@@ -0,0 +1,7 @@
+resources:
+  schemas:
+    my_schema:
+      catalog_name: "test-catalog"
+      name: test-schema
+      lifecycle:
+        prevent_destroy: true

acceptance/bundle/lifecycle/prevent-destroy/script

Lines changed: 14 additions & 6 deletions

@@ -2,21 +2,29 @@ trace $CLI bundle validate
 
 trace $CLI bundle deploy
 
-trace errcode $CLI bundle plan >out.$DATABRICKS_CLI_DEPLOYMENT.txt 2>&1
+trace errcode $CLI bundle plan
 trace musterr $CLI bundle destroy --auto-approve >>out.$DATABRICKS_CLI_DEPLOYMENT.txt 2>&1
 
 # Changing the catalog name, deploy must fail because pipeline will be recreated
-update_file.py databricks.yml 'catalog: main' 'catalog: mainnew'
+update_file.py resources/pipeline.yml 'catalog: main' 'catalog: mainnew'
+trace errcode $CLI bundle plan >>out.$DATABRICKS_CLI_DEPLOYMENT.txt 2>&1
+trace musterr $CLI bundle deploy >>out.$DATABRICKS_CLI_DEPLOYMENT.txt 2>&1
+
+# Changing the schema name, deploy must fail because schema will be recreated
+update_file.py resources/schema.yml 'name: test-schema' 'name: test-schema-new'
 trace errcode $CLI bundle plan >>out.$DATABRICKS_CLI_DEPLOYMENT.txt 2>&1
 trace musterr $CLI bundle deploy >>out.$DATABRICKS_CLI_DEPLOYMENT.txt 2>&1
 
 # Removing the prevent_destroy, deploy must succeed
-update_file.py databricks.yml 'prevent_destroy: true' 'prevent_destroy: false'
+update_file.py resources/pipeline.yml 'prevent_destroy: true' 'prevent_destroy: false'
+update_file.py resources/schema.yml 'prevent_destroy: true' 'prevent_destroy: false'
 trace errcode $CLI bundle plan >>out.$DATABRICKS_CLI_DEPLOYMENT.txt 2>&1
 trace $CLI bundle deploy --auto-approve >>out.$DATABRICKS_CLI_DEPLOYMENT.txt 2>&1
+update_file.py resources/pipeline.yml 'prevent_destroy: false' 'prevent_destroy: true'
+update_file.py resources/schema.yml 'prevent_destroy: false' 'prevent_destroy: true'
+
 
-update_file.py databricks.yml 'prevent_destroy: false' 'prevent_destroy: true'
-# Removing the pipeline, deploy must succeed
-update_file.py databricks.yml '<<: *pipeline_base' ''
+# Removing the pipeline and schema, deploy must succeed
+rm resources/pipeline.yml resources/schema.yml
 trace errcode $CLI bundle plan >>out.$DATABRICKS_CLI_DEPLOYMENT.txt 2>&1
 trace $CLI bundle deploy --auto-approve >>out.$DATABRICKS_CLI_DEPLOYMENT.txt 2>&1

acceptance/bundle/lifecycle/prevent-destroy/test.toml

Lines changed: 27 additions & 0 deletions

@@ -3,3 +3,30 @@ EnvVaryOutput = "DATABRICKS_CLI_DEPLOYMENT"
 Ignore = [
   ".databricks"
 ]
+
+[[Server]]
+Pattern = "POST /api/2.0/serving-endpoints"
+Response.Body = '''
+{
+  "id": "test-endpoint-6260d50f-e8ff-4905-8f28-812345678903",
+  "name": "test-endpoint-6260d50f-e8ff-4905-8f28-812345678903"
+}
+'''
+
+[[Server]]
+Pattern = "GET /api/2.0/serving-endpoints/"
+
+
+[[Server]]
+Pattern = "GET /api/2.0/serving-endpoints/test-endpoint-6260d50f-e8ff-4905-8f28-812345678903"
+Response.Body = '''
+{
+  "id": "test-endpoint-6260d50f-e8ff-4905-8f28-812345678903",
+  "permission_level": "CAN_MANAGE",
+  "route_optimized": false,
+  "state": {
+    "config_update": "NOT_UPDATING",
+    "ready": "NOT_READY"
+  }
+}
+'''
