Commit fc1d8b9

acc: Capture direct cluster deployment failing on data_security_mode: DATA_SECURITY_MODE_STANDARD (#3836)
## Changes
Capture direct cluster deployment failing on data_security_mode: DATA_SECURITY_MODE_STANDARD

## Why
The second deploy fails on clusters with data_security_mode: DATA_SECURITY_MODE_STANDARD

## Tests
Added an acceptance test
1 parent 4229556 commit fc1d8b9
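The acceptance test below captures the failure mode: the second deploy calls POST /api/2.1/clusters/edit while the cluster is still in state Pending, and the API rejects the call with 400 INVALID_STATE. A minimal sketch of one way a caller could avoid that, assuming the cluster only becomes editable in a stable state (the state names and the editable set are assumptions based on the error message, not the CLI's actual code):

```python
# Hypothetical sketch, not the CLI's implementation: poll cluster state and
# only call clusters/edit once the cluster has left transient states such as
# Pending. EDITABLE_STATES is an assumption inferred from the 400 INVALID_STATE
# error ("Cluster ... is in unexpected state Pending").

EDITABLE_STATES = {"RUNNING", "TERMINATED"}  # assumed: edit is rejected while Pending

def can_edit(state: str) -> bool:
    """True if clusters/edit would be accepted in this state (assumed set)."""
    return state.upper() in EDITABLE_STATES

def wait_until_editable(get_state, attempts: int = 10) -> str:
    """Poll a state getter until the cluster can be edited; return that state."""
    for _ in range(attempts):
        state = get_state()
        if can_edit(state):
            return state
    raise TimeoutError(f"cluster still not editable after {attempts} polls")
```

This PR only records the failing behavior in an acceptance test; whether the fix ends up being a wait, a retry, or something else is not decided here.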

File tree

11 files changed: +158 −0 lines changed
Lines changed: 17 additions & 0 deletions
@@ -0,0 +1,17 @@
bundle:
  name: test-deploy-cluster-simple

workspace:
  root_path: ~/.bundle/$UNIQUE_NAME

resources:
  clusters:
    test_cluster:
      cluster_name: test-cluster-$UNIQUE_NAME
      spark_version: $DEFAULT_SPARK_VERSION
      node_type_id: $NODE_TYPE_ID
      num_workers: 2
      spark_conf:
        "spark.executor.memory": "2g"
      data_security_mode: DATA_SECURITY_MODE_STANDARD
      kind: CLASSIC_PREVIEW
Lines changed: 20 additions & 0 deletions
@@ -0,0 +1,20 @@

>>> [CLI] bundle deploy
Uploading bundle files to /Workspace/Users/[USERNAME]/.bundle/[UNIQUE_NAME]/files...
Deploying resources...
Updating deployment state...
Deployment complete!

>>> errcode [CLI] bundle deploy
Uploading bundle files to /Workspace/Users/[USERNAME]/.bundle/[UNIQUE_NAME]/files...
Deploying resources...
Error: cannot update resources.clusters.test_cluster: updating id=[CLUSTER-ID]: Cluster [CLUSTER-ID] is in unexpected state Pending. (400 INVALID_STATE)

Endpoint: POST [DATABRICKS_URL]/api/2.1/clusters/edit
HTTP Status: 400 Bad Request
API error_code: INVALID_STATE
API message: Cluster [CLUSTER-ID] is in unexpected state Pending.

Updating deployment state...

Exit code: 1
Lines changed: 5 additions & 0 deletions
@@ -0,0 +1,5 @@

>>> [CLI] bundle plan
update clusters.test_cluster

Plan: 0 to add, 1 to change, 0 to delete, 0 unchanged
Lines changed: 3 additions & 0 deletions
@@ -0,0 +1,3 @@

>>> [CLI] bundle plan
Plan: 0 to add, 0 to change, 0 to delete, 1 unchanged
Lines changed: 31 additions & 0 deletions
@@ -0,0 +1,31 @@
{
  "method": "GET",
  "path": "/api/2.1/clusters/get",
  "q": {
    "cluster_id": "[CLUSTER-ID]"
  }
}
{
  "method": "GET",
  "path": "/api/2.1/clusters/get",
  "q": {
    "cluster_id": "[CLUSTER-ID]"
  }
}
{
  "method": "POST",
  "path": "/api/2.1/clusters/edit",
  "body": {
    "autotermination_minutes": 60,
    "cluster_id": "[CLUSTER-ID]",
    "cluster_name": "test-cluster-[UNIQUE_NAME]",
    "data_security_mode": "DATA_SECURITY_MODE_STANDARD",
    "kind": "CLASSIC_PREVIEW",
    "node_type_id": "[NODE_TYPE_ID]",
    "num_workers": 2,
    "spark_conf": {
      "spark.executor.memory": "2g"
    },
    "spark_version": "13.3.x-snapshot-scala2.12"
  }
}
Lines changed: 30 additions & 0 deletions
@@ -0,0 +1,30 @@
{
  "method": "GET",
  "path": "/api/2.1/clusters/get",
  "q": {
    "cluster_id": "[CLUSTER-ID]"
  }
}
{
  "method": "GET",
  "path": "/api/2.1/clusters/list",
  "q": {
    "filter_by.is_pinned": "true",
    "page_size": "100"
  }
}
{
  "method": "GET",
  "path": "/api/2.1/clusters/get",
  "q": {
    "cluster_id": "[CLUSTER-ID]"
  }
}
{
  "method": "GET",
  "path": "/api/2.1/clusters/list",
  "q": {
    "filter_by.is_pinned": "true",
    "page_size": "100"
  }
}
Lines changed: 12 additions & 0 deletions
@@ -0,0 +1,12 @@

>>> [CLI] bundle deploy
Uploading bundle files to /Workspace/Users/[USERNAME]/.bundle/[UNIQUE_NAME]/files...
Deploying resources...
Updating deployment state...
Deployment complete!

>>> errcode [CLI] bundle deploy
Uploading bundle files to /Workspace/Users/[USERNAME]/.bundle/[UNIQUE_NAME]/files...
Deploying resources...
Updating deployment state...
Deployment complete!

acceptance/bundle/resources/clusters/deploy/data_security_mode/out.test.toml

Lines changed: 5 additions & 0 deletions
Some generated files are not rendered by default.
Lines changed: 9 additions & 0 deletions
@@ -0,0 +1,9 @@

>>> [CLI] bundle destroy --auto-approve
The following resources will be deleted:
delete cluster test_cluster

All files and directories at the following location will be deleted: /Workspace/Users/[USERNAME]/.bundle/[UNIQUE_NAME]

Deleting files...
Destroy complete!
Lines changed: 15 additions & 0 deletions
@@ -0,0 +1,15 @@
envsubst < databricks.yml.tmpl > databricks.yml

cleanup() {
  trace $CLI bundle destroy --auto-approve
  rm -f out.requests.txt
}
trap cleanup EXIT

trace $CLI bundle deploy > out.$DATABRICKS_BUNDLE_ENGINE.txt 2>&1
print_requests.py --get //clusters > out.requests.$DATABRICKS_BUNDLE_ENGINE.txt

trace $CLI bundle plan >> out.plan.$DATABRICKS_BUNDLE_ENGINE.txt 2>&1

trace errcode $CLI bundle deploy >> out.$DATABRICKS_BUNDLE_ENGINE.txt 2>&1
print_requests.py --get //clusters > out.requests.$DATABRICKS_BUNDLE_ENGINE.txt
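The script's first step pipes the template through envsubst, which expands $-prefixed environment variables such as $UNIQUE_NAME into the rendered databricks.yml. A minimal stand-in for that substitution step, using Python's string.Template instead of the actual tool (the variable value here is made up for illustration):

```python
# Illustrative stand-in for `envsubst < databricks.yml.tmpl > databricks.yml`:
# expand $UNIQUE_NAME in a template string the same way envsubst would.
from string import Template

tmpl = "root_path: ~/.bundle/$UNIQUE_NAME\n"
rendered = Template(tmpl).substitute(UNIQUE_NAME="demo123")
print(rendered, end="")  # root_path: ~/.bundle/demo123
```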
