Commit aa5268a

create an acceptance test for provisioning a pipeline with a duplicate name

1 parent ec51221 commit aa5268a
File tree

6 files changed: +71 −0 lines changed
databricks.yml.tmpl — 11 additions & 0 deletions

bundle:
  name: acc-bundle-deploy-pipeline-duplicate-names-$UNIQUE_NAME

resources:
  pipelines:
    pipeline_one:
      name: test-pipeline-same-name-$UNIQUE_NAME
      allow_duplicate_names: true
      libraries:
        - file:
            path: "./foo.py"
Test configuration — 5 additions & 0 deletions

Local = true
Cloud = true

[EnvMatrix]
DATABRICKS_CLI_DEPLOYMENT = ["terraform", "direct-exp"]
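The `[EnvMatrix]` block fans the test out across both deployment backends. A minimal sketch of the assumed semantics (the runner re-executes the whole test once per listed value of the variable):

```shell
# Assumed semantics of [EnvMatrix]: the acceptance-test runner repeats
# the entire test once for each listed value of the variable.
for DATABRICKS_CLI_DEPLOYMENT in terraform direct-exp; do
  export DATABRICKS_CLI_DEPLOYMENT
  echo "run test with DATABRICKS_CLI_DEPLOYMENT=$DATABRICKS_CLI_DEPLOYMENT"
done
```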
Expected output — 25 additions & 0 deletions

>>> [CLI] bundle deploy
Uploading bundle files to /Workspace/Users/[USERNAME]/.bundle/acc-bundle-deploy-pipeline-duplicate-names-[UNIQUE_NAME]/default/files...
Deploying resources...
Error: terraform apply: exit status 1

Error: cannot create pipeline: The pipeline name 'test-pipeline-same-name-[UNIQUE_NAME]' is already used by another pipeline. This check can be skipped by setting `allow_duplicate_names = true` in the request.

  with databricks_pipeline.pipeline_one,
  on bundle.tf.json line 30, in resource.databricks_pipeline.pipeline_one:
  30: }

Updating deployment state...

>>> [CLI] bundle destroy --auto-approve
All files and directories at the following location will be deleted: /Workspace/Users/[USERNAME]/.bundle/acc-bundle-deploy-pipeline-duplicate-names-[UNIQUE_NAME]/default

Deleting files...
Destroy complete!

>>> [CLI] pipelines delete [UUID]

Exit code: 1
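The transcript ends with `Exit code: 1`: the deploy is expected to fail. A minimal sketch of how a harness can record a nonzero exit status without aborting the script (`false` stands in for the failing `bundle deploy` call; this is an illustration, not the harness's actual implementation):

```shell
set +e            # temporarily disable abort-on-error for the expected failure
false             # stands in for the failing `bundle deploy` invocation
status=$?         # capture the nonzero exit status
set -e
echo "Exit code: $status"
```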
pipeline.json.tmpl — 11 additions & 0 deletions

{
  "name": "test-pipeline-same-name-$UNIQUE_NAME",
  "allow_duplicate_names": true,
  "libraries": [
    {
      "file": {
        "path": "/some-script.py"
      }
    }
  ]
}
Test script — 16 additions & 0 deletions

envsubst < databricks.yml.tmpl > databricks.yml
envsubst < pipeline.json.tmpl > pipeline.json
touch foo.py

cleanup() {
  trace $CLI bundle destroy --auto-approve
  trace $CLI pipelines delete ${PIPELINE_ID}
}
trap cleanup EXIT

# Create a pre-existing pipeline:
PIPELINE_ID=$($CLI pipelines create --json @pipeline.json | jq -r .pipeline_id)
export PIPELINE_ID

# Deploy the bundle that has a pipeline with the same name:
trace $CLI bundle deploy
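The script relies on `envsubst` to expand `$UNIQUE_NAME` inside the templates. A minimal sketch of the same substitution (the UNIQUE_NAME value is made up, and a shell here-doc stands in for `envsubst` so the example is self-contained):

```shell
# Hypothetical unique suffix; the real test harness supplies $UNIQUE_NAME.
UNIQUE_NAME="abc123"
export UNIQUE_NAME

# Equivalent effect to `envsubst < pipeline.json.tmpl > pipeline.json`:
# the shell expands ${UNIQUE_NAME} inside the unquoted here-doc.
cat > pipeline.json <<EOF
{
  "name": "test-pipeline-same-name-${UNIQUE_NAME}",
  "allow_duplicate_names": true
}
EOF

grep '"name"' pipeline.json
```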
Test configuration — 3 additions & 0 deletions

Badness = "allow_duplicate_names field does not allow to create a pipeline with a duplicate name"
Cloud = true
Ignore = ["foo.py", "pipeline.json"]
