Commit 863f940

Allow referencing job libraries outside bundle root without the need to specify sync root (#2842)
## Changes
Allow referencing job libraries outside the bundle root without the need to specify a sync root. Previously this failed with an error indicating that the libraries were not inside the sync root; now it no longer fails.

## Why
Libraries have their own upload cycle and do not rely on the general bundle sync mechanism, so it is not necessary to require or verify that libraries are inside the sync root.

## Tests
Added acceptance tests.
1 parent 07e649b commit 863f940
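For illustration, here is a minimal hypothetical layout in the spirit of the acceptance test added in this commit. The directory names, job name, and package name are placeholders, and compute settings are omitted for brevity; the point is that a wheel built one level above the bundle root can be referenced directly from a task's `libraries` section, since wheels go through the artifact upload cycle rather than the bundle sync:

```yaml
# Hypothetical layout (placeholder names, mirrors the acceptance test below):
#   project/
#   ├── dist/
#   │   └── my_lib-0.1.0-py3-none-any.whl   # lives outside the bundle root
#   └── bundle/
#       └── databricks.yml                  # bundle root
#
# bundle/databricks.yml
bundle:
  name: example_outside_root

resources:
  jobs:
    example_job:
      name: example_job
      tasks:
        - task_key: main
          python_wheel_task:
            entry_point: main
            package_name: my_lib
          libraries:
            # Parent-relative path: previously this required the sync root to
            # cover it; with this change the wheel is uploaded via the
            # artifact upload cycle instead.
            - whl: ../dist/*.whl
```

Running `databricks bundle deploy` from `bundle/` would then upload the wheel to the target's `artifacts/.internal/` location in the workspace, as shown in the recorded output added below.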

File tree

22 files changed (+413, -121 lines)

NEXT_CHANGELOG.md

Lines changed: 1 addition & 0 deletions

@@ -16,6 +16,7 @@
 * Update default-python template to make DB Connect work out of the box for unit tests, using uv to install dependencies ([#3254](https://github.com/databricks/cli/pull/3254))
 * Add support for `TaskRetryMode` for continuous jobs ([#3529](https://github.com/databricks/cli/pull/3529))
 * Add support for specifying database instance as an application resource ([#3529](https://github.com/databricks/cli/pull/3529))
+* Allow referencing job libraries outside bundle root without the need to specify sync root ([#2842](https://github.com/databricks/cli/pull/2842))
 * Add top level `run_as` support for Lakeflow Declarative Pipelines ([#3307](https://github.com/databricks/cli/pull/3307))

 ### API Changes

Lines changed: 28 additions & 0 deletions

@@ -0,0 +1,28 @@
bundle:
  name: outside_of_bundle_root

variables:
  cluster:
    default:
      spark_version: 15.4.x-scala2.12
      node_type_id: i3.xlarge
      data_security_mode: SINGLE_USER
      num_workers: 0
      spark_conf:
        spark.master: "local[*, 4]"
        spark.databricks.cluster.profile: singleNode
      custom_tags:
        ResourceClass: SingleNode

resources:
  jobs:
    test:
      name: "test"
      tasks:
        - task_key: task1
          new_cluster: ${var.cluster}
          python_wheel_task:
            entry_point: main
            package_name: my_default_python
          libraries:
            - whl: ../*.whl

Lines changed: 5 additions & 0 deletions

@@ -0,0 +1,5 @@
Local = true
Cloud = false

[EnvMatrix]
DATABRICKS_CLI_DEPLOYMENT = ["terraform", "direct-exp"]

Lines changed: 28 additions & 0 deletions

@@ -0,0 +1,28 @@

>>> [CLI] bundle validate -o json
[
  {
    "whl": "../*.whl"
  }
]

>>> [CLI] bundle deploy
Uploading ../test.whl...
Uploading bundle files to /Workspace/Users/[USERNAME]/.bundle/outside_of_bundle_root/default/files...
Deploying resources...
Updating deployment state...
Deployment complete!

=== Check that the job libraries are uploaded and the path is correct in the job
>>> cat out.requests.txt
[
  {
    "whl": "/Workspace/Users/[USERNAME]/.bundle/outside_of_bundle_root/default/artifacts/.internal/test.whl"
  }
]

>>> cat out.requests.txt
{
  "method": "POST",
  "path": "/api/2.0/workspace-files/import-file/Workspace/Users/[USERNAME]/.bundle/outside_of_bundle_root/default/artifacts/.internal/test.whl"
}

Lines changed: 12 additions & 0 deletions

@@ -0,0 +1,12 @@
cd bundle

trace $CLI bundle validate -o json | jq '.resources.jobs.test.tasks[0].libraries'
trace $CLI bundle deploy

cd ..

title "Check that the job libraries are uploaded and the path is correct in the job"
trace cat out.requests.txt | jq 'select(.path == "/api/2.2/jobs/create")' | jq '.body.tasks[0].libraries'
trace cat out.requests.txt | jq 'select(.path | test("/api/2.0/workspace-files/import-file/Workspace/Users/.*/.bundle/outside_of_bundle_root/default/artifacts/.internal/test.whl"))'

rm out.requests.txt

Lines changed: 6 additions & 0 deletions

@@ -0,0 +1,6 @@
Cloud = false
RecordRequests = true

Ignore = [
  '.databricks',
]

acceptance/bundle/libraries/outside_of_bundle_root/test.whl

Whitespace-only changes.

acceptance/bundle/paths/fallback/output.job.json

Lines changed: 2 additions & 2 deletions

@@ -24,7 +24,7 @@
       "whl": "dist/wheel1.whl"
     },
     {
-      "whl": "dist/wheel2.whl"
+      "whl": "../dist/wheel2.whl"
     }
   ],
   "python_wheel_task": {
@@ -39,7 +39,7 @@
       "jar": "target/jar1.jar"
     },
     {
-      "jar": "target/jar2.jar"
+      "jar": "../target/jar2.jar"
     }
   ],
   "spark_jar_task": {

acceptance/bundle/paths/fallback/output.txt

Lines changed: 0 additions & 6 deletions

@@ -21,12 +21,6 @@ Error: path ../src/sql.sql is defined relative to the [TEST_TMP_DIR]/resources d
   in override_job.yml:24:25
   resources/my_job.yml:27:21

-Error: path ../dist/wheel2.whl is defined relative to the [TEST_TMP_DIR]/resources directory ([TEST_TMP_DIR]/override_job.yml:33:24). Please update the path to be relative to the file where it is defined or use earlier version of CLI (0.261.0 or earlier).
-  in override_job.yml:33:24
-
-Error: path ../target/jar2.jar is defined relative to the [TEST_TMP_DIR]/resources directory ([TEST_TMP_DIR]/override_job.yml:41:24). Please update the path to be relative to the file where it is defined or use earlier version of CLI (0.261.0 or earlier).
-  in override_job.yml:41:24
-
 Error: path ../src/notebook2.py is defined relative to the [TEST_TMP_DIR]/resources directory ([TEST_TMP_DIR]/override_pipeline.yml:13:23). Please update the path to be relative to the file where it is defined or use earlier version of CLI (0.261.0 or earlier).
   in override_pipeline.yml:13:23

Lines changed: 11 additions & 0 deletions

@@ -0,0 +1,11 @@
bundle:
  name: outside_root_no_sync

resources:
  jobs:
    my_job:
      name: include_outside_root
      tasks:
        - task_key: task_key
          notebook_task:
            notebook_path: "../src/notebook.py"
