Merged
Commits
34 commits
45a5bab
WIP migrate command
denik Oct 30, 2025
9212d18
Update cmd/bundle/deployment/migrate.go
denik Nov 5, 2025
72bbaca
move bundle.engine to comment
denik Nov 5, 2025
e8c3be7
update
denik Nov 5, 2025
7677631
targets test
denik Nov 5, 2025
a136ffd
lint
denik Nov 5, 2025
7ee356e
update tests
denik Nov 18, 2025
905f4ed
lint
denik Nov 18, 2025
5191afb
rm out.requests.txt
denik Nov 18, 2025
f489e5b
fix
denik Nov 19, 2025
73b172e
add grants test
denik Nov 20, 2025
30cad4c
update grants test
denik Nov 20, 2025
f95f821
clean up
denik Nov 20, 2025
de9557b
add deploy after migration
denik Nov 20, 2025
7d2f6c0
add permissions test
denik Nov 20, 2025
492893d
add dashboards test
denik Nov 20, 2025
3ce4763
convert push to a function
denik Nov 21, 2025
a37bd16
backup local and remote state
denik Nov 21, 2025
5029dbc
tweak language
denik Nov 21, 2025
bc0db6c
handle deploy error
denik Nov 21, 2025
53ccd5e
return false
denik Nov 21, 2025
f3b1a9c
fix typo
denik Nov 21, 2025
24aa67c
Perform backup during deploy
denik Nov 24, 2025
6f58082
update test
denik Nov 25, 2025
243d380
Apply suggestions from code review
denik Nov 26, 2025
49ac439
add missing return
denik Nov 26, 2025
7bc4f34
clean up tempStateFile
denik Nov 26, 2025
10ae747
clean up temp file
denik Nov 26, 2025
15e8450
add a comment about etags
denik Nov 26, 2025
d4b52a7
fix permissions test
denik Nov 26, 2025
cddb3e4
extra check
denik Nov 26, 2025
aa3f032
extra plan
denik Nov 26, 2025
a2100dd
rm BackupRemoteTerraformState option
denik Nov 26, 2025
7de61e4
move comment to begin of line
denik Nov 27, 2025
33 changes: 27 additions & 6 deletions acceptance/bin/print_state.py
@@ -7,6 +7,7 @@
 """
 
 import os
+import argparse
 
 
 def write(filename):
@@ -16,10 +17,30 @@ def write(filename):
     print()
 
 
-filename = ".databricks/bundle/default/terraform/terraform.tfstate"
-if os.path.exists(filename):
-    write(filename)
+def main():
+    parser = argparse.ArgumentParser()
+    parser.add_argument("-t", "--target", default="default")
+    parser.add_argument("--backup", action="store_true")
+    args = parser.parse_args()
 
-filename = ".databricks/bundle/default/resources.json"
-if os.path.exists(filename):
-    write(filename)
+    if args.target:
+        target_dir = f".databricks/bundle/{args.target}"
+        if not os.path.exists(target_dir):
+            raise SystemExit(f"Invalid target {args.target!r}: {target_dir} does not exist")
+
+    if args.backup:
+        filename = f".databricks/bundle/{args.target}/terraform/terraform.tfstate.backup"
+        if os.path.exists(filename):
+            write(filename)
+    else:
+        filename = f".databricks/bundle/{args.target}/terraform/terraform.tfstate"
+        if os.path.exists(filename):
+            write(filename)
 
+    filename = f".databricks/bundle/{args.target}/resources.json"
+    if os.path.exists(filename):
+        write(filename)
+
+
+if __name__ == "__main__":
+    main()
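As a usage note, the helper can now be pointed at a non-default target and at the Terraform state backup. A minimal, hypothetical invocation from a test follows; the subprocess call and the dev target name are illustrative and not part of this PR.

# Hypothetical example: dump the Terraform state backup and resources.json for
# the "dev" target, assuming the script is run from a deployed bundle root.
import subprocess

result = subprocess.run(
    ["python3", "acceptance/bin/print_state.py", "--target", "dev", "--backup"],
    capture_output=True,
    text=True,
    check=True,
)
print(result.stdout)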


24 changes: 24 additions & 0 deletions acceptance/bundle/help/bundle-deployment-migrate/output.txt
@@ -0,0 +1,24 @@

>>> [CLI] bundle deployment migrate --help
This command converts your bundle from using Terraform for deployment to using
the Direct deployment engine. It reads resource IDs from the existing Terraform
state and creates a Direct deployment state file (resources.json) with the same
lineage and incremented serial number.

Note: the migration is performed locally only. To finalize it, run 'bundle deploy'. This synchronizes the state file
to the workspace so that subsequent deploys of this bundle also use the Direct deployment engine.

WARNING: Both the Direct deployment engine and this command are experimental and not yet recommended for production targets.

Usage:
databricks bundle deployment migrate [flags]

Flags:
-h, --help help for migrate

Global Flags:
--debug enable debug logging
-o, --output type output type: text or json (default text)
-p, --profile string ~/.databrickscfg profile
-t, --target string bundle target to use (if applicable)
--var strings set values for variables defined in bundle config. Example: --var="foo=bar"
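To make the described conversion concrete, here is a minimal Python sketch of what the migration does conceptually. It is not the CLI's Go implementation in cmd/bundle/deployment/migrate.go; the mapping from Terraform resource types to bundle resource groups and the choice to carry over only resource IDs are assumptions for illustration, while the lineage/serial/state shape follows the out.new_state.json fixture later in this diff.

# Conceptual sketch only, not the actual migrate command: read resource IDs from
# the Terraform state and emit a Direct-engine resources.json with the same
# lineage and an incremented serial.
import json

# Assumed type-to-group mapping, for illustration.
GROUPS = {
    "databricks_job": "jobs",
    "databricks_pipeline": "pipelines",
    "databricks_volume": "volumes",
}


def migrate(tfstate_path: str, out_path: str) -> None:
    with open(tfstate_path) as f:
        tf = json.load(f)

    state = {}
    for res in tf.get("resources", []):
        group = GROUPS.get(res["type"])
        if group is None or res.get("mode") != "managed":
            continue
        attrs = res["instances"][0]["attributes"]
        # The real command also records resource state; only the ID is shown here.
        state[f"resources.{group}.{res['name']}"] = {"__id__": str(attrs["id"])}

    with open(out_path, "w") as f:
        json.dump({"lineage": tf["lineage"], "serial": tf["serial"] + 1, "state": state}, f, indent=2)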
1 change: 1 addition & 0 deletions acceptance/bundle/help/bundle-deployment-migrate/script
@@ -0,0 +1 @@
trace $CLI bundle deployment migrate --help
1 change: 1 addition & 0 deletions acceptance/bundle/help/bundle-deployment/output.txt
@@ -21,6 +21,7 @@ Usage:

Available Commands:
bind Bind bundle-defined resources to existing resources
migrate Migrate from Terraform to Direct deployment engine
unbind Unbind bundle-defined resources from its managed remote resource

Flags:
47 changes: 47 additions & 0 deletions acceptance/bundle/migrate/basic/databricks.yml
@@ -0,0 +1,47 @@
bundle:
  name: migrate-basic-test

resources:
  jobs:
    test_job:
      name: "Test Migration Job"
      tasks:
        - task_key: "main"
          notebook_task:
            notebook_path: "./notebook.py"
  volumes:
    test_volume:
      catalog_name: "mycat"
      schema_name: "myschema"
      name: "myvol"

  pipelines:
    test_pipeline:
      name: "Test Migration Pipeline"
      tags:
        # ids
        myjob_id: ${resources.jobs.test_job.id}
        myvolume_id: ${resources.volumes.test_volume.id}

        # local field, string:
        myjob_name: ${resources.jobs.test_job.name}
        volume_catalog_name: ${resources.volumes.test_volume.catalog_name}

        # remote field, int, null
        myjob_timeout: ${resources.jobs.test_job.timeout_seconds}

        # remote field, string:
        volume_storage_location: ${resources.volumes.test_volume.storage_location}
      libraries:
        - notebook:
            path: "./pipeline.py"

targets:
  dev:
    default: true
  prod:
    resources:
      schemas:
        test_schema:
          catalog_name: mycat
          name: myschema
2 changes: 2 additions & 0 deletions acceptance/bundle/migrate/basic/notebook.py
@@ -0,0 +1,2 @@
# Databricks notebook source
print("Hello from test migration job")
66 changes: 66 additions & 0 deletions acceptance/bundle/migrate/basic/out.new_state.json
@@ -0,0 +1,66 @@
{
  "lineage": "[UUID]",
  "serial": 5,
  "state": {
    "resources.jobs.test_job": {
      "__id__": "[NUMID]",
      "state": {
        "deployment": {
          "kind": "BUNDLE",
          "metadata_file_path": "/Workspace/Users/[USERNAME]/.bundle/migrate-basic-test/dev/state/metadata.json"
        },
        "edit_mode": "UI_LOCKED",
        "format": "MULTI_TASK",
        "max_concurrent_runs": 1,
        "name": "Test Migration Job",
        "queue": {
          "enabled": true
        },
        "tasks": [
          {
            "notebook_task": {
              "notebook_path": "/Workspace/Users/[USERNAME]/.bundle/migrate-basic-test/dev/files/notebook"
            },
            "task_key": "main"
          }
        ]
      }
    },
    "resources.pipelines.test_pipeline": {
      "__id__": "[UUID]",
      "state": {
        "channel": "CURRENT",
        "deployment": {
          "kind": "BUNDLE",
          "metadata_file_path": "/Workspace/Users/[USERNAME]/.bundle/migrate-basic-test/dev/state/metadata.json"
        },
        "edition": "ADVANCED",
        "libraries": [
          {
            "notebook": {
              "path": "/Workspace/Users/[USERNAME]/.bundle/migrate-basic-test/dev/files/pipeline"
            }
          }
        ],
        "name": "Test Migration Pipeline",
        "tags": {
          "myjob_id": "[NUMID]",
          "myjob_name": "Test Migration Job",
          "myjob_timeout": "",
          "myvolume_id": "mycat.myschema.myvol",
          "volume_catalog_name": "mycat",
          "volume_storage_location": "s3://deco-uc-prod-isolated-aws-us-east-1/metastore/[UUID]/volumes/[UUID]"
        }
      }
    },
    "resources.volumes.test_volume": {
      "__id__": "mycat.myschema.myvol",
      "state": {
        "catalog_name": "mycat",
        "name": "myvol",
        "schema_name": "myschema",
        "volume_type": "MANAGED"
      }
    }
  }
}
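For reference, the fixture above can be inspected with a few lines of Python. This snippet is only a reading aid and not part of the PR; the dev target path follows the layout used by print_state.py.

# Print the lineage, serial, and each resource key with its remote ID from a
# Direct-engine state file.
import json

with open(".databricks/bundle/dev/resources.json") as f:
    state = json.load(f)

print(state["lineage"], state["serial"])
for key, entry in sorted(state["state"].items()):
    print(f"{key}: {entry['__id__']}")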