
Commit e457ff8

WIP migrate command
- update: export DeployPrepare, make it explicit in plan commands
- rewrite migration to do plan + dry run deploy
- warning fix
- lint & fix
- Extend jobs to record full state
- wip migrate command & test
- Reject same serials in direct and terraform state file
- fix warning/error in checkDashboardsModified remotely
- add missing files
- update lint
- update test ident
- update test
- add 'debug states' command
- add bad_env
- Use EngineType; enforce env var engine matching state
- lint fix
- move to config/engine
- update message
- use ToSlash for state files
1 parent 3d656af commit e457ff8


56 files changed: +1463 -228 lines

acceptance/bin/print_state.py

Lines changed: 25 additions & 0 deletions
@@ -0,0 +1,25 @@
+#!/usr/bin/env python3
+"""
+Print resources state from default target.
+
+Note, this intentionally has no logic for guessing which is the right state file (e.g. via DATABRICKS_BUNDLE_ENGINE);
+the goal is to record all states that are available.
+"""
+
+import os
+
+
+def write(filename):
+    data = open(filename).read()
+    print(data, end="")
+    if not data.endswith("\n"):
+        print()
+
+
+filename = ".databricks/bundle/default/terraform/terraform.tfstate"
+if os.path.exists(filename):
+    write(filename)
+
+filename = ".databricks/bundle/default/resources.json"
+if os.path.exists(filename):
+    write(filename)
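
For orientation only, a minimal sketch of how this helper might be invoked from a bundle root during a test run; the invocation below is illustrative and not part of this commit:

import subprocess

# Run the state-printing helper; it emits whichever of the two state files
# exist (terraform.tfstate and/or resources.json), verbatim.
result = subprocess.run(
    ["python3", "acceptance/bin/print_state.py"],
    capture_output=True,
    text=True,
    check=True,
)
print(result.stdout, end="")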

acceptance/bundle/debug/output.txt

Lines changed: 1 addition & 0 deletions
@@ -6,6 +6,7 @@ Usage:
 Available Commands:
   plan        Show deployment plan in JSON format (experimental)
   refschema   Dump all relevant fields all bundle resources
+  states      Show available state files

 Flags:
   -h, --help   help for debug

acceptance/bundle/help/bundle-deployment-migrate/out.test.toml

Lines changed: 5 additions & 0 deletions
Some generated files are not rendered by default.
Lines changed: 24 additions & 0 deletions
@@ -0,0 +1,24 @@
+
+>>> [CLI] bundle deployment migrate --help
+This command converts your bundle from using Terraform for deployment to using
+the Direct deployment engine. It reads resource IDs from the existing Terraform
+state and creates a Direct deployment state file (resources.json) with the same
+lineage and incremented serial number.
+
+Note, the migration is performed locally only. To finalize it, run 'bundle deploy'. This will synchronize the state file
+to the workspace so that subsequent deploys of this bundle use direct deployment engine as well.
+
+WARNING: Both direct deployment engine and this command are experimental and not recommended for production targets yet.
+
+Usage:
+  databricks bundle deployment migrate [flags]
+
+Flags:
+  -h, --help   help for migrate
+
+Global Flags:
+      --debug            enable debug logging
+  -o, --output type      output type: text or json (default text)
+  -p, --profile string   ~/.databrickscfg profile
+  -t, --target string    bundle target to use (if applicable)
+      --var strings      set values for variables defined in bundle config. Example: --var="foo=bar"
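
To make the steps described in the help text concrete, here is a minimal Python sketch of the conversion (the actual command is implemented in the CLI itself): copy the lineage from the existing Terraform state, increment the serial, and carry resource IDs over into resources.json. The field layout of terraform.tfstate and the mapping to bundle keys such as "resources.jobs.test_job" are assumptions for illustration.

import json

def migrate_terraform_state(tf_path: str, out_path: str) -> None:
    # Load the Terraform state written by previous deploys.
    with open(tf_path) as f:
        tf = json.load(f)

    state = {}
    for res in tf.get("resources", []):
        for inst in res.get("instances", []):
            # Hypothetical key derivation; the real command maps Terraform
            # addresses back to bundle keys like "resources.jobs.test_job".
            key = f"{res['type']}.{res['name']}"
            state[key] = {"__id__": inst.get("attributes", {}).get("id")}

    direct = {
        "lineage": tf["lineage"],    # same lineage as the Terraform state
        "serial": tf["serial"] + 1,  # incremented serial number
        "state": state,
    }
    with open(out_path, "w") as f:
        json.dump(direct, f, indent=2)

Per the help text, such a conversion is local only; a subsequent 'bundle deploy' synchronizes the new state file to the workspace.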
Lines changed: 1 addition & 0 deletions
@@ -0,0 +1 @@
+trace $CLI bundle deployment migrate --help

acceptance/bundle/help/bundle-deployment/output.txt

Lines changed: 1 addition & 0 deletions
@@ -21,6 +21,7 @@ Usage:
 
 Available Commands:
   bind        Bind bundle-defined resources to existing resources
+  migrate     Migrate from Terraform to Direct deployment engine
   unbind      Unbind bundle-defined resources from its managed remote resource
 
 Flags:
Lines changed: 45 additions & 0 deletions
@@ -0,0 +1,45 @@
+bundle:
+  name: migrate-basic-test
+
+resources:
+  jobs:
+    test_job:
+      name: "Test Migration Job"
+      tasks:
+        - task_key: "main"
+          notebook_task:
+            notebook_path: "./notebook.py"
+      # permissions don't work yet
+      #permissions:
+      #  - level: CAN_VIEW
+      #    user_name: [email protected]
+  volumes:
+    test_volume:
+      catalog_name: "mycat"
+      schema_name: "myschema"
+      name: "myvol"
+
+  pipelines:
+    test_pipeline:
+      name: "Test Migration Pipeline"
+      tags:
+        # ids
+        myjob_id: ${resources.jobs.test_job.id}
+        myvolume_id: ${resources.volumes.test_volume.id}
+
+        # local field, string:
+        myjob_name: ${resources.jobs.test_job.name}
+        volume_catalog_name: ${resources.volumes.test_volume.catalog_name}
+
+        # Remote fields cause permanent drift (unrelated to migration)
+        # remote field, int, null
+        myjob_timeout: ${resources.jobs.test_job.timeout_seconds}
+
+        # remote field, string:
+        volume_storage_location: ${resources.volumes.test_volume.storage_location}
+      libraries:
+        - notebook:
+            path: "./pipeline.py"
+      #permissions:
+      #  - level: CAN_MANAGE
+      #    user_name: [email protected]
Lines changed: 2 additions & 0 deletions
@@ -0,0 +1,2 @@
+# Databricks notebook source
+print("Hello from test migration job")
Lines changed: 66 additions & 0 deletions
@@ -0,0 +1,66 @@
+{
+  "lineage": "[UUID]",
+  "serial": 5,
+  "state": {
+    "resources.jobs.test_job": {
+      "__id__": "[NUMID]",
+      "state": {
+        "deployment": {
+          "kind": "BUNDLE",
+          "metadata_file_path": "/Workspace/Users/[USERNAME]/.bundle/migrate-basic-test/default/state/metadata.json"
+        },
+        "edit_mode": "UI_LOCKED",
+        "format": "MULTI_TASK",
+        "max_concurrent_runs": 1,
+        "name": "Test Migration Job",
+        "queue": {
+          "enabled": true
+        },
+        "tasks": [
+          {
+            "notebook_task": {
+              "notebook_path": "/Workspace/Users/[USERNAME]/.bundle/migrate-basic-test/default/files/notebook"
+            },
+            "task_key": "main"
+          }
+        ]
+      }
+    },
+    "resources.pipelines.test_pipeline": {
+      "__id__": "[UUID]",
+      "state": {
+        "channel": "CURRENT",
+        "deployment": {
+          "kind": "BUNDLE",
+          "metadata_file_path": "/Workspace/Users/[USERNAME]/.bundle/migrate-basic-test/default/state/metadata.json"
+        },
+        "edition": "ADVANCED",
+        "libraries": [
+          {
+            "notebook": {
+              "path": "/Workspace/Users/[USERNAME]/.bundle/migrate-basic-test/default/files/pipeline"
+            }
+          }
+        ],
+        "name": "Test Migration Pipeline",
+        "tags": {
+          "myjob_id": "[NUMID]",
+          "myjob_name": "Test Migration Job",
+          "myjob_timeout": "",
+          "myvolume_id": "mycat.myschema.myvol",
+          "volume_catalog_name": "mycat",
+          "volume_storage_location": "s3://deco-uc-prod-isolated-aws-us-east-1/metastore/[UUID]/volumes/[UUID]"
+        }
+      }
+    },
+    "resources.volumes.test_volume": {
+      "__id__": "mycat.myschema.myvol",
+      "state": {
+        "catalog_name": "mycat",
+        "name": "myvol",
+        "schema_name": "myschema",
+        "volume_type": "MANAGED"
+      }
+    }
+  }
+}
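
For reference, a small sketch that reads the resources.json layout shown above: top-level lineage and serial, plus a state map keyed by resource path where each entry carries the remote __id__. The path matches the one used by print_state.py earlier in this commit; the snippet is illustrative only.

import json

# Load the direct-engine state file for the default target.
with open(".databricks/bundle/default/resources.json") as f:
    state = json.load(f)

# Print the lineage/serial header and one line per tracked resource.
print(f"lineage={state['lineage']} serial={state['serial']}")
for key, entry in state["state"].items():
    print(f"{key}: __id__={entry['__id__']}")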
