
Commit a5d0248

Correctly handle zero and null values when converting typed/dynamic (#3230)
## Changes

When converting between structs and dyn.Value, respect omitempty/omitzero and ForceSendFields. The conversion now matches what a JSON encode/decode round trip would produce.

## Why

There is a lot of conversion between structs and dyn.Value, and information is lost each time, so the typed structs in bundle.Config are not completely accurate. The consequences of this:

- In direct deployment we cannot read resource definitions from bundle.Config; we need to go through JSON to get correct typed structs.
- Some typed code does the wrong thing; there are known cases and probably some unknown ones. Example of a known case: [acceptance/bundle/resources/jobs/instance_pool_and_node_type/test.toml](https://github.com/databricks/cli/pull/3230/files#diff-f988a9ad8b0ddd1050e215cc4dbe15f834702033421f7463003075d967e2b256)
1 parent 9440703 commit a5d0248
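
To illustrate the rule described in the commit message, here is a minimal, self-contained Go sketch. It is not the CLI's actual converter; `ClusterSpec` and `toDynamic` are hypothetical stand-ins. It shows how a zero-valued field tagged `omitempty` should be dropped during the struct-to-dynamic conversion unless it is listed in `ForceSendFields`, matching what a JSON encode would produce:

```go
// A minimal sketch (not the CLI's actual converter) of the rule this commit
// enforces: a zero-valued field tagged `omitempty` is dropped when converting
// a struct to a dynamic value unless it is listed in ForceSendFields,
// mirroring JSON encode/decode behavior.
package main

import (
	"encoding/json"
	"fmt"
)

// ClusterSpec is a hypothetical stand-in for an SDK struct.
type ClusterSpec struct {
	InstancePoolID string `json:"instance_pool_id,omitempty"`
	NodeTypeID     string `json:"node_type_id,omitempty"`
	NumWorkers     int    `json:"num_workers,omitempty"`

	// ForceSendFields lists zero-valued fields that must still be serialized.
	ForceSendFields []string `json:"-"`
}

// toDynamic converts a ClusterSpec to a generic map the way the commit says
// the typed/dynamic conversion should behave: drop zero values with omitempty,
// unless ForceSendFields overrides it.
func toDynamic(c ClusterSpec) map[string]any {
	forced := map[string]bool{}
	for _, f := range c.ForceSendFields {
		forced[f] = true
	}
	out := map[string]any{}
	if c.InstancePoolID != "" || forced["InstancePoolID"] {
		out["instance_pool_id"] = c.InstancePoolID
	}
	if c.NodeTypeID != "" || forced["NodeTypeID"] {
		out["node_type_id"] = c.NodeTypeID
	}
	if c.NumWorkers != 0 || forced["NumWorkers"] {
		out["num_workers"] = c.NumWorkers
	}
	return out
}

func main() {
	spec := ClusterSpec{InstancePoolID: "pool-123", NumWorkers: 1}

	// Before this change, a naive conversion would emit "node_type_id": ""
	// even though JSON encoding drops it.
	b, _ := json.Marshal(spec)
	fmt.Println(string(b)) // {"instance_pool_id":"pool-123","num_workers":1}

	fmt.Println(toDynamic(spec)) // map[instance_pool_id:pool-123 num_workers:1]
}
```

In this sketch, setting `ForceSendFields: []string{"NodeTypeID"}` would keep `"node_type_id": ""` in both the JSON output and the converted map; without it, the empty value is dropped, which is why the expected outputs in the diffs below no longer contain `"node_type_id": ""`.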

File tree

27 files changed: +224 −85 lines

acceptance/bundle/integration_whl/base/test.toml

Lines changed: 0 additions & 1 deletion
This file was deleted.

acceptance/bundle/quality_monitor/output.txt

Lines changed: 0 additions & 9 deletions
@@ -1,13 +1,5 @@
 
 >>> [CLI] bundle validate -o json -t development
-Warning: required field "quartz_cron_expression" is not set
-  at resources.quality_monitors.my_monitor.schedule
-  in databricks.yml:17:9
-
-Warning: required field "timezone_id" is not set
-  at resources.quality_monitors.my_monitor.schedule
-  in databricks.yml:17:9
-
 {
   "mode": "development",
   "quality_monitors": {
@@ -23,7 +15,6 @@ Warning: required field "timezone_id" is not set
         "timestamp_col": "timestamp"
       },
       "output_schema_name": "main.dev",
-      "schedule": null,
       "table_name": "main.test.dev"
     }
   }

acceptance/bundle/resources/jobs/instance_pool_and_node_type/out.test.toml

Lines changed: 1 addition & 1 deletion
@@ -2,4 +2,4 @@ Local = true
 Cloud = false
 
 [EnvMatrix]
-DATABRICKS_CLI_DEPLOYMENT = ["terraform"]
+DATABRICKS_CLI_DEPLOYMENT = ["terraform", "direct-exp"]

acceptance/bundle/resources/jobs/instance_pool_and_node_type/output.txt

Lines changed: 0 additions & 2 deletions
@@ -5,7 +5,6 @@
       "new_cluster": {
         "data_security_mode": "USER_ISOLATION",
         "instance_pool_id": "$TEST_INSTANCE_POOL_ID",
-        "node_type_id": "",
         "num_workers": 1,
         "spark_version": "$DEFAULT_SPARK_VERSION"
       },
@@ -27,7 +26,6 @@
       "new_cluster": {
         "data_security_mode": "USER_ISOLATION",
         "instance_pool_id": "$TEST_INSTANCE_POOL_ID",
-        "node_type_id": "",
         "num_workers": 1,
         "spark_version": "$DEFAULT_SPARK_VERSION"
       },
acceptance/bundle/resources/jobs/instance_pool_and_node_type/test.toml

Lines changed: 0 additions & 17 deletions
@@ -1,18 +1 @@
 RecordRequests = true
-
-# Fails on direct with
-# --- FAIL: TestAccept/bundle/resources/jobs/instance_pool_and_node_type (0.00s)
-# --- FAIL: TestAccept/bundle/resources/jobs/instance_pool_and_node_type/DATABRICKS_CLI_DEPLOYMENT=direct-exp (1.60s)
-# acceptance_test.go:1178: Writing updated bundle config to databricks.yml. BundleConfig sections: default_name
-# acceptance_test.go:722: Diff:
-# --- bundle/resources/jobs/instance_pool_and_node_type/output.txt
-# +++ /var/folders/5y/9kkdnjw91p11vsqwk0cvmk200000gp/T/TestAcceptbundleresourcesjobsinstance_pool_and_node_typeDATABRICKS_CLI_DEPLOYMENT=direct-exp3221363519/001/output.txt
-# @@ -55,6 +55,7 @@
-# "new_cluster": {
-# "data_security_mode": "USER_ISOLATION",
-# "instance_pool_id": "$TEST_INSTANCE_POOL_ID",
-# + "node_type_id": "",
-# "num_workers": 1,
-# "spark_version": "$DEFAULT_SPARK_VERSION"
-# },
-EnvMatrix.DATABRICKS_CLI_DEPLOYMENT = ["terraform"]
Lines changed: 2 additions & 6 deletions
@@ -1,6 +1,2 @@
-Error: run_as section must specify exactly one identity. Neither service_principal_name nor user_name is specified
-  in override.yml:4:12
-
-
-Exit code (musterr): 1
-"run_as": null,
+"run_as": {
+"run_as": {
Lines changed: 1 addition & 1 deletion
@@ -1 +1 @@
-musterr $CLI bundle validate -o json | grep run_as
+$CLI bundle validate -o json | grep run_as
Lines changed: 6 additions & 5 deletions
@@ -1,6 +1,7 @@
-Error: run_as section must specify exactly one identity. Neither service_principal_name nor user_name is specified
-  in databricks.yml:4:8
+Name: abc
+Target: default
+Workspace:
+  User: [USERNAME]
+  Path: /Workspace/Users/[USERNAME]/.bundle/abc/default
 
-
-Exit code (musterr): 1
-"run_as": null,
+Validation OK!
Lines changed: 1 addition & 1 deletion
@@ -1 +1 @@
-musterr $CLI bundle validate -o json | grep run_as
+$CLI bundle validate
Lines changed: 1 addition & 0 deletions
@@ -0,0 +1 @@
+Ignore = [".databricks/.gitignore"]
