
Commit 59f0859

Upgrade Go SDK to 0.54.0 (#2029)
## Changes

* Added the [a.AccountFederationPolicy](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/oauth2#AccountFederationPolicyAPI) and [a.ServicePrincipalFederationPolicy](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/oauth2#ServicePrincipalFederationPolicyAPI) account-level services.
* Added `IsSingleNode`, `Kind`, and `UseMlRuntime` fields for Cluster commands.
* Added `UpdateParameterSyntax` field for [dashboards.MigrateDashboardRequest](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/dashboards#MigrateDashboardRequest).
1 parent 042c8d8 commit 59f0859
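
As a sketch of how the new cluster fields could look in a bundle configuration (a hypothetical `databricks.yml` fragment; resource name and values are illustrative, not from this commit):

```yaml
resources:
  clusters:
    my_cluster:                        # hypothetical resource name
      spark_version: "15.4.x-scala2.12"
      node_type_id: "i3.xlarge"
      kind: CLASSIC_PREVIEW            # is_single_node and use_ml_runtime can only be used with `kind`
      is_single_node: true             # Databricks auto-sets single-node custom_tags, spark_conf, num_workers
      use_ml_runtime: false            # feeds into effective_spark_version alongside spark_version and node type
      data_security_mode: DATA_SECURITY_MODE_AUTO  # one of the new kind-only modes added below
```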

File tree

12 files changed: +1009 −11 lines


.codegen/_openapi_sha

Lines changed: 1 addition & 1 deletion
```diff
@@ -1 +1 @@
-7016dcbf2e011459416cf408ce21143bcc4b3a25
+a6a317df8327c9b1e5cb59a03a42ffa2aabeef6d
```

.gitattributes

Lines changed: 2 additions & 0 deletions
```diff
@@ -8,6 +8,7 @@ cmd/account/custom-app-integration/custom-app-integration.go linguist-generated=
 cmd/account/disable-legacy-features/disable-legacy-features.go linguist-generated=true
 cmd/account/encryption-keys/encryption-keys.go linguist-generated=true
 cmd/account/esm-enablement-account/esm-enablement-account.go linguist-generated=true
+cmd/account/federation-policy/federation-policy.go linguist-generated=true
 cmd/account/groups/groups.go linguist-generated=true
 cmd/account/ip-access-lists/ip-access-lists.go linguist-generated=true
 cmd/account/log-delivery/log-delivery.go linguist-generated=true
@@ -19,6 +20,7 @@ cmd/account/o-auth-published-apps/o-auth-published-apps.go linguist-generated=tr
 cmd/account/personal-compute/personal-compute.go linguist-generated=true
 cmd/account/private-access/private-access.go linguist-generated=true
 cmd/account/published-app-integration/published-app-integration.go linguist-generated=true
+cmd/account/service-principal-federation-policy/service-principal-federation-policy.go linguist-generated=true
 cmd/account/service-principal-secrets/service-principal-secrets.go linguist-generated=true
 cmd/account/service-principals/service-principals.go linguist-generated=true
 cmd/account/settings/settings.go linguist-generated=true
```

bundle/internal/schema/annotations_openapi.yml

Lines changed: 50 additions & 4 deletions
```diff
@@ -70,6 +70,12 @@ github.com/databricks/cli/bundle/config/resources.Cluster:
       If `cluster_log_conf` is specified, init script logs are sent to `<destination>/<cluster-ID>/init_scripts`.
   instance_pool_id:
     description: The optional ID of the instance pool to which the cluster belongs.
+  is_single_node:
+    description: |
+      This field can only be used with `kind`.
+
+      When set to true, Databricks will automatically set single node related `custom_tags`, `spark_conf`, and `num_workers`
+  kind: {}
   node_type_id:
     description: |
       This field encodes, through a single value, the resources available to each of
@@ -119,6 +125,11 @@ github.com/databricks/cli/bundle/config/resources.Cluster:
       SSH public key contents that will be added to each Spark node in this cluster. The
       corresponding private keys can be used to login with the user name `ubuntu` on port `2200`.
       Up to 10 keys can be specified.
+  use_ml_runtime:
+    description: |
+      This field can only be used with `kind`.
+
+      `effective_spark_version` is determined by `spark_version` (DBR release), this field `use_ml_runtime`, and whether `node_type_id` is gpu node or not.
   workload_type: {}
 github.com/databricks/cli/bundle/config/resources.Dashboard:
   create_time:
@@ -759,6 +770,12 @@ github.com/databricks/databricks-sdk-go/service/compute.ClusterSpec:
       If `cluster_log_conf` is specified, init script logs are sent to `<destination>/<cluster-ID>/init_scripts`.
   instance_pool_id:
     description: The optional ID of the instance pool to which the cluster belongs.
+  is_single_node:
+    description: |
+      This field can only be used with `kind`.
+
+      When set to true, Databricks will automatically set single node related `custom_tags`, `spark_conf`, and `num_workers`
+  kind: {}
   node_type_id:
     description: |
       This field encodes, through a single value, the resources available to each of
@@ -808,13 +825,24 @@ github.com/databricks/databricks-sdk-go/service/compute.ClusterSpec:
       SSH public key contents that will be added to each Spark node in this cluster. The
       corresponding private keys can be used to login with the user name `ubuntu` on port `2200`.
       Up to 10 keys can be specified.
+  use_ml_runtime:
+    description: |
+      This field can only be used with `kind`.
+
+      `effective_spark_version` is determined by `spark_version` (DBR release), this field `use_ml_runtime`, and whether `node_type_id` is gpu node or not.
   workload_type: {}
 github.com/databricks/databricks-sdk-go/service/compute.DataSecurityMode:
   _:
     description: |
       Data security mode decides what data governance model to use when accessing data
       from a cluster.
 
+      The following modes can only be used with `kind`.
+      * `DATA_SECURITY_MODE_AUTO`: Databricks will choose the most appropriate access mode depending on your compute configuration.
+      * `DATA_SECURITY_MODE_STANDARD`: Alias for `USER_ISOLATION`.
+      * `DATA_SECURITY_MODE_DEDICATED`: Alias for `SINGLE_USER`.
+
+      The following modes can be used regardless of `kind`.
       * `NONE`: No security isolation for multiple users sharing the cluster. Data governance features are not available in this mode.
       * `SINGLE_USER`: A secure cluster that can only be exclusively used by a single user specified in `single_user_name`. Most programming languages, cluster features and data governance features are available in this mode.
       * `USER_ISOLATION`: A secure cluster that can be shared by multiple users. Cluster users are fully isolated so that they cannot see each other's data and credentials. Most data governance features are supported in this mode. But programming languages and cluster features might be limited.
@@ -827,6 +855,9 @@ github.com/databricks/databricks-sdk-go/service/compute.DataSecurityMode:
       * `LEGACY_SINGLE_USER`: This mode is for users migrating from legacy Passthrough on standard clusters.
       * `LEGACY_SINGLE_USER_STANDARD`: This mode provides a way that doesn’t have UC nor passthrough enabled.
     enum:
+      - DATA_SECURITY_MODE_AUTO
+      - DATA_SECURITY_MODE_STANDARD
+      - DATA_SECURITY_MODE_DEDICATED
       - NONE
       - SINGLE_USER
       - USER_ISOLATION
@@ -1068,6 +1099,17 @@ github.com/databricks/databricks-sdk-go/service/dashboards.LifecycleState:
     enum:
       - ACTIVE
       - TRASHED
+github.com/databricks/databricks-sdk-go/service/jobs.CleanRoomsNotebookTask:
+  clean_room_name:
+    description: The clean room that the notebook belongs to.
+  etag:
+    description: |-
+      Checksum to validate the freshness of the notebook resource (i.e. the notebook being run is the latest version).
+      It can be fetched by calling the :method:cleanroomassets/get API.
+  notebook_base_parameters:
+    description: Base parameters to be used for the clean room notebook job.
+  notebook_name:
+    description: Name of the notebook being run.
 github.com/databricks/databricks-sdk-go/service/jobs.Condition:
   _:
     enum:
@@ -1346,10 +1388,10 @@ github.com/databricks/databricks-sdk-go/service/jobs.JobsHealthMetric:
       Specifies the health metric that is being evaluated for a particular health rule.
 
       * `RUN_DURATION_SECONDS`: Expected total time for a run in seconds.
-      * `STREAMING_BACKLOG_BYTES`: An estimate of the maximum bytes of data waiting to be consumed across all streams. This metric is in Private Preview.
-      * `STREAMING_BACKLOG_RECORDS`: An estimate of the maximum offset lag across all streams. This metric is in Private Preview.
-      * `STREAMING_BACKLOG_SECONDS`: An estimate of the maximum consumer delay across all streams. This metric is in Private Preview.
-      * `STREAMING_BACKLOG_FILES`: An estimate of the maximum number of outstanding files across all streams. This metric is in Private Preview.
+      * `STREAMING_BACKLOG_BYTES`: An estimate of the maximum bytes of data waiting to be consumed across all streams. This metric is in Public Preview.
+      * `STREAMING_BACKLOG_RECORDS`: An estimate of the maximum offset lag across all streams. This metric is in Public Preview.
+      * `STREAMING_BACKLOG_SECONDS`: An estimate of the maximum consumer delay across all streams. This metric is in Public Preview.
+      * `STREAMING_BACKLOG_FILES`: An estimate of the maximum number of outstanding files across all streams. This metric is in Public Preview.
     enum:
       - RUN_DURATION_SECONDS
       - STREAMING_BACKLOG_BYTES
@@ -1651,6 +1693,10 @@ github.com/databricks/databricks-sdk-go/service/jobs.TableUpdateTriggerConfigura
       and can be used to wait for a series of table updates before triggering a run. The
       minimum allowed value is 60 seconds.
 github.com/databricks/databricks-sdk-go/service/jobs.Task:
+  clean_rooms_notebook_task:
+    description: |-
+      The task runs a [clean rooms](https://docs.databricks.com/en/clean-rooms/index.html) notebook
+      when the `clean_rooms_notebook_task` field is present.
   condition_task:
     description: |-
       The task evaluates a condition that can be used to control the execution of other tasks when the `condition_task` field is present.
```
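
The new `clean_rooms_notebook_task` annotated above could appear in a bundle job definition roughly like this (a hypothetical fragment; job, clean room, and notebook names are illustrative):

```yaml
resources:
  jobs:
    clean_room_analysis:                        # hypothetical job resource name
      name: clean-room-analysis
      tasks:
        - task_key: run_notebook
          clean_rooms_notebook_task:
            clean_room_name: my_clean_room      # clean room the notebook belongs to
            notebook_name: shared_analysis      # name of the notebook being run
            etag: "0123456789abcdef"            # freshness checksum from the cleanroomassets/get API
            notebook_base_parameters:           # base parameters for the clean room notebook job
              run_date: "2024-12-01"
```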

bundle/internal/schema/annotations_openapi_overrides.yml

Lines changed: 6 additions & 0 deletions
```diff
@@ -5,6 +5,9 @@ github.com/databricks/cli/bundle/config/resources.Cluster:
   "docker_image":
     "description": |-
       PLACEHOLDER
+  "kind":
+    "description": |-
+      PLACEHOLDER
   "permissions":
     "description": |-
       PLACEHOLDER
@@ -90,6 +93,9 @@ github.com/databricks/databricks-sdk-go/service/compute.ClusterSpec:
   "docker_image":
     "description": |-
       PLACEHOLDER
+  "kind":
+    "description": |-
+      PLACEHOLDER
   "runtime_engine":
     "description": |-
       PLACEHOLDER
```
