
Commit 76440bf

Upgrade Go SDK to v0.92.0

1 parent: d277cc4


42 files changed: +1232 −176 lines

.codegen/_openapi_sha

Lines changed: 1 addition & 1 deletion
@@ -1 +1 @@
-59c4c0f3d5f0ef00cd5350b5674e941a7606d91a
+8f5eedbc991c4f04ce1284406577b0c92d59a224

bundle/internal/schema/annotations_openapi.yml

Lines changed: 63 additions & 21 deletions
@@ -450,8 +450,7 @@ github.com/databricks/cli/bundle/config/resources.Job:
   "environments":
     "description": |-
       A list of task execution environment specifications that can be referenced by serverless tasks of this job.
-      An environment is required to be present for serverless tasks.
-      For serverless notebook tasks, the environment is accessible in the notebook environment panel.
+      For serverless notebook tasks, if the environment_key is not specified, the notebook environment will be used if present. If a jobs environment is specified, it will override the notebook environment.
       For other serverless tasks, the task environment is required to be specified using environment_key in the task settings.
   "format":
     "description": |-
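The reworded `environments` description above can be illustrated with a bundle-configuration sketch. This is not from the commit; the job name, environment key, dependency, and Python file path are all hypothetical:

```yaml
# Hypothetical Databricks Asset Bundle fragment: a job-level serverless
# environment referenced from a non-notebook task via environment_key.
resources:
  jobs:
    example_job:
      name: example-serverless-job
      environments:
        - environment_key: default
          spec:
            client: "1"
            dependencies:
              - requests
      tasks:
        # Per the description above, non-notebook serverless tasks must
        # reference an environment explicitly through environment_key.
        - task_key: run_script
          environment_key: default
          spark_python_task:
            python_file: ./src/main.py
```

A serverless notebook task could omit `environment_key` and fall back to the notebook environment, per the new wording.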
@@ -1467,7 +1466,7 @@ github.com/databricks/databricks-sdk-go/service/compute.AwsAttributes:
       This string will be of a form like "us-west-2a". The provided availability
       zone must be in the same region as the Databricks deployment. For example, "us-west-2a"
       is not a valid zone id if the Databricks deployment resides in the "us-east-1" region.
-      This is an optional field at cluster creation, and if not specified, a default zone will be used.
+      This is an optional field at cluster creation, and if not specified, the zone "auto" will be used.
       If the zone specified is "auto", will try to place cluster in a zone with high availability,
       and will retry placement in a different AZ if there is not enough capacity.
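The documented default behavior ("auto") can be written out explicitly in a cluster spec. An illustrative fragment, not taken from the commit (node type and Spark version are placeholders):

```yaml
# Illustrative cluster fragment: omitting zone_id is now documented as
# equivalent to "auto", which places the cluster in a zone with high
# availability and retries another AZ on insufficient capacity.
new_cluster:
  spark_version: 15.4.x-scala2.12
  node_type_id: i3.xlarge
  num_workers: 2
  aws_attributes:
    zone_id: auto   # explicit form of the documented default
```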
@@ -1841,8 +1840,6 @@ github.com/databricks/databricks-sdk-go/service/compute.Environment:
   "java_dependencies":
     "description": |-
       List of java dependencies. Each dependency is a string representing a java library path. For example: `/Volumes/path/to/test.jar`.
-    "x-databricks-preview": |-
-      PRIVATE
 github.com/databricks/databricks-sdk-go/service/compute.GcpAttributes:
   "_":
     "description": |-
@@ -2173,6 +2170,9 @@ github.com/databricks/databricks-sdk-go/service/database.NewPipelineSpec:
     "description": |-
       Custom fields that user can set for pipeline while creating SyncedDatabaseTable.
       Note that other fields of pipeline are still inferred by table def internally
+  "budget_policy_id":
+    "description": |-
+      Budget policy to set on the newly created pipeline.
   "storage_catalog":
     "description": |-
       This field needs to be specified if the destination catalog is a managed postgres catalog.
@@ -2899,6 +2899,35 @@ github.com/databricks/databricks-sdk-go/service/jobs.JobsHealthRules:
     "description": |-
       An optional set of health rules that can be defined for this job.
   "rules": {}
+github.com/databricks/databricks-sdk-go/service/jobs.ModelTriggerConfiguration:
+  "aliases":
+    "description": |-
+      Aliases of the model versions to monitor. Can only be used in conjunction with condition MODEL_ALIAS_SET.
+  "condition":
+    "description": |-
+      The condition based on which to trigger a job run.
+  "min_time_between_triggers_seconds":
+    "description": |-
+      If set, the trigger starts a run only after the specified amount of time has passed since
+      the last time the trigger fired. The minimum allowed value is 60 seconds.
+  "securable_name":
+    "description": |-
+      Name of the securable to monitor ("mycatalog.myschema.mymodel" in the case of model-level triggers,
+      "mycatalog.myschema" in the case of schema-level triggers) or empty in the case of metastore-level triggers.
+  "wait_after_last_change_seconds":
+    "description": |-
+      If set, the trigger starts a run only after no model updates have occurred for the specified time
+      and can be used to wait for a series of model updates before triggering a run. The
+      minimum allowed value is 60 seconds.
+github.com/databricks/databricks-sdk-go/service/jobs.ModelTriggerConfigurationCondition:
+  "_":
+    "enum":
+    - |-
+      MODEL_CREATED
+    - |-
+      MODEL_VERSION_READY
+    - |-
+      MODEL_ALIAS_SET
 github.com/databricks/databricks-sdk-go/service/jobs.NotebookTask:
   "base_parameters":
     "description": |-
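Putting the new `ModelTriggerConfiguration` fields together, a job trigger could look like the sketch below. Note the feature is marked `x-databricks-preview: PRIVATE` elsewhere in this commit, so the exact placement under `trigger.model` is an assumption, and all names are hypothetical:

```yaml
# Hypothetical sketch of a model trigger (PRIVATE preview; shape assumed).
trigger:
  pause_status: UNPAUSED
  model:
    condition: MODEL_ALIAS_SET          # one of the enum values added above
    securable_name: mycatalog.myschema.mymodel
    aliases: ["champion"]               # only valid with MODEL_ALIAS_SET
    min_time_between_triggers_seconds: 3600   # minimum allowed is 60
    wait_after_last_change_seconds: 120       # debounce bursts of updates
```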
@@ -3516,6 +3545,9 @@ github.com/databricks/databricks-sdk-go/service/jobs.TriggerSettings:
   "file_arrival":
     "description": |-
       File arrival trigger settings.
+  "model":
+    "x-databricks-preview": |-
+      PRIVATE
   "pause_status":
     "description": |-
       Whether this trigger is paused or not.
@@ -3564,6 +3596,15 @@ github.com/databricks/databricks-sdk-go/service/ml.ModelTag:
   "value":
     "description": |-
       The tag value.
+github.com/databricks/databricks-sdk-go/service/pipelines.ConnectionParameters:
+  "source_catalog":
+    "description": |-
+      Source catalog for initial connection.
+      This is necessary for schema exploration in some database systems like Oracle, and optional but nice-to-have
+      in some other database systems like Postgres.
+      For Oracle databases, this maps to a service name.
+    "x-databricks-preview": |-
+      PRIVATE
 github.com/databricks/databricks-sdk-go/service/pipelines.CronTrigger:
   "quartz_cron_schedule": {}
   "timezone_id": {}
@@ -3638,21 +3679,34 @@ github.com/databricks/databricks-sdk-go/service/pipelines.IngestionGatewayPipelineDefinition:
   "connection_name":
     "description": |-
       Immutable. The Unity Catalog connection that this gateway pipeline uses to communicate with the source.
+  "connection_parameters":
+    "description": |-
+      Optional, Internal. Parameters required to establish an initial connection with the source.
+    "x-databricks-preview": |-
+      PRIVATE
   "gateway_storage_catalog":
     "description": |-
       Required, Immutable. The name of the catalog for the gateway pipeline's storage location.
   "gateway_storage_name":
     "description": |-
       Optional. The Unity Catalog-compatible name for the gateway storage location.
       This is the destination to use for the data that is extracted by the gateway.
-      Delta Live Tables system will automatically create the storage location under the catalog and schema.
+      Spark Declarative Pipelines system will automatically create the storage location under the catalog and schema.
   "gateway_storage_schema":
     "description": |-
       Required, Immutable. The name of the schema for the gateway pipelines's storage location.
 github.com/databricks/databricks-sdk-go/service/pipelines.IngestionPipelineDefinition:
   "connection_name":
     "description": |-
       Immutable. The Unity Catalog connection that this ingestion pipeline uses to communicate with the source. This is used with connectors for applications like Salesforce, Workday, and so on.
+  "ingest_from_uc_foreign_catalog":
+    "description": |-
+      Immutable. If set to true, the pipeline will ingest tables from the
+      UC foreign catalogs directly without the need to specify a UC connection or ingestion gateway.
+      The `source_catalog` fields in objects of IngestionConfig are interpreted as
+      the UC foreign catalogs to ingest from.
+    "x-databricks-preview": |-
+      PRIVATE
   "ingestion_gateway_id":
     "description": |-
       Immutable. Identifier for the gateway that is used by this ingestion pipeline to communicate with the source database. This is used with connectors to databases like SQL Server.
@@ -3669,8 +3723,6 @@ github.com/databricks/databricks-sdk-go/service/pipelines.IngestionPipelineDefinition:
   "source_configurations":
     "description": |-
       Top-level source configurations
-    "x-databricks-preview": |-
-      PRIVATE
   "source_type":
     "description": |-
       The type of the foreign source.
@@ -3783,6 +3835,8 @@ github.com/databricks/databricks-sdk-go/service/pipelines.IngestionSourceType:
     SHAREPOINT
   - |-
     DYNAMICS365
+  - |-
+    FOREIGN_CATALOG
 github.com/databricks/databricks-sdk-go/service/pipelines.ManualTrigger: {}
 github.com/databricks/databricks-sdk-go/service/pipelines.NotebookLibrary:
   "path":
@@ -3985,22 +4039,16 @@ github.com/databricks/databricks-sdk-go/service/pipelines.PostgresCatalogConfig:
   "slot_config":
     "description": |-
       Optional. The Postgres slot configuration to use for logical replication
-    "x-databricks-preview": |-
-      PRIVATE
 github.com/databricks/databricks-sdk-go/service/pipelines.PostgresSlotConfig:
   "_":
     "description": |-
       PostgresSlotConfig contains the configuration for a Postgres logical replication slot
   "publication_name":
     "description": |-
       The name of the publication to use for the Postgres source
-    "x-databricks-preview": |-
-      PRIVATE
   "slot_name":
     "description": |-
       The name of the logical replication slot to use for the Postgres source
-    "x-databricks-preview": |-
-      PRIVATE
 github.com/databricks/databricks-sdk-go/service/pipelines.ReportSpec:
   "destination_catalog":
     "description": |-
@@ -4065,19 +4113,13 @@ github.com/databricks/databricks-sdk-go/service/pipelines.SourceCatalogConfig:
   "postgres":
     "description": |-
       Postgres-specific catalog-level configuration parameters
-    "x-databricks-preview": |-
-      PRIVATE
   "source_catalog":
     "description": |-
       Source catalog name
-    "x-databricks-preview": |-
-      PRIVATE
 github.com/databricks/databricks-sdk-go/service/pipelines.SourceConfig:
   "catalog":
     "description": |-
       Catalog-level source configuration parameters
-    "x-databricks-preview": |-
-      PRIVATE
 github.com/databricks/databricks-sdk-go/service/pipelines.TableSpec:
   "destination_catalog":
     "description": |-
@@ -4134,7 +4176,7 @@ github.com/databricks/databricks-sdk-go/service/pipelines.TableSpecificConfig:
       PRIVATE
   "sequence_by":
     "description": |-
-      The column names specifying the logical order of events in the source data. Delta Live Tables uses this sequencing to handle change events that arrive out of order.
+      The column names specifying the logical order of events in the source data. Spark Declarative Pipelines uses this sequencing to handle change events that arrive out of order.
   "workday_report_parameters":
     "description": |-
       (Optional) Additional custom parameters for Workday Report
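For context on where `sequence_by` (whose description was just reworded above) sits, here is a hedged sketch of a per-table entry in an ingestion pipeline definition; the catalog, schema, table, and column names are hypothetical and the exact nesting is an assumption based on the field names in this diff:

```yaml
# Illustrative ingestion table entry (names hypothetical). sequence_by
# tells the pipeline how to order change events that arrive out of order.
objects:
  - table:
      source_catalog: src
      source_schema: sales
      source_table: orders
      destination_catalog: main
      destination_schema: bronze
      table_configuration:
        sequence_by:
          - updated_at
```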

bundle/internal/schema/annotations_openapi_overrides.yml

Lines changed: 7 additions & 0 deletions
@@ -594,6 +594,10 @@ github.com/databricks/cli/bundle/config/resources.SqlWarehousePermissionLevel:
     CAN_MONITOR
   - |-
     CAN_VIEW
+github.com/databricks/cli/bundle/config/resources.SyncedDatabaseTable:
+  "lifecycle":
+    "description": |-
+      PLACEHOLDER
 github.com/databricks/cli/bundle/config/resources.Volume:
   "_":
     "markdown_description": |-
@@ -891,6 +895,9 @@ github.com/databricks/databricks-sdk-go/service/jobs.Task:
     "description": |-
       PLACEHOLDER
 github.com/databricks/databricks-sdk-go/service/jobs.TriggerSettings:
+  "model":
+    "description": |-
+      PLACEHOLDER
   "table_update":
     "description": |-
       PLACEHOLDER

bundle/internal/validation/generated/enum_fields.go

Lines changed: 3 additions & 2 deletions
(Generated file; diff not rendered.)

bundle/internal/validation/generated/required_fields.go

Lines changed: 2 additions & 0 deletions
(Generated file; diff not rendered.)
