Commit 0b4f456

Upgrade Go SDK to v0.89.0 (#3870)
## Changes

See https://github.com/databricks/databricks-sdk-go/releases/tag/v0.89.0

This change removes the flags previously added to the `jobs create` and `pipelines create` commands. These flags were not part of these commands before and were added by accident (in #3769). Because the payload for these commands is complex, the commands take a `--json` flag instead.
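For context, the `--json` flag accepts the full request payload, either inline or from a file via the `@` prefix. A sketch (the file name and payload fields below are illustrative, not from this commit):

```
# Hypothetical minimal payload; a real job spec has many more fields.
cat > job.json <<'EOF'
{
  "name": "example-job",
  "tasks": [
    {
      "task_key": "main",
      "notebook_task": {"notebook_path": "/Workspace/example"}
    }
  ]
}
EOF
databricks jobs create --json @job.json
```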
1 parent: eb0df9a

File tree

80 files changed: +960 −344 lines

.codegen/_openapi_sha

Lines changed: 1 addition & 1 deletion
```diff
@@ -1 +1 @@
-c4784cea599325a13472b1455e7434d639362d8b
+e2018bb00cba203508f8afe5a6d41bd49789ba25
```

NEXT_CHANGELOG.md

Lines changed: 1 addition & 0 deletions
```diff
@@ -5,6 +5,7 @@
 ### Notable Changes
 
 ### CLI
+* Remove previously added flags from the `jobs create` and `pipelines create` commands. ([#3870](https://github.com/databricks/cli/pull/3870))
 
 ### Dependency updates
 
```

acceptance/bundle/refschema/out.fields.txt

Lines changed: 6 additions & 16 deletions
```diff
@@ -5,7 +5,7 @@ resources.alerts.*.display_name string ALL
 resources.alerts.*.effective_run_as *sql.AlertV2RunAs ALL
 resources.alerts.*.effective_run_as.service_principal_name string ALL
 resources.alerts.*.effective_run_as.user_name string ALL
-resources.alerts.*.evaluation *sql.AlertV2Evaluation ALL
+resources.alerts.*.evaluation sql.AlertV2Evaluation ALL
 resources.alerts.*.evaluation.comparison_operator sql.ComparisonOperator ALL
 resources.alerts.*.evaluation.empty_result_state sql.AlertEvaluationState ALL
 resources.alerts.*.evaluation.last_evaluated_at string ALL
@@ -16,7 +16,7 @@ resources.alerts.*.evaluation.notification.subscriptions []sql.AlertV2Subscripti
 resources.alerts.*.evaluation.notification.subscriptions[*] sql.AlertV2Subscription ALL
 resources.alerts.*.evaluation.notification.subscriptions[*].destination_id string ALL
 resources.alerts.*.evaluation.notification.subscriptions[*].user_email string ALL
-resources.alerts.*.evaluation.source *sql.AlertV2OperandColumn ALL
+resources.alerts.*.evaluation.source sql.AlertV2OperandColumn ALL
 resources.alerts.*.evaluation.source.aggregation sql.Aggregation ALL
 resources.alerts.*.evaluation.source.display string ALL
 resources.alerts.*.evaluation.source.name string ALL
@@ -33,7 +33,7 @@ resources.alerts.*.evaluation.threshold.value.string_value string ALL
 resources.alerts.*.id string ALL
 resources.alerts.*.lifecycle resources.Lifecycle INPUT
 resources.alerts.*.lifecycle.prevent_destroy bool INPUT
-resources.alerts.*.lifecycle_state sql.LifecycleState ALL
+resources.alerts.*.lifecycle_state sql.AlertLifecycleState ALL
 resources.alerts.*.modified_status string INPUT
 resources.alerts.*.owner_user_name string ALL
 resources.alerts.*.parent_path string ALL
@@ -48,7 +48,7 @@ resources.alerts.*.run_as *sql.AlertV2RunAs ALL
 resources.alerts.*.run_as.service_principal_name string ALL
 resources.alerts.*.run_as.user_name string ALL
 resources.alerts.*.run_as_user_name string ALL
-resources.alerts.*.schedule *sql.CronSchedule ALL
+resources.alerts.*.schedule sql.CronSchedule ALL
 resources.alerts.*.schedule.pause_status sql.SchedulePauseStatus ALL
 resources.alerts.*.schedule.quartz_cron_schedule string ALL
 resources.alerts.*.schedule.timezone_id string ALL
@@ -1523,12 +1523,6 @@ resources.jobs.*.settings.trigger.pause_status jobs.PauseStatus REMOTE
 resources.jobs.*.settings.trigger.periodic *jobs.PeriodicTriggerConfiguration REMOTE
 resources.jobs.*.settings.trigger.periodic.interval int REMOTE
 resources.jobs.*.settings.trigger.periodic.unit jobs.PeriodicTriggerConfigurationTimeUnit REMOTE
-resources.jobs.*.settings.trigger.table *jobs.TableUpdateTriggerConfiguration REMOTE
-resources.jobs.*.settings.trigger.table.condition jobs.Condition REMOTE
-resources.jobs.*.settings.trigger.table.min_time_between_triggers_seconds int REMOTE
-resources.jobs.*.settings.trigger.table.table_names []string REMOTE
-resources.jobs.*.settings.trigger.table.table_names[*] string REMOTE
-resources.jobs.*.settings.trigger.table.wait_after_last_change_seconds int REMOTE
 resources.jobs.*.settings.trigger.table_update *jobs.TableUpdateTriggerConfiguration REMOTE
 resources.jobs.*.settings.trigger.table_update.condition jobs.Condition REMOTE
 resources.jobs.*.settings.trigger.table_update.min_time_between_triggers_seconds int REMOTE
@@ -2194,12 +2188,6 @@ resources.jobs.*.trigger.pause_status jobs.PauseStatus INPUT STATE
 resources.jobs.*.trigger.periodic *jobs.PeriodicTriggerConfiguration INPUT STATE
 resources.jobs.*.trigger.periodic.interval int INPUT STATE
 resources.jobs.*.trigger.periodic.unit jobs.PeriodicTriggerConfigurationTimeUnit INPUT STATE
-resources.jobs.*.trigger.table *jobs.TableUpdateTriggerConfiguration INPUT STATE
-resources.jobs.*.trigger.table.condition jobs.Condition INPUT STATE
-resources.jobs.*.trigger.table.min_time_between_triggers_seconds int INPUT STATE
-resources.jobs.*.trigger.table.table_names []string INPUT STATE
-resources.jobs.*.trigger.table.table_names[*] string INPUT STATE
-resources.jobs.*.trigger.table.wait_after_last_change_seconds int INPUT STATE
 resources.jobs.*.trigger.table_update *jobs.TableUpdateTriggerConfiguration INPUT STATE
 resources.jobs.*.trigger.table_update.condition jobs.Condition INPUT STATE
 resources.jobs.*.trigger.table_update.min_time_between_triggers_seconds int INPUT STATE
@@ -2869,6 +2857,7 @@ resources.pipelines.*.spec.trigger.cron *pipelines.CronTrigger REMOTE
 resources.pipelines.*.spec.trigger.cron.quartz_cron_schedule string REMOTE
 resources.pipelines.*.spec.trigger.cron.timezone_id string REMOTE
 resources.pipelines.*.spec.trigger.manual *pipelines.ManualTrigger REMOTE
+resources.pipelines.*.spec.usage_policy_id string REMOTE
 resources.pipelines.*.state pipelines.PipelineState REMOTE
 resources.pipelines.*.storage string INPUT STATE
 resources.pipelines.*.tags map[string]string INPUT STATE
@@ -2880,6 +2869,7 @@ resources.pipelines.*.trigger.cron.quartz_cron_schedule string INPUT STATE
 resources.pipelines.*.trigger.cron.timezone_id string INPUT STATE
 resources.pipelines.*.trigger.manual *pipelines.ManualTrigger INPUT STATE
 resources.pipelines.*.url string INPUT
+resources.pipelines.*.usage_policy_id string INPUT STATE
 resources.pipelines.*.permissions.object_id string ALL
 resources.pipelines.*.permissions.permissions []iam.AccessControlRequest ALL
 resources.pipelines.*.permissions.permissions[*] iam.AccessControlRequest ALL
```

acceptance/cmd/account/account-help/output.txt

Lines changed: 3 additions & 0 deletions
```diff
@@ -7,8 +7,11 @@ Usage:
 
 Identity and Access Management
 access-control These APIs manage access rules on resources in an account.
+groups Groups simplify identity management, making it easier to assign access to Databricks account, data, and other securable objects.
 groups-v2 Groups simplify identity management, making it easier to assign access to Databricks account, data, and other securable objects.
+service-principals Identities for use with jobs, automated tools, and systems such as scripts, apps, and CI/CD platforms.
 service-principals-v2 Identities for use with jobs, automated tools, and systems such as scripts, apps, and CI/CD platforms.
+users User identities recognized by Databricks and represented by email addresses.
 users-v2 User identities recognized by Databricks and represented by email addresses.
 workspace-assignment The Workspace Permission Assignment API allows you to manage workspace permissions for principals in your account.
 
```

acceptance/help/output.txt

Lines changed: 2 additions & 0 deletions
```diff
@@ -87,6 +87,7 @@ Unity Catalog
 Delta Sharing
 providers A data provider is an object representing the organization in the real world who shares the data.
 recipient-activation The Recipient Activation API is only applicable in the open sharing model where the recipient object has the authentication type of TOKEN.
+recipient-federation-policies The Recipient Federation Policies APIs are only applicable in the open sharing model where the recipient object has the authentication type of OIDC_RECIPIENT, enabling data sharing from Databricks to non-Databricks recipients.
 recipients A recipient is an object you create using :method:recipients/create to represent an organization which you want to allow access shares.
 shares A share is a container instantiated with :method:shares/create.
 
@@ -154,6 +155,7 @@ Additional Commands:
 auth Authentication related commands
 completion Generate the autocompletion script for the specified shell
 configure Configure authentication
+data-quality Manage the data quality of Unity Catalog objects (currently support schema and table).
 help Help about any command
 labs Manage Databricks Labs installations
 tag-policies The Tag Policy API allows you to manage policies for governed tags in Databricks.
```

bundle/direct/dresources/pipeline.go

Lines changed: 2 additions & 0 deletions
```diff
@@ -58,6 +58,7 @@ func (*ResourcePipeline) RemapState(p *pipelines.GetPipelineResponse) *pipelines
 		Tags:            spec.Tags,
 		Target:          spec.Target,
 		Trigger:         spec.Trigger,
+		UsagePolicyId:   spec.UsagePolicyId,
 		ForceSendFields: filterFields[pipelines.CreatePipeline](spec.ForceSendFields, "AllowDuplicateNames", "DryRun", "RunAs", "Id"),
 	}
 }
@@ -106,6 +107,7 @@ func (r *ResourcePipeline) DoUpdate(ctx context.Context, id string, config *pipe
 		Tags:            config.Tags,
 		Target:          config.Target,
 		Trigger:         config.Trigger,
+		UsagePolicyId:   config.UsagePolicyId,
 		PipelineId:      id,
 		ForceSendFields: filterFields[pipelines.EditPipeline](config.ForceSendFields),
 	}
```
bundle/internal/schema/annotations_openapi.yml

Lines changed: 18 additions & 14 deletions
```diff
@@ -589,8 +589,6 @@ github.com/databricks/cli/bundle/config/resources.Pipeline:
   "budget_policy_id":
     "description": |-
       Budget policy of this pipeline.
-    "x-databricks-preview": |-
-      PRIVATE
   "catalog":
     "description": |-
       A catalog in Unity Catalog to publish data from this pipeline to. If `target` is specified, tables in this pipeline are published to a `target` schema inside `catalog` (for example, `catalog`.`target`.`table`). If `target` is not specified, no data is published to Unity Catalog.
@@ -687,6 +685,11 @@ github.com/databricks/cli/bundle/config/resources.Pipeline:
       Which pipeline trigger to use. Deprecated: Use `continuous` instead.
     "deprecation_message": |-
       This field is deprecated
+  "usage_policy_id":
+    "description": |-
+      Usage policy of this pipeline.
+    "x-databricks-preview": |-
+      PRIVATE
 github.com/databricks/cli/bundle/config/resources.QualityMonitor:
   "assets_dir":
     "description": |-
@@ -2462,6 +2465,10 @@ github.com/databricks/databricks-sdk-go/service/jobs.AuthenticationMethod:
     - |-
       PAT
 github.com/databricks/databricks-sdk-go/service/jobs.CleanRoomsNotebookTask:
+  "_":
+    "description": |-
+      Clean Rooms notebook task for V1 Clean Room service (GA).
+      Replaces the deprecated CleanRoomNotebookTask (defined above) which was for V0 service.
   "clean_room_name":
     "description": |-
       The clean room that the notebook belongs to.
@@ -3060,6 +3067,8 @@ github.com/databricks/databricks-sdk-go/service/jobs.RunJobTask:
   "dbt_commands":
     "description": |-
       An array of commands to execute for jobs with the dbt task, for example `"dbt_commands": ["dbt deps", "dbt seed", "dbt deps", "dbt seed", "dbt run"]`
+
+      ⚠ **Deprecation note** Use [job parameters](https://docs.databricks.com/jobs/job-parameters.html#job-parameter-pushdown) to pass information down to tasks.
     "deprecation_message": |-
       This field is deprecated
     "x-databricks-preview": |-
@@ -3072,7 +3081,7 @@ github.com/databricks/databricks-sdk-go/service/jobs.RunJobTask:
       jar_params cannot be specified in conjunction with notebook_params.
       The JSON representation of this field (for example `{"jar_params":["john doe","35"]}`) cannot exceed 10,000 bytes.
 
-      Use [Task parameter variables](https://docs.databricks.com/jobs.html#parameter-variables) to set parameters containing information about job runs.
+      ⚠ **Deprecation note** Use [job parameters](https://docs.databricks.com/jobs/job-parameters.html#job-parameter-pushdown) to pass information down to tasks.
     "deprecation_message": |-
       This field is deprecated
     "x-databricks-preview": |-
@@ -3092,7 +3101,7 @@ github.com/databricks/databricks-sdk-go/service/jobs.RunJobTask:
 
       notebook_params cannot be specified in conjunction with jar_params.
 
-      Use [Task parameter variables](https://docs.databricks.com/jobs.html#parameter-variables) to set parameters containing information about job runs.
+      ⚠ **Deprecation note** Use [job parameters](https://docs.databricks.com/jobs/job-parameters.html#job-parameter-pushdown) to pass information down to tasks.
 
       The JSON representation of this field (for example `{"notebook_params":{"name":"john doe","age":"35"}}`) cannot exceed 10,000 bytes.
     "deprecation_message": |-
@@ -3114,7 +3123,7 @@ github.com/databricks/databricks-sdk-go/service/jobs.RunJobTask:
       the parameters specified in job setting. The JSON representation of this field (for example `{"python_params":["john doe","35"]}`)
       cannot exceed 10,000 bytes.
 
-      Use [Task parameter variables](https://docs.databricks.com/jobs.html#parameter-variables) to set parameters containing information about job runs.
+      ⚠ **Deprecation note** Use [job parameters](https://docs.databricks.com/jobs/job-parameters.html#job-parameter-pushdown) to pass information down to tasks.
 
       Important
 
@@ -3131,7 +3140,7 @@ github.com/databricks/databricks-sdk-go/service/jobs.RunJobTask:
       parameters specified in job setting. The JSON representation of this field (for example `{"python_params":["john doe","35"]}`)
       cannot exceed 10,000 bytes.
 
-      Use [Task parameter variables](https://docs.databricks.com/jobs.html#parameter-variables) to set parameters containing information about job runs
+      ⚠ **Deprecation note** Use [job parameters](https://docs.databricks.com/jobs/job-parameters.html#job-parameter-pushdown) to pass information down to tasks.
 
       Important
 
@@ -3144,6 +3153,8 @@ github.com/databricks/databricks-sdk-go/service/jobs.RunJobTask:
   "sql_params":
     "description": |-
       A map from keys to values for jobs with SQL task, for example `"sql_params": {"name": "john doe", "age": "35"}`. The SQL alert task does not support custom parameters.
+
+      ⚠ **Deprecation note** Use [job parameters](https://docs.databricks.com/jobs/job-parameters.html#job-parameter-pushdown) to pass information down to tasks.
     "deprecation_message": |-
       This field is deprecated
     "x-databricks-preview": |-
@@ -3511,13 +3522,6 @@ github.com/databricks/databricks-sdk-go/service/jobs.TriggerSettings:
   "periodic":
     "description": |-
       Periodic trigger settings.
-  "table":
-    "description": |-
-      Old table trigger settings name. Deprecated in favor of `table_update`.
-    "deprecation_message": |-
-      This field is deprecated
-    "x-databricks-preview": |-
-      PRIVATE
   "table_update": {}
 github.com/databricks/databricks-sdk-go/service/jobs.Webhook:
   "id": {}
@@ -3566,7 +3570,7 @@ github.com/databricks/databricks-sdk-go/service/pipelines.CronTrigger:
 github.com/databricks/databricks-sdk-go/service/pipelines.DayOfWeek:
   "_":
     "description": |-
-      Days of week in which the restart is allowed to happen (within a five-hour window starting at start_hour).
+      Days of week in which the window is allowed to happen.
       If not specified all days of the week will be used.
     "enum":
     - |-
```

bundle/internal/validation/generated/enum_fields.go

Lines changed: 0 additions & 1 deletion
Generated file; diff not rendered by default.

0 commit comments
