


rebase into main test

parthban-db and others added 30 commits July 3, 2024 08:29
## Changes
<!-- Summary of your changes that are easy to understand -->
Replaced `pathlib.Path` with `pathlib.PurePosixPath` in
`/databricks/sdk/mixins/files.py`, which always uses POSIX path separators
regardless of the OS it is running on. Fixes (databricks#660)

## Tests
<!-- 
How is this tested? Please see the checklist below and also describe any
other relevant tests
-->

- [x] `make test` run locally
- [x] `make fmt` applied
- [ ] relevant integration tests applied
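The behavior behind the fix can be illustrated with a minimal sketch (the path below is hypothetical):

```python
# pathlib.PurePosixPath always joins with "/", while pathlib.Path uses the
# OS-native separator (backslash on Windows). Remote DBFS-style paths must
# stay POSIX regardless of the client OS.
from pathlib import PurePosixPath

remote = PurePosixPath("/Volumes/main") / "schema" / "file.txt"
print(str(remote))  # → /Volumes/main/schema/file.txt, on any OS
```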
## Changes
<!-- Summary of your changes that are easy to understand -->
Added a check for a trailing slash in the host URL. Fixes (databricks#661)

## Tests
<!-- 
How is this tested? Please see the checklist below and also describe any
other relevant tests
-->

- [ ] `make test` run locally
- [ ] `make fmt` applied
- [ ] relevant integration tests applied
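A minimal sketch of such a check (the helper name is hypothetical, not the SDK's actual code):

```python
# Normalize the configured host so that a URL with and without a trailing
# slash produces identical request URLs (avoids "https://host//api/..." paths).
def normalize_host(host: str) -> str:
    return host.rstrip("/")

print(normalize_host("https://example.cloud.databricks.com/"))
# → https://example.cloud.databricks.com
```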

---------

Signed-off-by: Parth Bansal <[email protected]>
## Changes
<!-- Summary of your changes that are easy to understand -->
Changed the workflow so that tests run on Windows.

## Tests
<!-- 
How is this tested? Please see the checklist below and also describe any
other relevant tests
-->

- [ ] `make test` run locally
- [ ] `make fmt` applied
- [ ] relevant integration tests applied
## Changes
<!-- Summary of your changes that are easy to understand -->
Remove duplicate ubuntu tests

## Tests
<!-- 
How is this tested? Please see the checklist below and also describe any
other relevant tests
-->

- [ ] `make test` run locally
- [ ] `make fmt` applied
- [ ] relevant integration tests applied
## Changes
Ports databricks/databricks-sdk-go#925 to the
Python SDK.

Partners of Databricks need a mechanism to register themselves in
libraries or applications that they write. In this way, requests made by
users of those libraries will include sufficient information to link
those requests to the original users.

This PR adds a new `useragent` module with functions to manipulate the
user agent.
* `product()`: returns the globally configured product & version.
* `with_product(product: str, product_version: str)`: configure the
global product & version.
* `extra()`: returns the globally configured extra user agent metadata.
* `with_extra(key: str, value: str)`: add an extra entry to the global
extra user agent metadata.
* `with_partner(partner: str)`: add a partner to the global extra user
agent metadata.
* `to_string(product_override: Optional[Tuple[str, str]] = None,
other_info: Optional[List[Tuple[str, str]]] = None) -> str`: return the
User-Agent header as a string.


One new function here is `with_partner`, which can be used by a partner
to add partner information to the User-Agent header for requests made by
the SDK. The new header will have the form `partner/<partner id>`. The
partner identifier is opaque to the SDK, but it must be alphanumeric.

This PR also removes the requirement that a user agent entry contain
only a single copy of each key. This allows multiple partners to
register in the same library or application.

In this PR, I've also refactored the user agent library to be more
static, aligning it with the Go and Java SDKs. This makes it easier to
maintain and ensures similar behavior between all 3 SDKs. Note that this
SDK has extra functionality that doesn't exist in the Go and Java SDKs,
namely config-level user agent info; that is preserved here.
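The static-registry pattern described above can be sketched as follows. This is an illustration of the design, not the SDK's actual implementation (real callers would use the `useragent` module's functions listed above):

```python
# Module-level state plus with_* mutators, mirroring the Go/Java SDKs.
# Extras are stored as a list rather than a dict so duplicate keys are
# allowed, letting multiple partners register in the same application.
_product = ("unknown", "0.0.0")
_extra = []

def with_product(name: str, version: str) -> None:
    global _product
    _product = (name, version)

def with_partner(partner: str) -> None:
    _extra.append(("partner", partner))

def to_string() -> str:
    parts = [f"{_product[0]}/{_product[1]}"] + [f"{k}/{v}" for k, v in _extra]
    return " ".join(parts)

with_product("my-library", "1.2.3")
with_partner("partner-a")
with_partner("partner-b")
print(to_string())  # → my-library/1.2.3 partner/partner-a partner/partner-b
```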

## Tests
Unit tests were added to verify that the user agent contains all
expected parts and supports multiple partners.

- [ ] `make test` run locally
- [ ] `make fmt` applied
- [ ] relevant integration tests applied
## Changes
<!-- Summary of your changes that are easy to understand -->
Fix auth tests for Windows.

- Added a PowerShell script, because the bash script doesn't run on Windows
- Changed the 'COMSPEC' environment variable to run commands in PowerShell
- Used 'USERPROFILE' instead of 'HOME', as it is the Windows equivalent of
'HOME'
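The home-directory lookup can be sketched as below (illustrative, not the SDK's exact code):

```python
# Windows typically sets USERPROFILE rather than HOME, so fall back to it
# when HOME is absent.
import os

def user_home() -> str:
    # Prefer HOME (POSIX), then USERPROFILE (Windows).
    return os.environ.get("HOME") or os.environ.get("USERPROFILE", "")
```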

## Tests
<!-- 
How is this tested? Please see the checklist below and also describe any
other relevant tests
-->

- [ ] `make test` run locally
- [ ] `make fmt` applied
- [ ] relevant integration tests applied
## Changes
<!-- Summary of your changes that are easy to understand -->
Fix `tests/integration/test_files.py::test_local_io` for Windows. This
PR is part of making the tests pass on Windows.
## Tests
<!-- 
How is this tested? Please see the checklist below and also describe any
other relevant tests
-->

- [ ] `make test` run locally
- [ ] `make fmt` applied
- [ ] relevant integration tests applied
## Changes
<!-- Summary of your changes that are easy to understand -->
Fix for workflows that were being cancelled due to a failed workflow.

## Tests
<!-- 
How is this tested? Please see the checklist below and also describe any
other relevant tests
-->

- [ ] `make test` run locally
- [ ] `make fmt` applied
- [ ] relevant integration tests applied
## Changes
<!-- Summary of your changes that are easy to understand -->
Fix `test_core.py` for Windows.

## Tests
<!-- 
How is this tested? Please see the checklist below and also describe any
other relevant tests
-->

- [ ] `make test` run locally
- [ ] `make fmt` applied
- [ ] relevant integration tests applied
## Changes
Improve the changelog by grouping changes and enforce a tag in PRs

## Tests
- [X] `make test` run locally
- [X] `make fmt` applied
- [ ] relevant integration tests applied
- [X] Recreate old changelog

```

## 0.30.0

### Other Changes

 * Add Windows WorkFlow ([databricks#692](databricks#692)).
 * Check trailing slash in host url ([databricks#681](databricks#681)).
 * Fix auth tests for windows. ([databricks#697](databricks#697)).
 * Remove duplicate ubuntu tests ([databricks#693](databricks#693)).
 * Support partners in SDK ([databricks#648](databricks#648)).
 * fix windows path ([databricks#660](databricks#660)) ([databricks#673](databricks#673)).


### API Changes:

 * Added [w.serving_endpoints_data_plane](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/serving_endpoints_data_plane.html) workspace-level service.
 * Added `deploy()` and `start()` methods for [w.apps](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/apps.html) workspace-level service.
 * Added `batch_get()` method for [w.consumer_listings](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/consumer_listings.html) workspace-level service.
 * Added `batch_get()` method for [w.consumer_providers](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/consumer_providers.html) workspace-level service.
 * Added `create_schedule()`, `create_subscription()`, `delete_schedule()`, `delete_subscription()`, `get_schedule()`, `get_subscription()`, `list()`, `list_schedules()`, `list_subscriptions()` and `update_schedule()` methods for [w.lakeview](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/lakeview.html) workspace-level service.
 * Added `query_next_page()` method for [w.vector_search_indexes](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/vector_search_indexes.html) workspace-level service.
 * Added `databricks.sdk.service.serving.AppDeploymentMode`, `databricks.sdk.service.serving.ModelDataPlaneInfo` and `databricks.sdk.service.serving.StartAppRequest` dataclasses.
 * Added `databricks.sdk.service.catalog.CatalogIsolationMode` and `databricks.sdk.service.catalog.ListAccountStorageCredentialsResponse` dataclasses.
 * Added `databricks.sdk.service.dashboards.CreateScheduleRequest`, `databricks.sdk.service.dashboards.CreateSubscriptionRequest`, `databricks.sdk.service.dashboards.CronSchedule`, `databricks.sdk.service.dashboards.DashboardView`, `databricks.sdk.service.dashboards.DeleteScheduleRequest`, `any`, `databricks.sdk.service.dashboards.DeleteSubscriptionRequest`, `any`, `databricks.sdk.service.dashboards.GetScheduleRequest`, `databricks.sdk.service.dashboards.GetSubscriptionRequest`, `databricks.sdk.service.dashboards.ListDashboardsRequest`, `databricks.sdk.service.dashboards.ListDashboardsResponse`, `databricks.sdk.service.dashboards.ListSchedulesRequest`, `databricks.sdk.service.dashboards.ListSchedulesResponse`, `databricks.sdk.service.dashboards.ListSubscriptionsRequest`, `databricks.sdk.service.dashboards.ListSubscriptionsResponse`, `databricks.sdk.service.dashboards.Schedule`, `databricks.sdk.service.dashboards.SchedulePauseStatus`, `databricks.sdk.service.dashboards.Subscriber`, `databricks.sdk.service.dashboards.Subscription`, `databricks.sdk.service.dashboards.SubscriptionSubscriberDestination`, `databricks.sdk.service.dashboards.SubscriptionSubscriberUser` and `databricks.sdk.service.dashboards.UpdateScheduleRequest` dataclasses.
 * Added `databricks.sdk.service.jobs.PeriodicTriggerConfiguration` and `databricks.sdk.service.jobs.PeriodicTriggerConfigurationTimeUnit` dataclasses.
 * Added `databricks.sdk.service.marketplace.BatchGetListingsRequest`, `databricks.sdk.service.marketplace.BatchGetListingsResponse`, `databricks.sdk.service.marketplace.BatchGetProvidersRequest`, `databricks.sdk.service.marketplace.BatchGetProvidersResponse`, `databricks.sdk.service.marketplace.ProviderIconFile`, `databricks.sdk.service.marketplace.ProviderIconType` and `databricks.sdk.service.marketplace.ProviderListingSummaryInfo` dataclasses.
 * Added `databricks.sdk.service.oauth2.DataPlaneInfo` dataclass.
 * Added `databricks.sdk.service.vectorsearch.QueryVectorIndexNextPageRequest` dataclass.
 * Added `isolation_mode` field for `databricks.sdk.service.catalog.ExternalLocationInfo`.
 * Added `max_results` and `page_token` fields for `databricks.sdk.service.catalog.ListCatalogsRequest`.
 * Added `next_page_token` field for `databricks.sdk.service.catalog.ListCatalogsResponse`.
 * Added `table_serving_url` field for `databricks.sdk.service.catalog.OnlineTable`.
 * Added `isolation_mode` field for `databricks.sdk.service.catalog.StorageCredentialInfo`.
 * Added `isolation_mode` field for `databricks.sdk.service.catalog.UpdateExternalLocation`.
 * Added `isolation_mode` field for `databricks.sdk.service.catalog.UpdateStorageCredential`.
 * Added `termination_category` field for `databricks.sdk.service.jobs.ForEachTaskErrorMessageStats`.
 * Added `on_streaming_backlog_exceeded` field for `databricks.sdk.service.jobs.JobEmailNotifications`.
 * Added `environment_key` field for `databricks.sdk.service.jobs.RunTask`.
 * Added `environments` field for `databricks.sdk.service.jobs.SubmitRun`.
 * Added `dbt_task` and `environment_key` fields for `databricks.sdk.service.jobs.SubmitTask`.
 * Added `on_streaming_backlog_exceeded` field for `databricks.sdk.service.jobs.TaskEmailNotifications`.
 * Added `periodic` field for `databricks.sdk.service.jobs.TriggerSettings`.
 * Added `on_streaming_backlog_exceeded` field for `databricks.sdk.service.jobs.WebhookNotifications`.
 * Added `provider_summary` field for `databricks.sdk.service.marketplace.Listing`.
 * Added `service_principal_id` and `service_principal_name` fields for `databricks.sdk.service.serving.App`.
 * Added `mode` field for `databricks.sdk.service.serving.AppDeployment`.
 * Added `mode` field for `databricks.sdk.service.serving.CreateAppDeploymentRequest`.
 * Added `data_plane_info` field for `databricks.sdk.service.serving.ServingEndpointDetailed`.
 * Added `query_type` field for `databricks.sdk.service.vectorsearch.QueryVectorIndexRequest`.
 * Added `next_page_token` field for `databricks.sdk.service.vectorsearch.QueryVectorIndexResponse`.
 * Changed `list()` method for [a.account_storage_credentials](https://databricks-sdk-py.readthedocs.io/en/latest/account/account_storage_credentials.html) account-level service to return `databricks.sdk.service.catalog.ListAccountStorageCredentialsResponse` dataclass.
 * Changed `isolation_mode` field for `databricks.sdk.service.catalog.CatalogInfo` to `databricks.sdk.service.catalog.CatalogIsolationMode` dataclass.
 * Changed `isolation_mode` field for `databricks.sdk.service.catalog.UpdateCatalog` to `databricks.sdk.service.catalog.CatalogIsolationMode` dataclass.
 * Removed `create_deployment()` method for [w.apps](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/apps.html) workspace-level service.
 * Removed `condition_task`, `dbt_task`, `notebook_task`, `pipeline_task`, `python_wheel_task`, `run_job_task`, `spark_jar_task`, `spark_python_task`, `spark_submit_task` and `sql_task` fields for `databricks.sdk.service.jobs.SubmitRun`.

OpenAPI SHA: 7437dabb9dadee402c1fc060df4c1ce8cc5369f0, Date: 2024-06-24
```
## Changes
Add Release tag

## Tests
<!-- 
How is this tested? Please see the checklist below and also describe any
other relevant tests
-->

- [ ] `make test` run locally
- [ ] `make fmt` applied
- [ ] relevant integration tests applied
…cks#707)

## Changes
Move PR message validation to a separate workflow

## Tests
Updated title for this PR
## Changes
Add DataPlane support

## Tests
- [X] `make test` run locally
- [X] `make fmt` applied
- [ ] relevant integration tests applied
- [X] Manual test against staging workspace (prod workspaces don't
support DataPlane APIs)
…ks#709)

## Changes
Trigger the validate workflow in the merge queue
## Changes
Port of databricks/databricks-sdk-go#910 to the
Python SDK.

In order to use Azure U2M or M2M authentication with the Databricks SDK,
users must request a token from the Entra ID instance that the
underlying workspace or account belongs to, as Databricks rejects
requests to workspaces with a token from a different Entra ID tenant.
However, with Azure CLI auth, it is possible that a user is logged into
multiple tenants at the same time. Currently, the SDK uses the
subscription ID from the configured Azure Resource ID for the workspace
when issuing the `az account get-access-token` command. However, when
users don't specify the resource ID, the SDK simply fetches a token for
the active subscription for the user. If the active subscription is in a
different tenant than the workspace, users will see an error such as:

```
io.jsonwebtoken.IncorrectClaimException: Expected iss claim to be: https://sts.windows.net/72f988bf-86f1-41af-91ab-2d7cd011db47/, but was: https://sts.windows.net/e3fe3f22-4b98-4c04-82cc-d8817d1b17da/
```

This PR modifies Azure CLI and Azure SP credential providers to attempt
to load the tenant ID of the workspace if not provided before
authenticating. Currently, there are no unauthenticated endpoints that
the tenant ID can be directly fetched from. However, the tenant ID is
indirectly exposed via the redirect URL used when logging into a
workspace. In this PR, we fetch the tenant ID from this endpoint and
configure it if not already set.

Here, we lazily fetch the tenant ID only in the auth methods that need
it. This prevents us from making any unnecessary requests if these Azure
credential providers are not needed.
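The parsing step can be sketched as below. The redirect's Location header points at the tenant's Entra ID authorize URL; the helper name and regex are illustrative assumptions, not the SDK's exact implementation:

```python
# Extract the tenant ID (a 36-character GUID) from a login redirect URL of
# the form https://login.microsoftonline.com/<tenant-id>/oauth2/authorize?...
import re
from typing import Optional

def tenant_id_from_redirect(location: str) -> Optional[str]:
    m = re.search(r"login\.microsoftonline\.com/([0-9a-fA-F-]{36})", location)
    return m.group(1) if m else None
```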

## Tests
Unit tests check that the tenant ID is fetched automatically if not
specified for an Azure workspace when authenticating with client
ID/secret or with the CLI.

- [x] `make test` run locally
- [x] `make fmt` applied
- [x] relevant integration tests applied
## Changes
<!-- Summary of your changes that are easy to understand -->
Update OpenAPI spec

## Tests
<!-- 
How is this tested? Please see the checklist below and also describe any
other relevant tests
-->

- [X] `make test` run locally
- [X] `make fmt` applied
- [x] relevant integration tests applied
…#714)

## Changes
<!-- Summary of your changes that are easy to understand -->
Added tests to make sure regeneration is not going to break API version
pinning: databricks/databricks-sdk-go#993

## Tests
<!-- 
How is this tested? Please see the checklist below and also describe any
other relevant tests
-->

- [x] `make test` run locally
- [x] `make fmt` applied
- [ ] relevant integration tests applied
…atabricks#719)

## Changes
This PR fixes the current failing integration tests for the Python SDK,
unblocking their release.

There are two issues:
1. `get_workspace_client` fails in our integration tests because we call
it with a workspace that is not UC-enabled. Because tests are
authenticated as service principals, and it isn't possible to add
account-level service principals to non-UC workspaces, this call fails.
I address this by running this test against a UC-enabled workspace.
2. `test_runtime_auth_from_jobs` fails because a new LTS DBR version was
released (15.4) that doesn't support DBFS library installations. To
address this, I have created two tests:
`test_runtime_auth_from_jobs_dbfs`, which tests native auth using the SDK
installed from DBFS up to LTS 14.3, and
`test_runtime_auth_from_jobs_volumes`, which does the same with the SDK
installed from a volume.

## Tests
All integration tests passed (retriggered the GCP integration test
locally after adding single user data security mode).

- [ ] `make test` run locally
- [ ] `make fmt` applied
- [ ] relevant integration tests applied
…s#721)

## Changes
The current integration test for recursive workspace listing is very
slow because it lists all resources in a very large directory (the
integration test user's home folder). To decrease the time this test
takes, we can simply create a directory with a file and a subdirectory
with another file. This means the test requires only two API calls to
complete.
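The fixture layout can be sketched with local files (illustrative only; the real test creates workspace objects via the API):

```python
# A directory with one file plus a subdirectory with another file: the
# smallest tree that still exercises recursion, needing one listing call
# per directory level.
import pathlib
import tempfile

root = pathlib.Path(tempfile.mkdtemp())
(root / "file1.txt").write_text("a")
(root / "sub").mkdir()
(root / "sub" / "file2.txt").write_text("b")

listed = sorted(p.relative_to(root).as_posix() for p in root.rglob("*"))
print(listed)  # → ['file1.txt', 'sub', 'sub/file2.txt']
```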

## Tests
<!-- 
How is this tested? Please see the checklist below and also describe any
other relevant tests
-->

- [ ] `make test` run locally
- [ ] `make fmt` applied
- [ ] relevant integration tests applied
## Changes
To enable the release of the Apps package, we need to manually add it to
our doc generation.

Going forward, this should be added to the internal API specification.

## Tests
<!-- 
How is this tested? Please see the checklist below and also describe any
other relevant tests
-->

- [x] Codegen tool runs successfully on commit
88571b688969bc4509fb520d86d161eb20c3d662 of the API specification from
this PR.
### New Features and Improvements

* Add DataPlane support
([databricks#700](databricks#700)).
* Support partners in SDK
([databricks#648](databricks#648)).


### Bug Fixes

* Check trailing slash in host url
([databricks#681](databricks#681)).
* Decrease runtime of recursive workspace listing test
([databricks#721](databricks#721)).
* Fix test_get_workspace_client and test_runtime_auth_from_jobs
([databricks#719](databricks#719)).
* Infer Azure tenant ID if not set
([databricks#638](databricks#638)).


### Internal Changes

* Add Release tag and Workflow fix
([databricks#704](databricks#704)).
* Add apps package in docgen
([databricks#722](databricks#722)).
* Fix processing of `quoted` titles
([databricks#712](databricks#712)).
* Improve Changelog by grouping changes
([databricks#703](databricks#703)).
* Move PR message validation to a separate workflow
([databricks#707](databricks#707)).
* Test that Jobs API endpoints are pinned to 2.1
([databricks#714](databricks#714)).
* Trigger the validate workflow in the merge queue
([databricks#709](databricks#709)).
* Update OpenAPI spec
([databricks#715](databricks#715)).


### Other Changes

* Add Windows WorkFlow
([databricks#692](databricks#692)).
* Fix auth tests for windows.
([databricks#697](databricks#697)).
* Fix for cancelled workflow
([databricks#701](databricks#701)).
* Fix test_core for windows
([databricks#702](databricks#702)).
* Fix test_local_io for windows
([databricks#695](databricks#695)).
* Remove duplicate ubuntu tests
([databricks#693](databricks#693)).
* fix windows path
([databricks#660](databricks#660))
([databricks#673](databricks#673)).


### API Changes:

 * Added `databricks.sdk.service.apps` package.
* Added
[a.usage_dashboards](https://databricks-sdk-py.readthedocs.io/en/latest/account/usage_dashboards.html)
account-level service.
* Added
[w.alerts_legacy](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/alerts_legacy.html)
workspace-level service,
[w.queries_legacy](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/queries_legacy.html)
workspace-level service and
[w.query_visualizations_legacy](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/query_visualizations_legacy.html)
workspace-level service.
* Added
[w.genie](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/genie.html)
workspace-level service.
* Added
[w.notification_destinations](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/notification_destinations.html)
workspace-level service.
* Added `update()` method for
[w.clusters](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/clusters.html)
workspace-level service.
* Added `list_visualizations()` method for
[w.queries](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/queries.html)
workspace-level service.
* Added `databricks.sdk.service.catalog.GetBindingsSecurableType` and
`databricks.sdk.service.catalog.UpdateBindingsSecurableType`
dataclasses.
* Added `databricks.sdk.service.billing.ActionConfiguration`,
`databricks.sdk.service.billing.ActionConfigurationType`,
`databricks.sdk.service.billing.AlertConfiguration`,
`databricks.sdk.service.billing.AlertConfigurationQuantityType`,
`databricks.sdk.service.billing.AlertConfigurationTimePeriod`,
`databricks.sdk.service.billing.AlertConfigurationTriggerType`,
`databricks.sdk.service.billing.BudgetConfiguration`,
`databricks.sdk.service.billing.BudgetConfigurationFilter`,
`databricks.sdk.service.billing.BudgetConfigurationFilterClause`,
`databricks.sdk.service.billing.BudgetConfigurationFilterOperator`,
`databricks.sdk.service.billing.BudgetConfigurationFilterTagClause`,
`databricks.sdk.service.billing.BudgetConfigurationFilterWorkspaceIdClause`,
`databricks.sdk.service.billing.CreateBillingUsageDashboardRequest`,
`databricks.sdk.service.billing.CreateBillingUsageDashboardResponse`,
`databricks.sdk.service.billing.CreateBudgetConfigurationBudget`,
`databricks.sdk.service.billing.CreateBudgetConfigurationBudgetActionConfigurations`,
`databricks.sdk.service.billing.CreateBudgetConfigurationBudgetAlertConfigurations`,
`databricks.sdk.service.billing.CreateBudgetConfigurationRequest`,
`databricks.sdk.service.billing.CreateBudgetConfigurationResponse`,
`databricks.sdk.service.billing.DeleteBudgetConfigurationRequest`,
`any`, `databricks.sdk.service.billing.GetBillingUsageDashboardRequest`,
`databricks.sdk.service.billing.GetBillingUsageDashboardResponse`,
`databricks.sdk.service.billing.GetBudgetConfigurationRequest`,
`databricks.sdk.service.billing.GetBudgetConfigurationResponse`,
`databricks.sdk.service.billing.ListBudgetConfigurationsRequest`,
`databricks.sdk.service.billing.ListBudgetConfigurationsResponse`,
`databricks.sdk.service.billing.UpdateBudgetConfigurationBudget`,
`databricks.sdk.service.billing.UpdateBudgetConfigurationRequest`,
`databricks.sdk.service.billing.UpdateBudgetConfigurationResponse` and
`databricks.sdk.service.billing.UsageDashboardType` dataclasses.
* Added `databricks.sdk.service.compute.ListClustersFilterBy`,
`databricks.sdk.service.compute.ListClustersSortBy`,
`databricks.sdk.service.compute.ListClustersSortByDirection`,
`databricks.sdk.service.compute.ListClustersSortByField`,
`databricks.sdk.service.compute.UpdateCluster`,
`databricks.sdk.service.compute.UpdateClusterResource` and `any`
dataclasses.
* Added `databricks.sdk.service.dashboards.ExecuteMessageQueryRequest`,
`databricks.sdk.service.dashboards.GenieAttachment`,
`databricks.sdk.service.dashboards.GenieConversation`,
`databricks.sdk.service.dashboards.GenieCreateConversationMessageRequest`,
`databricks.sdk.service.dashboards.GenieGetConversationMessageRequest`,
`databricks.sdk.service.dashboards.GenieGetMessageQueryResultRequest`,
`databricks.sdk.service.dashboards.GenieGetMessageQueryResultResponse`,
`databricks.sdk.service.dashboards.GenieMessage`,
`databricks.sdk.service.dashboards.GenieStartConversationMessageRequest`,
`databricks.sdk.service.dashboards.GenieStartConversationResponse`,
`databricks.sdk.service.dashboards.MessageError`,
`databricks.sdk.service.dashboards.MessageErrorType`,
`databricks.sdk.service.dashboards.MessageStatus`,
`databricks.sdk.service.dashboards.QueryAttachment`,
`databricks.sdk.service.dashboards.Result` and
`databricks.sdk.service.dashboards.TextAttachment` dataclasses.
* Added `any`, `databricks.sdk.service.iam.MigratePermissionsRequest`
and `databricks.sdk.service.iam.MigratePermissionsResponse` dataclasses.
* Added `databricks.sdk.service.oauth2.ListCustomAppIntegrationsRequest`
and `databricks.sdk.service.oauth2.ListPublishedAppIntegrationsRequest`
dataclasses.
* Added `databricks.sdk.service.pipelines.IngestionPipelineDefinition`
and `databricks.sdk.service.pipelines.PipelineStateInfoHealth`
dataclasses.
* Added `databricks.sdk.service.serving.GoogleCloudVertexAiConfig`
dataclass.
* Added `databricks.sdk.service.settings.Config`,
`databricks.sdk.service.settings.CreateNotificationDestinationRequest`,
`databricks.sdk.service.settings.DeleteNotificationDestinationRequest`,
`databricks.sdk.service.settings.DestinationType`,
`databricks.sdk.service.settings.EmailConfig`, `any`,
`databricks.sdk.service.settings.GenericWebhookConfig`,
`databricks.sdk.service.settings.GetNotificationDestinationRequest`,
`databricks.sdk.service.settings.ListNotificationDestinationsRequest`,
`databricks.sdk.service.settings.ListNotificationDestinationsResponse`,
`databricks.sdk.service.settings.ListNotificationDestinationsResult`,
`databricks.sdk.service.settings.MicrosoftTeamsConfig`,
`databricks.sdk.service.settings.NotificationDestination`,
`databricks.sdk.service.settings.PagerdutyConfig`,
`databricks.sdk.service.settings.SlackConfig` and
`databricks.sdk.service.settings.UpdateNotificationDestinationRequest`
dataclasses.
* Added `databricks.sdk.service.sql.AlertCondition`,
`databricks.sdk.service.sql.AlertConditionOperand`,
`databricks.sdk.service.sql.AlertConditionThreshold`,
`databricks.sdk.service.sql.AlertOperandColumn`,
`databricks.sdk.service.sql.AlertOperandValue`,
`databricks.sdk.service.sql.AlertOperator`,
`databricks.sdk.service.sql.ClientCallContext`,
`databricks.sdk.service.sql.ContextFilter`,
`databricks.sdk.service.sql.CreateAlertRequest`,
`databricks.sdk.service.sql.CreateAlertRequestAlert`,
`databricks.sdk.service.sql.CreateQueryRequest`,
`databricks.sdk.service.sql.CreateQueryRequestQuery`,
`databricks.sdk.service.sql.CreateQueryVisualizationsLegacyRequest`,
`databricks.sdk.service.sql.CreateVisualizationRequest`,
`databricks.sdk.service.sql.CreateVisualizationRequestVisualization`,
`databricks.sdk.service.sql.DatePrecision`,
`databricks.sdk.service.sql.DateRange`,
`databricks.sdk.service.sql.DateRangeValue`,
`databricks.sdk.service.sql.DateRangeValueDynamicDateRange`,
`databricks.sdk.service.sql.DateValue`,
`databricks.sdk.service.sql.DateValueDynamicDate`,
`databricks.sdk.service.sql.DeleteAlertsLegacyRequest`,
`databricks.sdk.service.sql.DeleteQueriesLegacyRequest`,
`databricks.sdk.service.sql.DeleteQueryVisualizationsLegacyRequest`,
`databricks.sdk.service.sql.DeleteVisualizationRequest`, `any`,
`databricks.sdk.service.sql.EncodedText`,
`databricks.sdk.service.sql.EncodedTextEncoding`,
`databricks.sdk.service.sql.EnumValue`,
`databricks.sdk.service.sql.GetAlertsLegacyRequest`,
`databricks.sdk.service.sql.GetQueriesLegacyRequest`,
`databricks.sdk.service.sql.LegacyAlert`,
`databricks.sdk.service.sql.LegacyAlertState`,
`databricks.sdk.service.sql.LegacyQuery`,
`databricks.sdk.service.sql.LegacyVisualization`,
`databricks.sdk.service.sql.LifecycleState`,
`databricks.sdk.service.sql.ListAlertsRequest`,
`databricks.sdk.service.sql.ListAlertsResponse`,
`databricks.sdk.service.sql.ListAlertsResponseAlert`,
`databricks.sdk.service.sql.ListQueriesLegacyRequest`,
`databricks.sdk.service.sql.ListQueryObjectsResponse`,
`databricks.sdk.service.sql.ListQueryObjectsResponseQuery`,
`databricks.sdk.service.sql.ListVisualizationsForQueryRequest`,
`databricks.sdk.service.sql.ListVisualizationsForQueryResponse`,
`databricks.sdk.service.sql.NumericValue`,
`databricks.sdk.service.sql.QueryBackedValue`,
`databricks.sdk.service.sql.QueryParameter`,
`databricks.sdk.service.sql.QuerySource`,
`databricks.sdk.service.sql.QuerySourceDriverInfo`,
`databricks.sdk.service.sql.QuerySourceEntryPoint`,
`databricks.sdk.service.sql.QuerySourceJobManager`,
`databricks.sdk.service.sql.QuerySourceTrigger`,
`databricks.sdk.service.sql.RestoreQueriesLegacyRequest`,
`databricks.sdk.service.sql.RunAsMode`,
`databricks.sdk.service.sql.ServerlessChannelInfo`,
`databricks.sdk.service.sql.StatementResponse`,
`databricks.sdk.service.sql.TextValue`,
`databricks.sdk.service.sql.TrashAlertRequest`,
`databricks.sdk.service.sql.TrashQueryRequest`,
`databricks.sdk.service.sql.UpdateAlertRequest`,
`databricks.sdk.service.sql.UpdateAlertRequestAlert`,
`databricks.sdk.service.sql.UpdateQueryRequest`,
`databricks.sdk.service.sql.UpdateQueryRequestQuery`,
`databricks.sdk.service.sql.UpdateVisualizationRequest` and
`databricks.sdk.service.sql.UpdateVisualizationRequestVisualization`
dataclasses.
* Added `force` field for
`databricks.sdk.service.catalog.DeleteSchemaRequest`.
* Added `max_results` and `page_token` fields for
`databricks.sdk.service.catalog.GetBindingsRequest`.
* Added `include_aliases` field for
`databricks.sdk.service.catalog.GetByAliasRequest`.
* Added `include_aliases` field for
`databricks.sdk.service.catalog.GetModelVersionRequest`.
* Added `include_aliases` field for
`databricks.sdk.service.catalog.GetRegisteredModelRequest`.
* Added `max_results` and `page_token` fields for
`databricks.sdk.service.catalog.ListSystemSchemasRequest`.
* Added `next_page_token` field for
`databricks.sdk.service.catalog.ListSystemSchemasResponse`.
* Added `aliases` field for
`databricks.sdk.service.catalog.ModelVersionInfo`.
* Added `next_page_token` field for
`databricks.sdk.service.catalog.WorkspaceBindingsResponse`.
* Added `version` field for
`databricks.sdk.service.compute.GetPolicyFamilyRequest`.
* Added `filter_by`, `page_size`, `page_token` and `sort_by` fields for
`databricks.sdk.service.compute.ListClustersRequest`.
* Added `next_page_token` and `prev_page_token` fields for
`databricks.sdk.service.compute.ListClustersResponse`.
* Added `page_token` field for
`databricks.sdk.service.jobs.GetRunRequest`.
* Added `iterations`, `next_page_token` and `prev_page_token` fields for
`databricks.sdk.service.jobs.Run`.
* Added `create_time`, `created_by`, `creator_username` and `scopes`
fields for
`databricks.sdk.service.oauth2.GetCustomAppIntegrationOutput`.
* Added `next_page_token` field for
`databricks.sdk.service.oauth2.GetCustomAppIntegrationsOutput`.
* Added `create_time` and `created_by` fields for
`databricks.sdk.service.oauth2.GetPublishedAppIntegrationOutput`.
* Added `next_page_token` field for
`databricks.sdk.service.oauth2.GetPublishedAppIntegrationsOutput`.
* Added `enable_local_disk_encryption` field for
`databricks.sdk.service.pipelines.PipelineCluster`.
* Added `whl` field for
`databricks.sdk.service.pipelines.PipelineLibrary`.
* Added `health` field for
`databricks.sdk.service.pipelines.PipelineStateInfo`.
* Added `ai21labs_api_key_plaintext` field for
`databricks.sdk.service.serving.Ai21LabsConfig`.
* Added `aws_access_key_id_plaintext` and
`aws_secret_access_key_plaintext` fields for
`databricks.sdk.service.serving.AmazonBedrockConfig`.
* Added `anthropic_api_key_plaintext` field for
`databricks.sdk.service.serving.AnthropicConfig`.
* Added `cohere_api_base` and `cohere_api_key_plaintext` fields for
`databricks.sdk.service.serving.CohereConfig`.
* Added `databricks_api_token_plaintext` field for
`databricks.sdk.service.serving.DatabricksModelServingConfig`.
* Added `google_cloud_vertex_ai_config` field for
`databricks.sdk.service.serving.ExternalModel`.
* Added `microsoft_entra_client_secret_plaintext` and
`openai_api_key_plaintext` fields for
`databricks.sdk.service.serving.OpenAiConfig`.
* Added `palm_api_key_plaintext` field for
`databricks.sdk.service.serving.PaLmConfig`.
* Added `expiration_time` field for
`databricks.sdk.service.sharing.CreateRecipient`.
* Added `next_page_token` field for
`databricks.sdk.service.sharing.GetRecipientSharePermissionsResponse`.
* Added `next_page_token` field for
`databricks.sdk.service.sharing.ListProviderSharesResponse`.
* Added `max_results` and `page_token` fields for
`databricks.sdk.service.sharing.ListProvidersRequest`.
* Added `next_page_token` field for
`databricks.sdk.service.sharing.ListProvidersResponse`.
* Added `max_results` and `page_token` fields for
`databricks.sdk.service.sharing.ListRecipientsRequest`.
* Added `next_page_token` field for
`databricks.sdk.service.sharing.ListRecipientsResponse`.
* Added `max_results` and `page_token` fields for
`databricks.sdk.service.sharing.ListSharesRequest`.
* Added `next_page_token` field for
`databricks.sdk.service.sharing.ListSharesResponse`.
* Added `max_results` and `page_token` fields for
`databricks.sdk.service.sharing.SharePermissionsRequest`.
* Added `expiration_time` field for
`databricks.sdk.service.sharing.UpdateRecipient`.
* Added `max_results` and `page_token` fields for
`databricks.sdk.service.sharing.UpdateSharePermissions`.
* Added `condition`, `create_time`, `custom_body`, `custom_subject`,
`display_name`, `lifecycle_state`, `owner_user_name`, `parent_path`,
`query_id`, `seconds_to_retrigger`, `trigger_time` and `update_time`
fields for `databricks.sdk.service.sql.Alert`.
* Added `id` field for `databricks.sdk.service.sql.GetAlertRequest`.
* Added `id` field for `databricks.sdk.service.sql.GetQueryRequest`.
* Added `page_token` field for
`databricks.sdk.service.sql.ListQueriesRequest`.
* Added `apply_auto_limit`, `catalog`, `create_time`, `display_name`,
`last_modifier_user_name`, `lifecycle_state`, `owner_user_name`,
`parameters`, `parent_path`, `query_text`, `run_as_mode`, `schema`,
`update_time` and `warehouse_id` fields for
`databricks.sdk.service.sql.Query`.
* Added `context_filter` field for
`databricks.sdk.service.sql.QueryFilter`.
* Added `query_source` field for `databricks.sdk.service.sql.QueryInfo`.
* Added `create_time`, `display_name`, `query_id`, `serialized_options`,
`serialized_query_plan` and `update_time` fields for
`databricks.sdk.service.sql.Visualization`.
* Changed `create()` method for
[a.budgets](https://databricks-sdk-py.readthedocs.io/en/latest/account/budgets.html)
account-level service to return
`databricks.sdk.service.billing.CreateBudgetConfigurationResponse`
dataclass.
* Changed `create()` method for
[a.budgets](https://databricks-sdk-py.readthedocs.io/en/latest/account/budgets.html)
account-level service. New request type is
`databricks.sdk.service.billing.CreateBudgetConfigurationRequest`
dataclass.
* Changed `delete()` method for
[a.budgets](https://databricks-sdk-py.readthedocs.io/en/latest/account/budgets.html)
account-level service. New request type is
`databricks.sdk.service.billing.DeleteBudgetConfigurationRequest`
dataclass.
* Changed `delete()` method for
[a.budgets](https://databricks-sdk-py.readthedocs.io/en/latest/account/budgets.html)
account-level service to return `any` dataclass.
* Changed `get()` method for
[a.budgets](https://databricks-sdk-py.readthedocs.io/en/latest/account/budgets.html)
account-level service. New request type is
`databricks.sdk.service.billing.GetBudgetConfigurationRequest`
dataclass.
* Changed `get()` method for
[a.budgets](https://databricks-sdk-py.readthedocs.io/en/latest/account/budgets.html)
account-level service to return
`databricks.sdk.service.billing.GetBudgetConfigurationResponse`
dataclass.
* Changed `list()` method for
[a.budgets](https://databricks-sdk-py.readthedocs.io/en/latest/account/budgets.html)
account-level service to return
`databricks.sdk.service.billing.ListBudgetConfigurationsResponse`
dataclass.
* Changed `list()` method for
[a.budgets](https://databricks-sdk-py.readthedocs.io/en/latest/account/budgets.html)
account-level service to require request of
`databricks.sdk.service.billing.ListBudgetConfigurationsRequest`
dataclass.
* Changed `update()` method for
[a.budgets](https://databricks-sdk-py.readthedocs.io/en/latest/account/budgets.html)
account-level service to return
`databricks.sdk.service.billing.UpdateBudgetConfigurationResponse`
dataclass.
* Changed `update()` method for
[a.budgets](https://databricks-sdk-py.readthedocs.io/en/latest/account/budgets.html)
account-level service. New request type is
`databricks.sdk.service.billing.UpdateBudgetConfigurationRequest`
dataclass.
* Changed `create()` method for
[a.custom_app_integration](https://databricks-sdk-py.readthedocs.io/en/latest/account/custom_app_integration.html)
account-level service with new required argument order.
* Changed `list()` method for
[a.custom_app_integration](https://databricks-sdk-py.readthedocs.io/en/latest/account/custom_app_integration.html)
account-level service to require request of
`databricks.sdk.service.oauth2.ListCustomAppIntegrationsRequest`
dataclass.
* Changed `list()` method for
[a.published_app_integration](https://databricks-sdk-py.readthedocs.io/en/latest/account/published_app_integration.html)
account-level service to require request of
`databricks.sdk.service.oauth2.ListPublishedAppIntegrationsRequest`
dataclass.
* Changed `delete()` method for
[a.workspace_assignment](https://databricks-sdk-py.readthedocs.io/en/latest/account/workspace_assignment.html)
account-level service to return `any` dataclass.
* Changed `update()` method for
[a.workspace_assignment](https://databricks-sdk-py.readthedocs.io/en/latest/account/workspace_assignment.html)
account-level service with new required argument order.
* Changed `create()` method for
[w.alerts](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/alerts.html)
workspace-level service. New request type is
`databricks.sdk.service.sql.CreateAlertRequest` dataclass.
* Changed `delete()` method for
[w.alerts](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/alerts.html)
workspace-level service to return `any` dataclass.
* Changed `delete()` method for
[w.alerts](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/alerts.html)
workspace-level service. New request type is
`databricks.sdk.service.sql.TrashAlertRequest` dataclass.
* Changed `get()` method for
[w.alerts](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/alerts.html)
workspace-level service with new required argument order.
* Changed `list()` method for
[w.alerts](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/alerts.html)
workspace-level service to return
`databricks.sdk.service.sql.ListAlertsResponse` dataclass.
* Changed `list()` method for
[w.alerts](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/alerts.html)
workspace-level service to require request of
`databricks.sdk.service.sql.ListAlertsRequest` dataclass.
* Changed `update()` method for
[w.alerts](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/alerts.html)
workspace-level service to return `databricks.sdk.service.sql.Alert`
dataclass.
* Changed `update()` method for
[w.alerts](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/alerts.html)
workspace-level service. New request type is
`databricks.sdk.service.sql.UpdateAlertRequest` dataclass.
* Changed `create()` and `edit()` methods for
[w.cluster_policies](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/cluster_policies.html)
workspace-level service with new required argument order.
* Changed `get()` method for
[w.model_versions](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/model_versions.html)
workspace-level service to return
`databricks.sdk.service.catalog.ModelVersionInfo` dataclass.
* Changed `migrate_permissions()` method for
[w.permission_migration](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/permission_migration.html)
workspace-level service. New request type is
`databricks.sdk.service.iam.MigratePermissionsRequest` dataclass.
* Changed `migrate_permissions()` method for
[w.permission_migration](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/permission_migration.html)
workspace-level service to return
`databricks.sdk.service.iam.MigratePermissionsResponse` dataclass.
* Changed `create()` method for
[w.queries](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/queries.html)
workspace-level service. New request type is
`databricks.sdk.service.sql.CreateQueryRequest` dataclass.
* Changed `delete()` method for
[w.queries](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/queries.html)
workspace-level service to return `any` dataclass.
* Changed `delete()` method for
[w.queries](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/queries.html)
workspace-level service. New request type is
`databricks.sdk.service.sql.TrashQueryRequest` dataclass.
* Changed `get()` method for
[w.queries](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/queries.html)
workspace-level service with new required argument order.
* Changed `list()` method for
[w.queries](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/queries.html)
workspace-level service to return
`databricks.sdk.service.sql.ListQueryObjectsResponse` dataclass.
* Changed `update()` method for
[w.queries](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/queries.html)
workspace-level service. New request type is
`databricks.sdk.service.sql.UpdateQueryRequest` dataclass.
* Changed `create()` method for
[w.query_visualizations](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/query_visualizations.html)
workspace-level service. New request type is
`databricks.sdk.service.sql.CreateVisualizationRequest` dataclass.
* Changed `delete()` method for
[w.query_visualizations](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/query_visualizations.html)
workspace-level service to return `any` dataclass.
* Changed `delete()` method for
[w.query_visualizations](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/query_visualizations.html)
workspace-level service. New request type is
`databricks.sdk.service.sql.DeleteVisualizationRequest` dataclass.
* Changed `update()` method for
[w.query_visualizations](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/query_visualizations.html)
workspace-level service. New request type is
`databricks.sdk.service.sql.UpdateVisualizationRequest` dataclass.
* Changed `list()` method for
[w.shares](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/shares.html)
workspace-level service to require request of
`databricks.sdk.service.sharing.ListSharesRequest` dataclass.
* Changed `execute_statement()` and `get_statement()` methods for
[w.statement_execution](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/statement_execution.html)
workspace-level service to return
`databricks.sdk.service.sql.StatementResponse` dataclass.
* Changed `securable_type` field for
`databricks.sdk.service.catalog.GetBindingsRequest` to
`databricks.sdk.service.catalog.GetBindingsSecurableType` dataclass.
* Changed `securable_type` field for
`databricks.sdk.service.catalog.UpdateWorkspaceBindingsParameters` to
`databricks.sdk.service.catalog.UpdateBindingsSecurableType` dataclass.
* Changed `name` field for `databricks.sdk.service.compute.CreatePolicy`
to no longer be required.
* Changed `name` field for `databricks.sdk.service.compute.EditPolicy`
to no longer be required.
* Changed `policy_family_id` field for
`databricks.sdk.service.compute.GetPolicyFamilyRequest` to `str`
dataclass.
* Changed `policy_families` field for
`databricks.sdk.service.compute.ListPolicyFamiliesResponse` to no longer
be required.
* Changed `definition`, `description`, `name` and `policy_family_id`
fields for `databricks.sdk.service.compute.PolicyFamily` to no longer be
required.
* Changed `permissions` field for
`databricks.sdk.service.iam.UpdateWorkspaceAssignments` to no longer be
required.
* Changed `access_control_list` field for
`databricks.sdk.service.jobs.CreateJob` to
`databricks.sdk.service.jobs.JobAccessControlRequestList` dataclass.
* Changed `access_control_list` field for
`databricks.sdk.service.jobs.SubmitRun` to
`databricks.sdk.service.jobs.JobAccessControlRequestList` dataclass.
* Changed `name` and `redirect_urls` fields for
`databricks.sdk.service.oauth2.CreateCustomAppIntegration` to no longer
be required.
* Changed `ingestion_definition` field for
`databricks.sdk.service.pipelines.CreatePipeline` to
`databricks.sdk.service.pipelines.IngestionPipelineDefinition`
dataclass.
* Changed `ingestion_definition` field for
`databricks.sdk.service.pipelines.EditPipeline` to
`databricks.sdk.service.pipelines.IngestionPipelineDefinition`
dataclass.
* Changed `ingestion_definition` field for
`databricks.sdk.service.pipelines.PipelineSpec` to
`databricks.sdk.service.pipelines.IngestionPipelineDefinition`
dataclass.
* Changed `ai21labs_api_key` field for
`databricks.sdk.service.serving.Ai21LabsConfig` to no longer be
required.
* Changed `aws_access_key_id` and `aws_secret_access_key` fields for
`databricks.sdk.service.serving.AmazonBedrockConfig` to no longer be
required.
* Changed `anthropic_api_key` field for
`databricks.sdk.service.serving.AnthropicConfig` to no longer be
required.
* Changed `cohere_api_key` field for
`databricks.sdk.service.serving.CohereConfig` to no longer be required.
* Changed `databricks_api_token` field for
`databricks.sdk.service.serving.DatabricksModelServingConfig` to no
longer be required.
* Changed `palm_api_key` field for
`databricks.sdk.service.serving.PaLmConfig` to no longer be required.
* Changed `tags` field for `databricks.sdk.service.sql.Query` to
`databricks.sdk.service.sql.List` dataclass.
* Changed `user_ids` and `warehouse_ids` fields for
`databricks.sdk.service.sql.QueryFilter` to
`databricks.sdk.service.sql.List` dataclass.
* Changed `results` field for `databricks.sdk.service.sql.QueryList` to
`databricks.sdk.service.sql.LegacyQueryList` dataclass.
* Changed `visualization` field for `databricks.sdk.service.sql.Widget`
to `databricks.sdk.service.sql.LegacyVisualization` dataclass.
* Removed
[w.apps](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/apps.html)
workspace-level service.
* Removed `restore()` method for
[w.queries](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/queries.html)
workspace-level service.
* Removed `databricks.sdk.service.marketplace.FilterType`,
`databricks.sdk.service.marketplace.ProviderIconFile`,
`databricks.sdk.service.marketplace.ProviderIconType`,
`databricks.sdk.service.marketplace.ProviderListingSummaryInfo`,
`databricks.sdk.service.marketplace.SortBy` and
`databricks.sdk.service.marketplace.VisibilityFilter` dataclasses.
* Removed `databricks.sdk.service.billing.Budget`,
`databricks.sdk.service.billing.BudgetAlert`,
`databricks.sdk.service.billing.BudgetList`,
`databricks.sdk.service.billing.BudgetWithStatus`,
`databricks.sdk.service.billing.BudgetWithStatusStatusDailyItem`,
`databricks.sdk.service.billing.DeleteBudgetRequest`, `any`,
`databricks.sdk.service.billing.GetBudgetRequest`, `any`,
`databricks.sdk.service.billing.WrappedBudget` and
`databricks.sdk.service.billing.WrappedBudgetWithStatus` dataclasses.
* Removed `any`, `databricks.sdk.service.iam.PermissionMigrationRequest`
and `databricks.sdk.service.iam.PermissionMigrationResponse`
dataclasses.
* Removed
`databricks.sdk.service.pipelines.ManagedIngestionPipelineDefinition`
dataclass.
* Removed `databricks.sdk.service.serving.App`,
`databricks.sdk.service.serving.AppDeployment`,
`databricks.sdk.service.serving.AppDeploymentArtifacts`,
`databricks.sdk.service.serving.AppDeploymentMode`,
`databricks.sdk.service.serving.AppDeploymentState`,
`databricks.sdk.service.serving.AppDeploymentStatus`,
`databricks.sdk.service.serving.AppEnvironment`,
`databricks.sdk.service.serving.AppState`,
`databricks.sdk.service.serving.AppStatus`,
`databricks.sdk.service.serving.CreateAppDeploymentRequest`,
`databricks.sdk.service.serving.CreateAppRequest`,
`databricks.sdk.service.serving.DeleteAppRequest`,
`databricks.sdk.service.serving.EnvVariable`,
`databricks.sdk.service.serving.GetAppDeploymentRequest`,
`databricks.sdk.service.serving.GetAppEnvironmentRequest`,
`databricks.sdk.service.serving.GetAppRequest`,
`databricks.sdk.service.serving.ListAppDeploymentsRequest`,
`databricks.sdk.service.serving.ListAppDeploymentsResponse`,
`databricks.sdk.service.serving.ListAppsRequest`,
`databricks.sdk.service.serving.ListAppsResponse`,
`databricks.sdk.service.serving.StartAppRequest`,
`databricks.sdk.service.serving.StopAppRequest`, `any` and
`databricks.sdk.service.serving.UpdateAppRequest` dataclasses.
* Removed `databricks.sdk.service.sql.CreateQueryVisualizationRequest`,
`databricks.sdk.service.sql.DeleteAlertRequest`,
`databricks.sdk.service.sql.DeleteQueryRequest`,
`databricks.sdk.service.sql.DeleteQueryVisualizationRequest`,
`databricks.sdk.service.sql.ExecuteStatementResponse`,
`databricks.sdk.service.sql.GetStatementResponse`,
`databricks.sdk.service.sql.RestoreQueryRequest`,
`databricks.sdk.service.sql.StatementId`,
`databricks.sdk.service.sql.UserId` and
`databricks.sdk.service.sql.WarehouseId` dataclasses.
* Removed `databricks.sdk.service.compute.PolicyFamilyId` dataclass.
* Removed `can_use_client` field for
`databricks.sdk.service.compute.ListClustersRequest`.
* Removed `is_ascending` and `sort_by` fields for
`databricks.sdk.service.marketplace.ListListingsRequest`.
* Removed `provider_summary` field for
`databricks.sdk.service.marketplace.Listing`.
* Removed `filters` field for
`databricks.sdk.service.marketplace.ListingSetting`.
* Removed `metastore_id` field for
`databricks.sdk.service.marketplace.ListingSummary`.
* Removed `is_ascending` and `sort_by` fields for
`databricks.sdk.service.marketplace.SearchListingsRequest`.
* Removed `created_at`, `last_triggered_at`, `name`, `options`,
`parent`, `query`, `rearm`, `updated_at` and `user` fields for
`databricks.sdk.service.sql.Alert`.
* Removed `alert_id` field for
`databricks.sdk.service.sql.GetAlertRequest`.
* Removed `query_id` field for
`databricks.sdk.service.sql.GetQueryRequest`.
* Removed `order`, `page` and `q` fields for
`databricks.sdk.service.sql.ListQueriesRequest`.
* Removed `include_metrics` field for
`databricks.sdk.service.sql.ListQueryHistoryRequest`.
* Removed `can_edit`, `created_at`, `data_source_id`, `is_archived`,
`is_draft`, `is_favorite`, `is_safe`, `last_modified_by`,
`last_modified_by_id`, `latest_query_data_id`, `name`, `options`,
`parent`, `permission_tier`, `query`, `query_hash`, `run_as_role`,
`updated_at`, `user`, `user_id` and `visualizations` fields for
`databricks.sdk.service.sql.Query`.
* Removed `statement_ids` field for
`databricks.sdk.service.sql.QueryFilter`.
* Removed `can_subscribe_to_live_query` field for
`databricks.sdk.service.sql.QueryInfo`.
* Removed `metadata_time_ms`, `planning_time_ms` and
`query_execution_time_ms` fields for
`databricks.sdk.service.sql.QueryMetrics`.
* Removed `created_at`, `description`, `name`, `options`, `query` and
`updated_at` fields for `databricks.sdk.service.sql.Visualization`.

OpenAPI SHA: f98c07f9c71f579de65d2587bb0292f83d10e55d, Date: 2024-08-12
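Many of the field additions above are `page_token`/`next_page_token` pairs, i.e. token-based pagination on the list endpoints. As a self-contained sketch of the consumption pattern these fields enable (the `iterate_pages` helper and the toy backend are purely illustrative, not the SDK's actual paginator):

```python
from typing import Callable, Iterator, List, Optional, Tuple


def iterate_pages(
    list_page: Callable[[Optional[str]], Tuple[List[dict], Optional[str]]],
) -> Iterator[dict]:
    """Drain a token-paginated listing: call list_page with the current
    page_token, yield its items, and stop once next_page_token is empty."""
    page_token: Optional[str] = None
    while True:
        items, page_token = list_page(page_token)
        for item in items:
            yield item
        if not page_token:
            return


# Toy backend standing in for a paginated list endpoint (illustrative only).
_DATA = [{"id": i} for i in range(5)]


def fake_list_page(token: Optional[str]) -> Tuple[List[dict], Optional[str]]:
    start = int(token or 0)
    chunk = _DATA[start:start + 2]
    next_token = str(start + 2) if start + 2 < len(_DATA) else None
    return chunk, next_token


print([x["id"] for x in iterate_pages(fake_list_page)])  # -> [0, 1, 2, 3, 4]
```

The SDK's generated `list()` methods typically wrap this loop for you; the sketch only shows what the new token fields mean on the wire.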
## Changes

This PR ensures that single quotes are properly escaped when passing
regex patterns used to match errors.

## Tests

Verified that the SDK can be generated correctly when the pattern
contains single quotes.

Note that `downstreams / compatibility (ucx, databrickslabs)` was
already failing and that this PR should not affect downstream consumers.

- [x] `make test` run locally
- [x] `make fmt` applied
- [x] relevant integration tests applied
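To illustrate the escaping problem this fixes (a minimal sketch, not the generator's actual code — the `quote_for_source` helper is hypothetical): when a pattern containing a single quote is interpolated into a single-quoted string literal in generated source, the quote must be escaped first, or the emitted code will not parse.

```python
def quote_for_source(pattern: str) -> str:
    """Return `pattern` as a single-quoted Python string literal.

    Backslashes are escaped first, then single quotes, so the result can
    be embedded safely in generated (non-raw) source code.
    """
    escaped = pattern.replace("\\", "\\\\").replace("'", "\\'")
    return "'" + escaped + "'"


# A matcher pattern containing a single quote, e.g. matching "can't find ...".
literal = quote_for_source(r"can't find \w+")
print(literal)  # -> 'can\'t find \\w+'
```

Evaluating the emitted literal round-trips back to the original pattern, which is exactly the property the generator needs.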
…alid semantic version: 0.33.1+420240816190912` (databricks#729)

## Changes

This PR fixes the SemVer regex to follow the official recommendation and
capture more patterns. It also ensures that versions are both SemVer and
PEP 440 compliant.

## Tests

- [x] `make test` run locally
- [x] `make fmt` applied
- [ ] relevant integration tests applied
### Bug Fixes

* Fixed regression introduced in v0.30.0 causing `ValueError: Invalid
semantic version: 0.33.1+420240816190912`
([databricks#729](databricks#729)).


### Internal Changes

* Escape single quotes in regex matchers
([databricks#727](databricks#727)).


### API Changes:

* Added
[w.policy_compliance_for_clusters](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/policy_compliance_for_clusters.html)
workspace-level service.
* Added
[w.policy_compliance_for_jobs](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/policy_compliance_for_jobs.html)
workspace-level service.
* Added
[w.resource_quotas](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/resource_quotas.html)
workspace-level service.
* Added `databricks.sdk.service.catalog.GetQuotaRequest`,
`databricks.sdk.service.catalog.GetQuotaResponse`,
`databricks.sdk.service.catalog.ListQuotasRequest`,
`databricks.sdk.service.catalog.ListQuotasResponse` and
`databricks.sdk.service.catalog.QuotaInfo` dataclasses.
* Added `databricks.sdk.service.compute.ClusterCompliance`,
`databricks.sdk.service.compute.ClusterSettingsChange`,
`databricks.sdk.service.compute.EnforceClusterComplianceRequest`,
`databricks.sdk.service.compute.EnforceClusterComplianceResponse`,
`databricks.sdk.service.compute.GetClusterComplianceRequest`,
`databricks.sdk.service.compute.GetClusterComplianceResponse`,
`databricks.sdk.service.compute.ListClusterCompliancesRequest` and
`databricks.sdk.service.compute.ListClusterCompliancesResponse`
dataclasses.
* Added
`databricks.sdk.service.jobs.EnforcePolicyComplianceForJobResponseJobClusterSettingsChange`,
`databricks.sdk.service.jobs.EnforcePolicyComplianceRequest`,
`databricks.sdk.service.jobs.EnforcePolicyComplianceResponse`,
`databricks.sdk.service.jobs.GetPolicyComplianceRequest`,
`databricks.sdk.service.jobs.GetPolicyComplianceResponse`,
`databricks.sdk.service.jobs.JobCompliance`,
`databricks.sdk.service.jobs.ListJobComplianceForPolicyResponse` and
`databricks.sdk.service.jobs.ListJobComplianceRequest` dataclasses.
* Added `fallback` field for
`databricks.sdk.service.catalog.CreateExternalLocation`.
* Added `fallback` field for
`databricks.sdk.service.catalog.ExternalLocationInfo`.
* Added `fallback` field for
`databricks.sdk.service.catalog.UpdateExternalLocation`.
* Added `job_run_id` field for `databricks.sdk.service.jobs.BaseRun`.
* Added `job_run_id` field for `databricks.sdk.service.jobs.Run`.
* Added `include_metrics` field for
`databricks.sdk.service.sql.ListQueryHistoryRequest`.
* Added `statement_ids` field for
`databricks.sdk.service.sql.QueryFilter`.
* Removed `databricks.sdk.service.sql.ContextFilter` dataclass.
* Removed `context_filter` field for
`databricks.sdk.service.sql.QueryFilter`.
* Removed `pipeline_id` and `pipeline_update_id` fields for
`databricks.sdk.service.sql.QuerySource`.

OpenAPI SHA: 3eae49b444cac5a0118a3503e5b7ecef7f96527a, Date: 2024-08-21
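The new policy-compliance services above report per-field "settings changes" (a previous value and a new value for each setting enforcement would rewrite). A hedged, self-contained sketch of what such a diff conveys — the `SettingsChange` dataclass and `diff_settings` helper are illustrative stand-ins, not the SDK's `ClusterSettingsChange`:

```python
from dataclasses import dataclass
from typing import Any, Dict, List


@dataclass
class SettingsChange:
    # Illustrative model of one enforcement change: the setting's name,
    # its value before enforcement, and its value after.
    field: str
    previous_value: Any
    new_value: Any


def diff_settings(current: Dict[str, Any], required: Dict[str, Any]) -> List[SettingsChange]:
    """Return the changes enforcement would make: every required setting
    whose current value differs from (or is missing in) the cluster config."""
    return [
        SettingsChange(k, current.get(k), v)
        for k, v in sorted(required.items())
        if current.get(k) != v
    ]


changes = diff_settings(
    {"spark_version": "13.3.x", "num_workers": 8},
    {"spark_version": "14.1.x", "num_workers": 8, "autotermination_minutes": 60},
)
print([(c.field, c.previous_value, c.new_value) for c in changes])
# -> [('autotermination_minutes', None, 60), ('spark_version', '13.3.x', '14.1.x')]
```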
…bricks#723)

## Changes
<!-- Summary of your changes that are easy to understand -->
`DatabricksCliTokenSource().token()` itself cannot be copied, so a deep
copy of `Config` fails. This change adds a copyable wrapper function so
that a deep copy of `Config` can be performed.
## Tests
<!-- 
How is this tested? Please see the checklist below and also describe any
other relevant tests
-->

- [ ] `make test` run locally
- [ ] `make fmt` applied
- [ ] relevant integration tests applied
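The general problem here is an object holding a member that `copy.deepcopy` cannot duplicate (a lock, a subprocess handle, an uncopyable bound callable). One generic way out — a sketch of the pattern, not the SDK's actual fix, which wraps the token call in a copyable function — is to share the uncopyable member by reference via `__deepcopy__`:

```python
import copy
import threading


class UncopyableTokenSource:
    """Stands in for a token source holding an uncopyable resource."""

    def __init__(self) -> None:
        self._lock = threading.Lock()  # deepcopy of a lock raises TypeError

    def token(self) -> str:
        with self._lock:
            return "secret-token"


class Config:
    def __init__(self, token_source: UncopyableTokenSource) -> None:
        self._token_source = token_source
        self.host = "https://example.cloud.databricks.com"  # illustrative field

    def __deepcopy__(self, memo: dict) -> "Config":
        # Deep-copy everything except the token source, which is shared
        # by reference: it cannot (and need not) be duplicated.
        new = self.__class__.__new__(self.__class__)
        memo[id(self)] = new
        for k, v in self.__dict__.items():
            if k == "_token_source":
                new.__dict__[k] = v
            else:
                new.__dict__[k] = copy.deepcopy(v, memo)
        return new


cfg = Config(UncopyableTokenSource())
clone = copy.deepcopy(cfg)  # would raise TypeError without __deepcopy__
print(clone.host, clone._token_source.token())
```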
… does actually work through integration tests (databricks#736)

Signed-off-by: Serge Smertin <[email protected]>
…tabricks#738)

## Changes
The current get_workspace_client test fails because the SP used by the
test does not have access to the first workspace listed. In the
[Go](https://github.com/databricks/databricks-sdk-go/blob/main/internal/account_client_test.go#L12)
&
[Java](https://github.com/databricks/databricks-sdk-java/blob/1b90e2318f8221ac0a6e4b56c9b0e4c286e38c9f/databricks-sdk-java/src/test/java/com/databricks/sdk/integration/AccountClientIT.java#L17)
SDKs, the corresponding test respects the `TEST_WORKSPACE_ID`
environment variable to determine which workspace to attempt to log in
to. This PR changes the test to use that environment variable as well.

## Tests
<!-- 
How is this tested? Please see the checklist below and also describe any
other relevant tests
-->

- [ ] `make test` run locally
- [ ] `make fmt` applied
- [ ] relevant integration tests applied
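The selection logic this describes can be sketched as follows (a hypothetical helper under stated assumptions — the actual test reads the variable directly; `pick_workspace_id` is not a real SDK function):

```python
import os
from typing import Dict, List, Optional


def pick_workspace_id(available: List[str], env: Optional[Dict[str, str]] = None) -> str:
    """Choose which workspace to log in to: honor TEST_WORKSPACE_ID when
    it is set and valid, otherwise fall back to the first listed workspace."""
    env = env if env is not None else dict(os.environ)
    preferred = env.get("TEST_WORKSPACE_ID")
    if preferred and preferred in available:
        return preferred
    return available[0]


print(pick_workspace_id(["111", "222"], {"TEST_WORKSPACE_ID": "222"}))  # -> 222
```

This avoids the failure mode described above, where the service principal lacks access to whichever workspace happens to be listed first.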
### Bug Fixes

* Fix `DatabricksConfig.copy` when authenticated with OAuth
([databricks#723](databricks#723)).


### Internal Changes

* Fix get_workspace_client test to match Go SDK behavior
([databricks#738](databricks#738)).
* Verify that `WorkspaceClient` created from `AccountClient` does
actually work through integration tests
([databricks#736](databricks#736)).
## Changes
Add Data Plane access documentation

## Tests
<!-- 
How is this tested? Please see the checklist below and also describe any
other relevant tests
-->

- [ ] `make test` run locally
- [ ] `make fmt` applied
- [ ] relevant integration tests applied
@gorskysd gorskysd self-requested a review February 18, 2025 20:57