Commit d810e1b

Merge branch 'main' into databricks-wif

2 parents: 5063be3 + 0bd31d6

49 files changed: +1181 −523 lines

.codegen/_openapi_sha

Lines changed: 1 addition & 1 deletion

```diff
@@ -1 +1 @@
-cd641c9dd4febe334b339dd7878d099dcf0eeab5
+94dc3e7289a19a90b167adf27316bd703a86f0eb
```

.release_metadata.json

Lines changed: 1 addition & 1 deletion

```diff
@@ -1,3 +1,3 @@
 {
-    "timestamp": "2025-03-11 15:30:51+0000"
+    "timestamp": "2025-03-21 07:12:02+0000"
 }
```

CHANGELOG.md

Lines changed: 39 additions & 0 deletions

```diff
@@ -1,5 +1,44 @@
 # Version changelog
 
+## Release v0.47.0
+
+### Bug Fixes
+
+* Ensure that refresh tokens are returned when using the `external-browser` credentials strategy.
+
+### API Changes
+* Added `abfss`, `dbfs`, `error_message`, `execution_duration_seconds`, `file`, `gcs`, `s3`, `status`, `volumes` and `workspace` fields for `databricks.sdk.service.compute.InitScriptInfoAndExecutionDetails`.
+* [Breaking] Added `forecast_granularity` field for `databricks.sdk.service.ml.CreateForecastingExperimentRequest`.
+* Added `jwks_uri` field for `databricks.sdk.service.oauth2.OidcFederationPolicy`.
+* Added `fallback_config` field for `databricks.sdk.service.serving.AiGatewayConfig`.
+* Added `custom_provider_config` field for `databricks.sdk.service.serving.ExternalModel`.
+* Added `fallback_config` field for `databricks.sdk.service.serving.PutAiGatewayRequest`.
+* Added `fallback_config` field for `databricks.sdk.service.serving.PutAiGatewayResponse`.
+* Added `aliases`, `comment`, `data_type`, `dependency_list`, `full_data_type`, `id`, `input_params`, `name`, `properties`, `routine_definition`, `schema`, `securable_kind`, `share`, `share_id`, `storage_location` and `tags` fields for `databricks.sdk.service.sharing.DeltaSharingFunction`.
+* Added `access_token_failure`, `allocation_timeout`, `allocation_timeout_node_daemon_not_ready`, `allocation_timeout_no_healthy_clusters`, `allocation_timeout_no_matched_clusters`, `allocation_timeout_no_ready_clusters`, `allocation_timeout_no_unallocated_clusters`, `allocation_timeout_no_warmed_up_clusters`, `aws_inaccessible_kms_key_failure`, `aws_instance_profile_update_failure`, `aws_invalid_key_pair`, `aws_invalid_kms_key_state`, `aws_resource_quota_exceeded`, `azure_packed_deployment_partial_failure`, `bootstrap_timeout_due_to_misconfig`, `budget_policy_limit_enforcement_activated`, `budget_policy_resolution_failure`, `cloud_account_setup_failure`, `cloud_operation_cancelled`, `cloud_provider_instance_not_launched`, `cloud_provider_launch_failure_due_to_misconfig`, `cloud_provider_resource_stockout_due_to_misconfig`, `cluster_operation_throttled`, `cluster_operation_timeout`, `control_plane_request_failure_due_to_misconfig`, `data_access_config_changed`, `disaster_recovery_replication`, `driver_eviction`, `driver_launch_timeout`, `driver_node_unreachable`, `driver_out_of_disk`, `driver_out_of_memory`, `driver_pod_creation_failure`, `driver_unexpected_failure`, `dynamic_spark_conf_size_exceeded`, `eos_spark_image`, `executor_pod_unscheduled`, `gcp_api_rate_quota_exceeded`, `gcp_forbidden`, `gcp_iam_timeout`, `gcp_inaccessible_kms_key_failure`, `gcp_insufficient_capacity`, `gcp_ip_space_exhausted`, `gcp_kms_key_permission_denied`, `gcp_not_found`, `gcp_resource_quota_exceeded`, `gcp_service_account_access_denied`, `gcp_service_account_not_found`, `gcp_subnet_not_ready`, `gcp_trusted_image_projects_violated`, `gke_based_cluster_termination`, `init_container_not_finished`, `instance_pool_max_capacity_reached`, `instance_pool_not_found`, `instance_unreachable_due_to_misconfig`, `internal_capacity_failure`, `invalid_aws_parameter`, `invalid_instance_placement_protocol`, `invalid_worker_image_failure`, `in_penalty_box`, `lazy_allocation_timeout`, `maintenance_mode`, `netvisor_setup_timeout`, `no_matched_k8s`, `no_matched_k8s_testing_tag`, `pod_assignment_failure`, `pod_scheduling_failure`, `resource_usage_blocked`, `secret_creation_failure`, `serverless_long_running_terminated`, `spark_image_download_throttled`, `spark_image_not_found`, `ssh_bootstrap_failure`, `storage_download_failure_due_to_misconfig`, `storage_download_failure_slow`, `storage_download_failure_throttled`, `unexpected_pod_recreation`, `user_initiated_vm_termination` and `workspace_update` enum values for `databricks.sdk.service.compute.TerminationReasonCode`.
+* Added `generated_sql_query_too_long_exception` and `missing_sql_query_exception` enum values for `databricks.sdk.service.dashboards.MessageErrorType`.
+* Added `balanced` enum value for `databricks.sdk.service.jobs.PerformanceTarget`.
+* Added `listing_resource` enum value for `databricks.sdk.service.marketplace.FileParentType`.
+* Added `app` enum value for `databricks.sdk.service.marketplace.MarketplaceFileType`.
+* Added `custom` enum value for `databricks.sdk.service.serving.ExternalModelProvider`.
+* [Breaking] Changed `create_experiment()` method for [w.forecasting](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/ml/forecasting.html) workspace-level service with new required argument order.
+* Changed `instance_type_id` field for `databricks.sdk.service.compute.NodeInstanceType` to be required.
+* Changed `category` field for `databricks.sdk.service.compute.NodeType` to be required.
+* [Breaking] Changed `functions` field for `databricks.sdk.service.sharing.ListProviderShareAssetsResponse` to type `databricks.sdk.service.sharing.DeltaSharingFunctionList` dataclass.
+* [Breaking] Changed waiter for [ClustersAPI.create](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/compute/clusters.html#databricks.sdk.service.compute.ClustersAPI.create) method.
+* [Breaking] Changed waiter for [ClustersAPI.delete](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/compute/clusters.html#databricks.sdk.service.compute.ClustersAPI.delete) method.
+* [Breaking] Changed waiter for [ClustersAPI.edit](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/compute/clusters.html#databricks.sdk.service.compute.ClustersAPI.edit) method.
+* [Breaking] Changed waiter for [ClustersAPI.get](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/compute/clusters.html#databricks.sdk.service.compute.ClustersAPI.get) method.
+* [Breaking] Changed waiter for [ClustersAPI.resize](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/compute/clusters.html#databricks.sdk.service.compute.ClustersAPI.resize) method.
+* [Breaking] Changed waiter for [ClustersAPI.restart](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/compute/clusters.html#databricks.sdk.service.compute.ClustersAPI.restart) method.
+* [Breaking] Changed waiter for [ClustersAPI.start](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/compute/clusters.html#databricks.sdk.service.compute.ClustersAPI.start) method.
+* [Breaking] Changed waiter for [ClustersAPI.update](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/compute/clusters.html#databricks.sdk.service.compute.ClustersAPI.update) method.
+* [Breaking] Removed `execution_details` and `script` fields for `databricks.sdk.service.compute.InitScriptInfoAndExecutionDetails`.
+* [Breaking] Removed `supports_elastic_disk` field for `databricks.sdk.service.compute.NodeType`.
+* [Breaking] Removed `data_granularity_quantity` and `data_granularity_unit` fields for `databricks.sdk.service.ml.CreateForecastingExperimentRequest`.
+* [Breaking] Removed `aliases`, `comment`, `data_type`, `dependency_list`, `full_data_type`, `id`, `input_params`, `name`, `properties`, `routine_definition`, `schema`, `securable_kind`, `share`, `share_id`, `storage_location` and `tags` fields for `databricks.sdk.service.sharing.Function`.
+
+
 ## Release v0.46.0
 
 ### New Features and Improvements
```

Makefile

Lines changed: 1 addition & 1 deletion

```diff
@@ -28,7 +28,7 @@ test:
 	pytest -m 'not integration and not benchmark' --cov=databricks --cov-report html tests
 
 integration:
-	pytest -n auto --dist loadgroup --cov=databricks --cov-report html tests/integration/test_auth.py
+	pytest -n auto -m 'integration and not benchmark' --reruns 2 --dist loadgroup --cov=databricks --cov-report html tests
 
 benchmark:
 	pytest -m 'benchmark' tests
```

NEXT_CHANGELOG.md

Lines changed: 1 addition & 3 deletions

```diff
@@ -1,6 +1,6 @@
 # NEXT CHANGELOG
 
-## Release v0.47.0
+## Release v0.48.0
 
 ### New Features and Improvements
 * Introduce support for Databricks Workload Identity Federation in GitHub workflows ([933](https://github.com/databricks/databricks-sdk-py/pull/933)).
@@ -10,8 +10,6 @@
 
 ### Bug Fixes
 
-* Ensure that refresh tokens are returned when using the `external-browser` credentials strategy.
-
 ### Documentation
 
 ### Internal Changes
```

databricks/sdk/credentials_provider.py

Lines changed: 10 additions & 3 deletions

```diff
@@ -325,8 +325,15 @@ def databricks_wif(cfg: "Config") -> Optional[CredentialsProvider]:
     - GitHub OIDC
     """
     supplier = GitHubOIDCTokenSupplier()
+
+    audience = cfg.token_audience
+    if audience is None and cfg.is_account_client:
+        audience = cfg.account_id
+    if audience is None and not cfg.is_account_client:
+        audience = cfg.oidc_endpoints.token_endpoint
+
     # Try to get an idToken. If no supplier returns a token, we cannot use this authentication mode.
-    idToken = supplier.get_oidc_token(cfg.token_audience)
+    idToken = supplier.get_oidc_token(audience)
     if not idToken:
         return None
 
@@ -351,11 +358,11 @@ def token_source_for(audience: str) -> TokenSource:
     )
 
     def refreshed_headers() -> Dict[str, str]:
-        token = token_source_for(cfg.token_audience).token()
+        token = token_source_for(audience).token()
         return {"Authorization": f"{token.token_type} {token.access_token}"}
 
     def token() -> Token:
-        return token_source_for(cfg.token_audience).token()
+        return token_source_for(audience).token()
 
     return OAuthCredentialsProvider(refreshed_headers, token)
```

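The audience-selection change above can be read in isolation: an explicitly configured `token_audience` wins, account-level clients fall back to the account ID, and workspace-level clients fall back to the OIDC token endpoint. A minimal sketch of that fallback order, using a hypothetical `FakeConfig` stand-in for the SDK's real `Config` class (which lives in `databricks/sdk/config.py` and exposes `oidc_endpoints.token_endpoint` rather than a flat field):

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class FakeConfig:
    """Hypothetical stand-in for databricks.sdk.Config, for illustration only."""
    token_audience: Optional[str] = None
    is_account_client: bool = False
    account_id: Optional[str] = None
    token_endpoint: Optional[str] = None  # stands in for cfg.oidc_endpoints.token_endpoint


def resolve_audience(cfg: FakeConfig) -> Optional[str]:
    # Mirrors the fallback order introduced in this commit:
    # 1. an explicitly configured token audience always wins;
    audience = cfg.token_audience
    # 2. account-level clients fall back to the account ID;
    if audience is None and cfg.is_account_client:
        audience = cfg.account_id
    # 3. workspace-level clients fall back to the OIDC token endpoint.
    if audience is None and not cfg.is_account_client:
        audience = cfg.token_endpoint
    return audience
```

Threading the resolved `audience` through `refreshed_headers()` and `token()` (instead of re-reading `cfg.token_audience`) keeps all three call sites agreeing on the same value.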
databricks/sdk/oidc_token_supplier.py

Lines changed: 0 additions & 4 deletions

```diff
@@ -26,8 +26,4 @@ def get_oidc_token(self, audience: str) -> Optional[str]:
         if "value" not in response_json:
             return None
 
-        # GitHub issued time is not allways in sync, and can give tokens which are not yet valid.
-        # TODO: Remove this after Databricks API is updated to handle such cases.
-        sleep(2)
-
         return response_json["value"]
```
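For context, the supplier that the `sleep(2)` workaround is removed from requests a short-lived ID token from GitHub Actions' OIDC provider. A minimal sketch of that flow, assuming the `ACTIONS_ID_TOKEN_REQUEST_URL` and `ACTIONS_ID_TOKEN_REQUEST_TOKEN` environment variables that GitHub Actions injects when a workflow has `id-token: write` permission (the real implementation in `databricks/sdk/oidc_token_supplier.py` is a class method and may differ in details):

```python
import json
import os
import urllib.parse
import urllib.request
from typing import Optional


def get_oidc_token(audience: str) -> Optional[str]:
    """Fetch a GitHub Actions OIDC ID token for the given audience, or None
    when not running inside a workflow with OIDC enabled."""
    request_url = os.environ.get("ACTIONS_ID_TOKEN_REQUEST_URL")
    request_token = os.environ.get("ACTIONS_ID_TOKEN_REQUEST_TOKEN")
    if not request_url or not request_token:
        return None  # not running in GitHub Actions, or `id-token: write` not granted
    # GitHub's request URL already carries a query string, so the audience is appended.
    url = f"{request_url}&audience={urllib.parse.quote(audience)}"
    req = urllib.request.Request(url, headers={"Authorization": f"Bearer {request_token}"})
    with urllib.request.urlopen(req) as resp:
        response_json = json.load(resp)
    return response_json.get("value")
```

Returning `None` outside a workflow is what lets `databricks_wif` fall through to the next credentials strategy instead of failing hard.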

databricks/sdk/service/catalog.py

Lines changed: 2 additions & 0 deletions

(Generated file; diff not rendered by default on GitHub.)
