
Conversation

@tom-s-powell
Contributor

@tom-s-powell tom-s-powell commented Apr 4, 2025

What changes are proposed in this pull request?

When requests are retried, there is currently no information available to the caller to understand why. No exception is logged on retries, and on the final retry the cause is only available if an IOException was the original source of failure. DatabricksError is a more useful error to surface, as it captures the error returned by the server that was evaluated as retriable.

Currently the stacktrace appears as follows, which doesn't provide much information:

com.databricks.sdk.core.DatabricksException: Request GET /api/2.1/unity-catalog/tables?catalog_name=<REDACTED>&schema_name=<REDACTED> failed after 4 retries
	at com.databricks.sdk.core.ApiClient.executeInner(ApiClient.java:282)
	at com.databricks.sdk.core.ApiClient.getResponse(ApiClient.java:235)
	at com.databricks.sdk.core.ApiClient.execute(ApiClient.java:227)
	at com.databricks.sdk.core.ApiClient.GET(ApiClient.java:148)
	at com.databricks.sdk.service.catalog.TablesImpl.list(TablesImpl.java:47)
	at com.databricks.sdk.support.Paginator.flipNextPage(Paginator.java:58)
	at com.databricks.sdk.support.Paginator.<init>(Paginator.java:51)
	at com.databricks.sdk.service.catalog.TablesAPI.list(TablesAPI.java:102)
	at com.databricks.sdk.service.catalog.TablesAPI.list(TablesAPI.java:89)
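The intended behavior can be illustrated with a minimal, self-contained sketch (class and method names here are hypothetical, not the SDK's actual internals): the retry loop remembers the last failure and attaches it as the cause of the final exception, so the server error that triggered the retries is visible to the caller.

```java
import java.util.concurrent.Callable;

public class RetryDemo {
    // Stand-in for com.databricks.sdk.core.DatabricksException.
    static class DatabricksException extends RuntimeException {
        DatabricksException(String message, Throwable cause) {
            super(message, cause);
        }
    }

    // Hypothetical retry helper: one initial attempt plus maxRetries retries.
    static <T> T executeWithRetries(Callable<T> call, int maxRetries) {
        Throwable lastFailure = null;
        for (int attempt = 0; attempt <= maxRetries; attempt++) {
            try {
                return call.call();
            } catch (Exception e) {
                lastFailure = e; // remember why this attempt failed
            }
        }
        // Surface the last failure as the cause instead of discarding it.
        throw new DatabricksException(
            "Request failed after " + maxRetries + " retries", lastFailure);
    }

    public static void main(String[] args) {
        try {
            executeWithRetries(() -> {
                throw new IllegalStateException("429 Too Many Requests");
            }, 3);
        } catch (DatabricksException e) {
            // The retriable server error is now available to the caller.
            System.out.println(e.getCause().getMessage()); // prints "429 Too Many Requests"
        }
    }
}
```

With the cause attached, the stack trace above would end with a `Caused by:` section describing the underlying server error rather than stopping at the generic "failed after 4 retries" message.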

How is this tested?

Unit tests added.

@renaudhartert-db renaudhartert-db self-requested a review April 7, 2025 16:18
@tom-s-powell tom-s-powell changed the title Catprue DatabricksError when retrying API calls Capture DatabricksError when retrying API calls Apr 9, 2025
@renaudhartert-db
Contributor

Hi @tom-s-powell, thanks for the PR! Before we proceed, could you make sure that all your commits are verified? Unverified commits are not allowed in this repository.

@github-actions

github-actions bot commented May 9, 2025

If integration tests don't run automatically, an authorized user can run them manually by following the instructions below:

Trigger:
go/deco-tests-run/sdk-java

Inputs:

  • PR number: 427
  • Commit SHA: f3c5b31996686cbbd8ca93cde84c1366d1df355d

Checks will be approved automatically on success.

Contributor

@renaudhartert-db renaudhartert-db left a comment


LGTM! Thanks for the contribution. I'll take care of running the test and merging it.

@renaudhartert-db renaudhartert-db enabled auto-merge May 9, 2025 11:35
@renaudhartert-db renaudhartert-db added this pull request to the merge queue May 9, 2025
Merged via the queue into databricks:main with commit 14e4c63 May 9, 2025
15 checks passed
deco-sdk-tagging bot added a commit that referenced this pull request May 19, 2025
## Release v0.50.0

### Internal Changes
* Capture DatabricksError when retrying API calls ([#427](#427)).

### API Changes
* Added `accountClient.llmProxyPartnerPoweredAccount()` service, `accountClient.llmProxyPartnerPoweredEnforce()` service and `workspaceClient.llmProxyPartnerPoweredWorkspace()` service.
* Added `workspaceClient.databaseInstances()` service.
* Added `createProvisionedThroughputEndpoint()` and `updateProvisionedThroughputEndpointConfig()` methods for `workspaceClient.servingEndpoints()` service.
* Added `catalogName` field for `com.databricks.sdk.service.catalog.EnableRequest`.
* Added `sourceType` field for `com.databricks.sdk.service.pipelines.IngestionPipelineDefinition`.
* Added `glob` field for `com.databricks.sdk.service.pipelines.PipelineLibrary`.
* Added `provisionedModelUnits` field for `com.databricks.sdk.service.serving.ServedEntityInput`.
* Added `provisionedModelUnits` field for `com.databricks.sdk.service.serving.ServedEntityOutput`.
* Added `provisionedModelUnits` field for `com.databricks.sdk.service.serving.ServedModelInput`.
* Added `provisionedModelUnits` field for `com.databricks.sdk.service.serving.ServedModelOutput`.
* Added `DESCRIBE_QUERY_INVALID_SQL_ERROR`, `DESCRIBE_QUERY_TIMEOUT`, `DESCRIBE_QUERY_UNEXPECTED_FAILURE`, `INVALID_CHAT_COMPLETION_ARGUMENTS_JSON_EXCEPTION`, `INVALID_SQL_MULTIPLE_DATASET_REFERENCES_EXCEPTION`, `INVALID_SQL_MULTIPLE_STATEMENTS_EXCEPTION` and `INVALID_SQL_UNKNOWN_TABLE_EXCEPTION` enum values for `com.databricks.sdk.service.dashboards.MessageErrorType`.
* Added `CAN_CREATE` and `CAN_MONITOR_ONLY` enum values for `com.databricks.sdk.service.iam.PermissionLevel`.
* Added `SUCCESS_WITH_FAILURES` enum value for `com.databricks.sdk.service.jobs.TerminationCodeCode`.
* Added `INFRASTRUCTURE_MAINTENANCE` enum value for `com.databricks.sdk.service.pipelines.StartUpdateCause`.
* Added `INFRASTRUCTURE_MAINTENANCE` enum value for `com.databricks.sdk.service.pipelines.UpdateInfoCause`.
* [Breaking] Changed `createAlert()` and `updateAlert()` methods for `workspaceClient.alertsV2()` service with new required argument order.
* [Breaking] Changed `set()` method for `workspaceClient.permissions()` service. New request type is `com.databricks.sdk.service.iam.SetObjectPermissions` class.
* [Breaking] Changed `update()` method for `workspaceClient.permissions()` service. New request type is `com.databricks.sdk.service.iam.UpdateObjectPermissions` class.
* [Breaking] Changed `get()` method for `workspaceClient.workspaceBindings()` service to return `com.databricks.sdk.service.catalog.GetCatalogWorkspaceBindingsResponse` class.
* [Breaking] Changed `getBindings()` method for `workspaceClient.workspaceBindings()` service to return `com.databricks.sdk.service.catalog.GetWorkspaceBindingsResponse` class.
* [Breaking] Changed `update()` method for `workspaceClient.workspaceBindings()` service to return `com.databricks.sdk.service.catalog.UpdateCatalogWorkspaceBindingsResponse` class.
* [Breaking] Changed `updateBindings()` method for `workspaceClient.workspaceBindings()` service to return `com.databricks.sdk.service.catalog.UpdateWorkspaceBindingsResponse` class.
* [Breaking] Changed `securableType` field for `com.databricks.sdk.service.catalog.GetBindingsRequest` to type `String` class.
* Changed `schema` and `state` fields for `com.databricks.sdk.service.catalog.SystemSchemaInfo` to be required.
* [Breaking] Changed `state` field for `com.databricks.sdk.service.catalog.SystemSchemaInfo` to type `String` class.
* [Breaking] Changed `securableType` field for `com.databricks.sdk.service.catalog.UpdateWorkspaceBindingsParameters` to type `String` class.
* [Breaking] Changed `workspaceId` field for `com.databricks.sdk.service.catalog.WorkspaceBinding` to be required.
* [Breaking] Changed `gpuNodePoolId` field for `com.databricks.sdk.service.jobs.ComputeConfig` to no longer be required.
* [Breaking] Changed `alert` field for `com.databricks.sdk.service.sql.CreateAlertV2Request` to be required.
* [Breaking] Changed `alert` field for `com.databricks.sdk.service.sql.UpdateAlertV2Request` to be required.
* [Breaking] Removed `nodeTypeFlexibility` field for `com.databricks.sdk.service.compute.EditInstancePool`.
* [Breaking] Removed `nodeTypeFlexibility` field for `com.databricks.sdk.service.compute.GetInstancePool`.
* [Breaking] Removed `nodeTypeFlexibility` field for `com.databricks.sdk.service.compute.InstancePoolAndStats`.
* [Breaking] Removed `CATALOG`, `CREDENTIAL`, `EXTERNAL_LOCATION` and `STORAGE_CREDENTIAL` enum values for `com.databricks.sdk.service.catalog.GetBindingsSecurableType`.
* [Breaking] Removed `AVAILABLE`, `DISABLE_INITIALIZED`, `ENABLE_COMPLETED`, `ENABLE_INITIALIZED` and `UNAVAILABLE` enum values for `com.databricks.sdk.service.catalog.SystemSchemaInfoState`.
* [Breaking] Removed `CATALOG`, `CREDENTIAL`, `EXTERNAL_LOCATION` and `STORAGE_CREDENTIAL` enum values for `com.databricks.sdk.service.catalog.UpdateBindingsSecurableType`.
