Merged
2 changes: 1 addition & 1 deletion .codegen/_openapi_sha
@@ -1 +1 @@
-69902d1abe35bd9e78e0231927bf14d11b383a16
+129063c55cb0cf4bda0d561f0bdb7e77d00b9df6
2 changes: 1 addition & 1 deletion .gitattributes
@@ -1,7 +1,7 @@
databricks/sdk/__init__.py linguist-generated=true
databricks/sdk/errors/overrides.py linguist-generated=true
databricks/sdk/errors/platform.py linguist-generated=true
-databricks/sdk/service/aibuilder.py linguist-generated=true
+databricks/sdk/service/agentbricks.py linguist-generated=true
databricks/sdk/service/apps.py linguist-generated=true
databricks/sdk/service/billing.py linguist-generated=true
databricks/sdk/service/catalog.py linguist-generated=true
17 changes: 17 additions & 0 deletions NEXT_CHANGELOG.md
@@ -18,3 +18,20 @@
- Refactor unit tests for `FilesExt` to improve their readability.

### API Changes
* Added `databricks.sdk.service.agentbricks` package.
* Added `provisioning_phase` field for `databricks.sdk.service.database.SyncedTablePipelineProgress`.
* Added `redshift` and `sqldw` enum values for `databricks.sdk.service.pipelines.IngestionSourceType`.
* Added `germany_c5` enum value for `databricks.sdk.service.settings.ComplianceStandard`.
* [Breaking] Changed `asset_type` and `name` fields for `databricks.sdk.service.cleanrooms.CleanRoomAsset` to be required.
* [Breaking] Changed `local_name` field for `databricks.sdk.service.cleanrooms.CleanRoomAssetForeignTableLocalDetails` to be required.
* [Breaking] Changed `notebook_content` field for `databricks.sdk.service.cleanrooms.CleanRoomAssetNotebook` to be required.
* [Breaking] Changed `local_name` field for `databricks.sdk.service.cleanrooms.CleanRoomAssetTableLocalDetails` to be required.
* [Breaking] Changed `local_name` field for `databricks.sdk.service.cleanrooms.CleanRoomAssetViewLocalDetails` to be required.
* [Breaking] Changed `local_name` field for `databricks.sdk.service.cleanrooms.CleanRoomAssetVolumeLocalDetails` to be required.
* [Breaking] Removed `databricks.sdk.service.aibuilder` package.
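The package rename (`aibuilder` removed, `agentbricks` added) breaks import paths in downstream code. A hedged migration sketch — the `first_importable` helper is hypothetical and not part of the SDK; only the two Databricks module names are taken from this changelog:

```python
import importlib
from types import ModuleType


def first_importable(names) -> ModuleType:
    """Import and return the first module in `names` that exists."""
    for name in names:
        try:
            return importlib.import_module(name)
        except ModuleNotFoundError:
            continue
    raise ModuleNotFoundError(f"none of {names} could be imported")


# Code that must span SDK versions can fall back to the old module name:
# ai_service = first_importable([
#     "databricks.sdk.service.agentbricks",  # new name (this release)
#     "databricks.sdk.service.aibuilder",    # old name (removed here)
# ])
```

Once older releases no longer need support, pinning the SDK version and importing the new name directly is simpler.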
10 changes: 5 additions & 5 deletions databricks/sdk/__init__.py

Some generated files are not rendered by default. Learn more about how customized files appear on GitHub.

2 changes: 1 addition & 1 deletion databricks/sdk/oidc.py
@@ -195,7 +195,7 @@ def token(self) -> oauth.Token:
def _exchange_id_token(self, id_token: IdToken) -> oauth.Token:
    client = oauth.ClientCredentials(
        client_id=self._client_id,
-        client_secret="",
+        client_secret="",  # there are no (rotatable) secrets in the OIDC flow
        token_url=self._token_endpoint,
        endpoint_params={
            "subject_token_type": "urn:ietf:params:oauth:token-type:jwt",
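The empty `client_secret` makes sense in context: this flow is an OAuth 2.0 token exchange (RFC 8693), where the signed JWT itself authenticates the client, so there is no secret to rotate. A rough stdlib sketch of the form body such a request carries — the helper and its name are illustrative; only the `subject_token_type` value comes from the diff above:

```python
from urllib.parse import urlencode


def build_token_exchange_body(id_token: str) -> str:
    """Form-encode an RFC 8693 token-exchange request body.

    No client secret appears anywhere: the signed JWT passed as
    `subject_token` is the credential.
    """
    return urlencode({
        "grant_type": "urn:ietf:params:oauth:grant-type:token-exchange",
        "subject_token_type": "urn:ietf:params:oauth:token-type:jwt",
        "subject_token": id_token,
    })
```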


1 change: 1 addition & 0 deletions databricks/sdk/service/catalog.py


48 changes: 24 additions & 24 deletions databricks/sdk/service/cleanrooms.py


17 changes: 16 additions & 1 deletion databricks/sdk/service/database.py


2 changes: 2 additions & 0 deletions databricks/sdk/service/pipelines.py


1 change: 1 addition & 0 deletions databricks/sdk/service/settings.py


3 changes: 2 additions & 1 deletion databricks/sdk/service/sharing.py


5 changes: 4 additions & 1 deletion docs/account/iam/service_principals.rst
@@ -23,7 +23,10 @@

a = AccountClient()

-spn = a.service_principals.create(display_name=f"sdk-{time.time_ns()}")
+sp_create = a.service_principals.create(active=True, display_name=f"sdk-{time.time_ns()}")
+
+# cleanup
+a.service_principals.delete(id=sp_create.id)

Creates a new service principal in the Databricks account.

4 changes: 2 additions & 2 deletions docs/account/iam/workspace_assignment.rst
@@ -74,9 +74,9 @@

spn_id = spn.id

-workspace_id = os.environ["TEST_WORKSPACE_ID"]
+workspace_id = os.environ["DUMMY_WORKSPACE_ID"]

-a.workspace_assignment.update(
+_ = a.workspace_assignment.update(
workspace_id=workspace_id,
principal_id=spn_id,
permissions=[iam.WorkspacePermission.USER],
6 changes: 3 additions & 3 deletions docs/account/provisioning/credentials.rst
@@ -24,15 +24,15 @@

a = AccountClient()

-creds = a.credentials.create(
+role = a.credentials.create(
    credentials_name=f"sdk-{time.time_ns()}",
    aws_credentials=provisioning.CreateCredentialAwsCredentials(
-        sts_role=provisioning.CreateCredentialStsRole(role_arn=os.environ["TEST_LOGDELIVERY_ARN"])
+        sts_role=provisioning.CreateCredentialStsRole(role_arn=os.environ["TEST_CROSSACCOUNT_ARN"])
    ),
)

# cleanup
-a.credentials.delete(credentials_id=creds.credentials_id)
+a.credentials.delete(credentials_id=role.credentials_id)

Creates a Databricks credential configuration that represents cloud cross-account credentials for a
specified account. Databricks uses this to set up network infrastructure properly to host Databricks
7 changes: 4 additions & 3 deletions docs/account/provisioning/storage.rst
@@ -16,20 +16,21 @@

.. code-block::

+import os
import time

from databricks.sdk import AccountClient
from databricks.sdk.service import provisioning

a = AccountClient()

-bucket = a.storage.create(
+storage = a.storage.create(
    storage_configuration_name=f"sdk-{time.time_ns()}",
-    root_bucket_info=provisioning.RootBucketInfo(bucket_name=f"sdk-{time.time_ns()}"),
+    root_bucket_info=provisioning.RootBucketInfo(bucket_name=os.environ["TEST_ROOT_BUCKET"]),
)

# cleanup
-a.storage.delete(storage_configuration_id=storage.storage_configuration_id)
+a.storage.delete(storage_configuration_id=storage.storage_configuration_id)

Creates new storage configuration for an account, specified by ID. Uploads a storage configuration
object that represents the root AWS S3 bucket in your account. Databricks stores related workspace
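The rewritten docs examples now read configuration such as `TEST_ROOT_BUCKET` from environment variables. A small hedged helper (hypothetical, not part of the SDK) makes the failure mode explicit when a variable is unset, instead of surfacing a bare `KeyError`:

```python
import os


def require_env(name: str, env=os.environ) -> str:
    """Return the environment variable `name`, failing loudly if unset."""
    try:
        return env[name]
    except KeyError:
        raise RuntimeError(f"set {name} before running this example") from None


# e.g. bucket_name = require_env("TEST_ROOT_BUCKET")
```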