
Commit b15c91a

Fix highlighting issue due to multi-segment imports in WorkspaceClient/AccountClient (#979)
## What changes are proposed in this pull request?

This PR fixes a reported highlighting problem with the way API clients are imported in `WorkspaceClient`/`AccountClient`. Specifically, it now imports the API clients using their module as a reference instead of a multi-segment reference.

See highlighting issue: ![image](https://github.com/user-attachments/assets/98829599-8ec4-4e4f-8a36-3f526bf0f493)

This PR also regenerates the SDK documentation that wasn't generated in the previous release.

## How is this tested?

Unit and integration tests, plus local verification that the objects are now properly identified.
1 parent 9e96bac commit b15c91a
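A rough illustration of the two import styles the PR description contrasts, using stdlib modules as stand-ins (the actual change is in the generated `databricks/sdk/__init__.py`, which is not reproduced in full on this page):

```python
# Multi-segment reference (before): members are reached through a dotted
# attribute chain, which some syntax highlighters mis-parse in generated code.
import os.path

join_before = os.path.join

# Module-as-reference (after): import the module itself once, then refer
# to its members through the short module name.
from os import path

join_after = path.join

# Both styles resolve to the same object; only the reference form differs.
print(join_before("a", "b") == join_after("a", "b"))  # → True
```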


58 files changed (+1745 −510 lines)

.codegen.json

Lines changed: 3 additions & 3 deletions
@@ -5,9 +5,9 @@
     "databricks/sdk/version.py": "__version__ = \"$VERSION\""
   },
   "toolchain": {
-    "required": ["python3"],
+    "required": ["python3.12"],
     "pre_setup": [
-      "python3 -m venv .databricks"
+      "python3.12 -m venv .databricks"
     ],
     "prepend_path": ".databricks/bin",
     "setup": [
@@ -17,7 +17,7 @@
       "make fmt",
       "pytest -m 'not integration' --cov=databricks --cov-report html tests",
       "pip install .",
-      "python docs/gen-client-docs.py"
+      "python3.12 docs/gen-client-docs.py"
     ]
   }
 }

NEXT_CHANGELOG.md

Lines changed: 3 additions & 0 deletions
@@ -11,6 +11,9 @@
 
 ### Bug Fixes
 
+- Fix a reported highlighting problem with the way API clients are imported in WorkspaceClient/AccountClient
+  ([#979](https://github.com/databricks/databricks-sdk-py/pull/979)).
+
 ### Documentation
 
 ### Internal Changes

databricks/sdk/__init__.py

Lines changed: 275 additions & 260 deletions
Some generated files are not rendered by default.

docs/account/iam/access_control.rst

Lines changed: 15 additions & 0 deletions
@@ -18,6 +18,11 @@
     :param resource: str
       The resource name for which assignable roles will be listed.
 
+      Examples | Summary
+      :--- | :---
+      `resource=accounts/<ACCOUNT_ID>` | A resource name for the account.
+      `resource=accounts/<ACCOUNT_ID>/groups/<GROUP_ID>` | A resource name for the group.
+      `resource=accounts/<ACCOUNT_ID>/servicePrincipals/<SP_ID>` | A resource name for the service principal.
+
     :returns: :class:`GetAssignableRolesForResourceResponse`
 
 
@@ -30,6 +35,12 @@
 
     :param name: str
       The ruleset name associated with the request.
+
+      Examples | Summary
+      :--- | :---
+      `name=accounts/<ACCOUNT_ID>/ruleSets/default` | A name for a rule set on the account.
+      `name=accounts/<ACCOUNT_ID>/groups/<GROUP_ID>/ruleSets/default` | A name for a rule set on the group.
+      `name=accounts/<ACCOUNT_ID>/servicePrincipals/<SERVICE_PRINCIPAL_APPLICATION_ID>/ruleSets/default` | A name for a rule set on the service principal.
     :param etag: str
       Etag used for versioning. The response is at least as fresh as the eTag provided. Etag is used for
       optimistic concurrency control as a way to help prevent simultaneous updates of a rule set from
@@ -38,6 +49,10 @@
       etag from a GET rule set request, and pass it with the PUT update request to identify the rule set
       version you are updating.
 
+      Examples | Summary
+      :--- | :---
+      `etag=` | An empty etag can only be used in GET to indicate no freshness requirements.
+      `etag=RENUAAABhSweA4NvVmmUYdiU717H3Tgy0UJdor3gE4a+mq/oj9NjAf8ZsQ==` | An etag encoding a specific version of the rule set to get or to be updated.
+
     :returns: :class:`RuleSetResponse`
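The resource and rule-set name formats documented above can be sketched as simple string composition; the account and group IDs below are made-up placeholders, not values from this commit:

```python
# Hypothetical IDs, for illustration only.
account_id = "12345"
group_id = "678"

# Resource names as documented: account-level, then nested group-level.
resource_account = f"accounts/{account_id}"
resource_group = f"accounts/{account_id}/groups/{group_id}"

# Rule-set names append '/ruleSets/default' to the resource name.
ruleset_account = f"{resource_account}/ruleSets/default"

print(ruleset_account)  # → accounts/12345/ruleSets/default
```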

docs/account/iam/workspace_assignment.rst

Lines changed: 4 additions & 4 deletions
@@ -47,9 +47,9 @@
 
         a = AccountClient()
 
-        workspace_id = os.environ["TEST_WORKSPACE_ID"]
+        workspace_id = os.environ["DUMMY_WORKSPACE_ID"]
 
-        all = a.workspace_assignment.list(list=workspace_id)
+        all = a.workspace_assignment.list(workspace_id=workspace_id)
 
     Get permission assignments.
 
@@ -80,9 +80,9 @@
 
         spn_id = spn.id
 
-        workspace_id = os.environ["TEST_WORKSPACE_ID"]
+        workspace_id = os.environ["DUMMY_WORKSPACE_ID"]
 
-        a.workspace_assignment.update(
+        _ = a.workspace_assignment.update(
             workspace_id=workspace_id,
             principal_id=spn_id,
             permissions=[iam.WorkspacePermission.USER],

docs/account/provisioning/storage.rst

Lines changed: 5 additions & 1 deletion
@@ -16,6 +16,7 @@
 
     .. code-block::
 
+        import os
         import time
 
         from databricks.sdk import AccountClient
@@ -25,8 +26,11 @@
 
         storage = a.storage.create(
             storage_configuration_name=f"sdk-{time.time_ns()}",
-            root_bucket_info=provisioning.RootBucketInfo(bucket_name=f"sdk-{time.time_ns()}"),
+            root_bucket_info=provisioning.RootBucketInfo(bucket_name=os.environ["TEST_ROOT_BUCKET"]),
         )
+
+        # cleanup
+        a.storage.delete(storage_configuration_id=storage.storage_configuration_id)
 
     Create new storage configuration.

docs/account/settings/index.rst

Lines changed: 5 additions & 1 deletion
@@ -9,9 +9,13 @@ Manage security settings for Accounts and Workspaces
 
     ip_access_lists
     network_connectivity
+    network_policies
     settings
     csp_enablement_account
     disable_legacy_features
     enable_ip_access_lists
     esm_enablement_account
-    personal_compute
+    llm_proxy_partner_powered_account
+    llm_proxy_partner_powered_enforce
+    personal_compute
+    workspace_network_configuration
Lines changed: 46 additions & 0 deletions
@@ -0,0 +1,46 @@
+``a.settings.llm_proxy_partner_powered_account``: Enable Partner Powered AI Features for Account
+================================================================================================
+.. currentmodule:: databricks.sdk.service.settings
+
+.. py:class:: LlmProxyPartnerPoweredAccountAPI
+
+    Determines if partner powered models are enabled or not for a specific account
+
+    .. py:method:: get( [, etag: Optional[str]]) -> LlmProxyPartnerPoweredAccount
+
+        Get the enable partner powered AI features account setting.
+
+        Gets the enable partner powered AI features account setting.
+
+        :param etag: str (optional)
+          etag used for versioning. The response is at least as fresh as the eTag provided. This is used for
+          optimistic concurrency control as a way to help prevent simultaneous writes of a setting overwriting
+          each other. It is strongly suggested that systems make use of the etag in the read -> delete pattern
+          to perform setting deletions in order to avoid race conditions. That is, get an etag from a GET
+          request, and pass it with the DELETE request to identify the rule set version you are deleting.
+
+        :returns: :class:`LlmProxyPartnerPoweredAccount`
+
+
+    .. py:method:: update(allow_missing: bool, setting: LlmProxyPartnerPoweredAccount, field_mask: str) -> LlmProxyPartnerPoweredAccount
+
+        Update the enable partner powered AI features account setting.
+
+        Updates the enable partner powered AI features account setting.
+
+        :param allow_missing: bool
+          This should always be set to true for Settings API. Added for AIP compliance.
+        :param setting: :class:`LlmProxyPartnerPoweredAccount`
+        :param field_mask: str
+          The field mask must be a single string, with multiple fields separated by commas (no spaces). The
+          field path is relative to the resource object, using a dot (`.`) to navigate sub-fields (e.g.,
+          `author.given_name`). Specification of elements in sequence or map fields is not allowed, as only
+          the entire collection field can be specified. Field names must exactly match the resource field
+          names.
+
+          A field mask of `*` indicates full replacement. It's recommended to always explicitly list the
+          fields being updated and avoid using `*` wildcards, as it can lead to unintended results if the API
+          changes in the future.
+
+        :returns: :class:`LlmProxyPartnerPoweredAccount`
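The field-mask format documented above (comma-separated paths with no spaces, `.` navigating sub-fields, `*` for full replacement) can be sketched as a small validator; this helper is an illustration only, not part of the SDK:

```python
import re

# One field path: identifier segments joined by dots, e.g. "author.given_name".
_FIELD = re.compile(r"^[A-Za-z_][A-Za-z0-9_]*(\.[A-Za-z_][A-Za-z0-9_]*)*$")


def is_valid_field_mask(mask: str) -> bool:
    """Hypothetical check against the documented field-mask rules."""
    if mask == "*":
        return True  # full replacement
    if " " in mask:
        return False  # spaces are not allowed between fields
    return all(_FIELD.match(part) for part in mask.split(","))


print(is_valid_field_mask("author.given_name"))      # → True
print(is_valid_field_mask("setting_name, enabled"))  # → False (contains a space)
```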
Lines changed: 47 additions & 0 deletions
Original file line numberDiff line numberDiff line change
@@ -0,0 +1,47 @@
1+
``a.settings.llm_proxy_partner_powered_enforce``: Enable Enforcement of Partner Powered AI Features
2+
===================================================================================================
3+
.. currentmodule:: databricks.sdk.service.settings
4+
5+
.. py:class:: LlmProxyPartnerPoweredEnforceAPI
6+
7+
Determines if the account-level partner-powered setting value is enforced upon the workspace-level
8+
partner-powered setting
9+
10+
.. py:method:: get( [, etag: Optional[str]]) -> LlmProxyPartnerPoweredEnforce
11+
12+
Get the enforcement status of partner powered AI features account setting.
13+
14+
Gets the enforcement status of partner powered AI features account setting.
15+
16+
:param etag: str (optional)
17+
etag used for versioning. The response is at least as fresh as the eTag provided. This is used for
18+
optimistic concurrency control as a way to help prevent simultaneous writes of a setting overwriting
19+
each other. It is strongly suggested that systems make use of the etag in the read -> delete pattern
20+
to perform setting deletions in order to avoid race conditions. That is, get an etag from a GET
21+
request, and pass it with the DELETE request to identify the rule set version you are deleting.
22+
23+
:returns: :class:`LlmProxyPartnerPoweredEnforce`
24+
25+
26+
.. py:method:: update(allow_missing: bool, setting: LlmProxyPartnerPoweredEnforce, field_mask: str) -> LlmProxyPartnerPoweredEnforce
27+
28+
Update the enforcement status of partner powered AI features account setting.
29+
30+
Updates the enable enforcement status of partner powered AI features account setting.
31+
32+
:param allow_missing: bool
33+
This should always be set to true for Settings API. Added for AIP compliance.
34+
:param setting: :class:`LlmProxyPartnerPoweredEnforce`
35+
:param field_mask: str
36+
The field mask must be a single string, with multiple fields separated by commas (no spaces). The
37+
field path is relative to the resource object, using a dot (`.`) to navigate sub-fields (e.g.,
38+
`author.given_name`). Specification of elements in sequence or map fields is not allowed, as only
39+
the entire collection field can be specified. Field names must exactly match the resource field
40+
names.
41+
42+
A field mask of `*` indicates full replacement. It’s recommended to always explicitly list the
43+
fields being updated and avoid using `*` wildcards, as it can lead to unintended results if the API
44+
changes in the future.
45+
46+
:returns: :class:`LlmProxyPartnerPoweredEnforce`
47+
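The etag-based optimistic concurrency described in these settings docs (read to obtain an etag, then write with that etag) can be simulated with an in-memory store; the class and version strings here are made up for illustration:

```python
class SettingStore:
    """In-memory stand-in for a versioned setting with etag checks."""

    def __init__(self):
        self.value = False
        self.etag = "v1"

    def get(self):
        # A read returns both the value and the current etag.
        return self.value, self.etag

    def update(self, new_value, etag):
        # A write carrying a stale etag is rejected; the caller must re-read.
        if etag != self.etag:
            raise RuntimeError("stale etag: re-read the setting and retry")
        self.value = new_value
        self.etag = f"v{int(self.etag[1:]) + 1}"


store = SettingStore()
_, etag = store.get()      # read: obtain the current etag
store.update(True, etag)   # write with the fresh etag succeeds
print(store.value)         # → True
```

A second write reusing the original `"v1"` etag would now raise, which is exactly the race the docs advise guarding against.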
Lines changed: 73 additions & 0 deletions
@@ -0,0 +1,73 @@
+``a.network_policies``: Network Policies
+========================================
+.. currentmodule:: databricks.sdk.service.settings
+
+.. py:class:: NetworkPoliciesAPI
+
+    These APIs manage network policies for this account. Network policies control which network destinations
+    can be accessed from the Databricks environment. Each Databricks account includes a default policy named
+    'default-policy'. 'default-policy' is associated with any workspace lacking an explicit network policy
+    assignment, and is automatically associated with each newly created workspace. 'default-policy' is
+    reserved and cannot be deleted, but it can be updated to customize the default network access rules for
+    your account.
+
+    .. py:method:: create_network_policy_rpc(network_policy: AccountNetworkPolicy) -> AccountNetworkPolicy
+
+        Create a network policy.
+
+        Creates a new network policy to manage which network destinations can be accessed from the Databricks
+        environment.
+
+        :param network_policy: :class:`AccountNetworkPolicy`
+
+        :returns: :class:`AccountNetworkPolicy`
+
+
+    .. py:method:: delete_network_policy_rpc(network_policy_id: str)
+
+        Delete a network policy.
+
+        Deletes a network policy. Cannot be called on 'default-policy'.
+
+        :param network_policy_id: str
+          The unique identifier of the network policy to delete.
+
+
+    .. py:method:: get_network_policy_rpc(network_policy_id: str) -> AccountNetworkPolicy
+
+        Get a network policy.
+
+        Gets a network policy.
+
+        :param network_policy_id: str
+          The unique identifier of the network policy to retrieve.
+
+        :returns: :class:`AccountNetworkPolicy`
+
+
+    .. py:method:: list_network_policies_rpc( [, page_token: Optional[str]]) -> Iterator[AccountNetworkPolicy]
+
+        List network policies.
+
+        Gets an array of network policies.
+
+        :param page_token: str (optional)
+          Pagination token to go to next page based on previous query.
+
+        :returns: Iterator over :class:`AccountNetworkPolicy`
+
+
+    .. py:method:: update_network_policy_rpc(network_policy_id: str, network_policy: AccountNetworkPolicy) -> AccountNetworkPolicy
+
+        Update a network policy.
+
+        Updates a network policy. This allows you to modify the configuration of a network policy.
+
+        :param network_policy_id: str
+          The unique identifier for the network policy.
+        :param network_policy: :class:`AccountNetworkPolicy`
+
+        :returns: :class:`AccountNetworkPolicy`
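The `page_token` pagination that `list_network_policies_rpc` documents can be sketched generically; `fetch_page` below is a stand-in for the real RPC, and the page data is invented for illustration:

```python
from typing import Callable, Iterator, Optional, Tuple

Page = Tuple[list, Optional[str]]


def paginate(fetch_page: Callable[[Optional[str]], Page]) -> Iterator[str]:
    """Yield items across pages, feeding each response's next token back in."""
    token: Optional[str] = None
    while True:
        items, token = fetch_page(token)
        yield from items
        if not token:  # no next-page token means we are done
            return


# Fake two-page backend standing in for the real list RPC.
pages = {
    None: (["policy-a", "policy-b"], "t1"),  # first page with a next token
    "t1": (["default-policy"], None),        # last page
}

print(list(paginate(lambda tok: pages[tok])))  # → ['policy-a', 'policy-b', 'default-policy']
```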
