
Commit 18b0585: doc fixes (#2987)

Parent: 40d08fa

File tree: 10 files changed, +25 / -18 lines


docs/guides/aws-private-link-workspace.md

Lines changed: 0 additions & 2 deletions

````diff
@@ -300,5 +300,3 @@ resource "databricks_mws_workspaces" "this" {
   depends_on = [databricks_mws_networks.this]
 }
 ```
-
-
````

docs/guides/gcp-private-service-connect-workspace.md

Lines changed: 3 additions & 2 deletions

```diff
@@ -15,6 +15,7 @@ To work with Databricks in GCP in an automated way, please create a service acco
 The very first step is VPC creation with the necessary resources. Please consult [main documentation page](https://docs.gcp.databricks.com/administration-guide/cloud-configurations/gcp/customer-managed-vpc.html) for **the most complete and up-to-date details on networking**. A GCP VPC is registered as [databricks_mws_networks](../resources/mws_networks.md) resource.
 
 To enable [back-end Private Service Connect (data plane to control plane)](https://docs.gcp.databricks.com/administration-guide/cloud-configurations/gcp/private-service-connect.html#two-private-service-connect-options), configure the network with the two back-end VPC endpoints:
+
 - Back-end VPC endpoint for [Secure cluster connectivity](https://docs.gcp.databricks.com/security/secure-cluster-connectivity.html) relay
 - Back-end VPC endpoint for REST APIs
 
@@ -98,7 +99,7 @@ resource "databricks_mws_networks" "this" {
 
 ## Creating a Databricks Workspace
 
-Once [the VPC](#creating-a-vpc) is set up, you can create Databricks workspace through [databricks_mws_workspaces](../resources/mws_workspaces.md) resource.
+Once [the VPC](#creating-a-vpc-network) is set up, you can create Databricks workspace through [databricks_mws_workspaces](../resources/mws_workspaces.md) resource.
 
 For a workspace to support any of the Private Service Connect connectivity scenarios, the workspace must be created with an attached [databricks_mws_private_access_settings](../resources/mws_private_access_settings.md) resource.
 
@@ -126,7 +127,7 @@ resource "databricks_mws_workspaces" "this" {
     }
   }
 
-  private_service_connect_id = databricks_mws_private_access_settings.pas.private_access_settings_id
+  private_access_settings_id = databricks_mws_private_access_settings.pas.private_access_settings_id
   network_id = databricks_mws_networks.this.network_id
   gke_config {
     connectivity_type = "PRIVATE_NODE_PUBLIC_MASTER"
```
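For context, a minimal sketch of how the corrected `private_access_settings_id` argument wires a [databricks_mws_private_access_settings] resource into a workspace. The `pas` and `this` resource names follow the diff; the variables and name values are illustrative assumptions, and most workspace arguments are elided:

```hcl
resource "databricks_mws_private_access_settings" "pas" {
  account_id                   = var.databricks_account_id # assumed variable
  private_access_settings_name = "pas-${var.prefix}"       # illustrative name
  region                       = var.region
  public_access_enabled        = true
}

resource "databricks_mws_workspaces" "this" {
  # ... other workspace arguments elided ...

  # The diff renames this argument from private_service_connect_id:
  private_access_settings_id = databricks_mws_private_access_settings.pas.private_access_settings_id
  network_id                 = databricks_mws_networks.this.network_id

  gke_config {
    connectivity_type = "PRIVATE_NODE_PUBLIC_MASTER"
  }
}
```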

docs/guides/unity-catalog-azure.md

Lines changed: 3 additions & 0 deletions

```diff
@@ -7,6 +7,9 @@ page_title: "Unity Catalog set up on Azure"
 **Note**
 If your workspace was enabled for Unity Catalog automatically, this guide does not apply to you.
 
+**Note**
+Except for metastore, metastore assignment and storage credential objects, Unity Catalog APIs are accessible via **workspace-level APIs**. This design may change in the future.
+
 Databricks Unity Catalog brings fine-grained governance and security to Lakehouse data using a familiar, open interface. You can use Terraform to deploy the underlying cloud resources and Unity Catalog objects automatically, using a programmatic approach.
 
 This guide creates a metastore without a storage root location or credential to maintain strict separation of storage across catalogs or environments.
```

docs/guides/unity-catalog-gcp.md

Lines changed: 3 additions & 0 deletions

```diff
@@ -7,6 +7,9 @@ page_title: "Unity Catalog set up on Google Cloud"
 **Note**
 If your workspace was enabled for Unity Catalog automatically, this guide does not apply to you.
 
+**Note**
+Except for metastore, metastore assignment and storage credential objects, Unity Catalog APIs are accessible via **workspace-level APIs**. This design may change in the future.
+
 Databricks Unity Catalog brings fine-grained governance and security to Lakehouse data using a familiar, open interface. You can use Terraform to deploy the underlying cloud resources and Unity Catalog objects automatically, using a programmatic approach.
 
 This guide creates a metastore without a storage root location or credential to maintain strict separation of storage across catalogs or environments.
```

docs/guides/unity-catalog.md

Lines changed: 3 additions & 0 deletions

```diff
@@ -7,6 +7,9 @@ page_title: "Unity Catalog set up on AWS"
 **Note**
 If your workspace was enabled for Unity Catalog automatically, this guide does not apply to you.
 
+**Note**
+Except for metastore, metastore assignment and storage credential objects, Unity Catalog APIs are accessible via **workspace-level APIs**. This design may change in the future.
+
 Databricks Unity Catalog brings fine-grained governance and security to Lakehouse data using a familiar, open interface. You can use Terraform to deploy the underlying cloud resources and Unity Catalog objects automatically, using a programmatic approach.
 
 This guide creates a metastore without a storage root location or credential to maintain strict separation of storage across catalogs or environments.
```
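The note added to all three guides implies that most Unity Catalog objects are provisioned through a provider configured against a workspace host rather than the account console. A hedged sketch of that pattern; the alias, host URL, and object names are illustrative assumptions:

```hcl
# Most Unity Catalog objects (catalogs, schemas, grants, ...) go through
# workspace-level APIs, so configure a provider against a workspace host.
provider "databricks" {
  alias = "workspace"                                 # illustrative alias
  host  = "https://myworkspace.cloud.databricks.com"  # hypothetical workspace URL
}

resource "databricks_catalog" "sandbox" {
  provider = databricks.workspace
  name     = "sandbox" # illustrative catalog name
  comment  = "managed by terraform"
}
```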

docs/index.md

Lines changed: 3 additions & 3 deletions

```diff
@@ -139,7 +139,7 @@ There are currently a number of supported methods to [authenticate](https://docs
 
 ### Authenticating with Databricks CLI credentials
 
-No configuration options given to your provider will look up configured credentials in `~/.databrickscfg` file. It is created by the `databricks configure --token` command. Check [this page](https://docs.databricks.com/dev-tools/cli/index.html#set-up-authentication)
+If no configuration option is given, the provider will look up configured credentials in `~/.databrickscfg` file. It is created by the `databricks configure --token` command. Check [this page](https://docs.databricks.com/dev-tools/cli/index.html#set-up-authentication)
 for more details. The provider uses config file credentials only when `host`/`token` or `azure_auth` options are not specified.
 It is the recommended way to use Databricks Terraform provider, in case you're already using the same approach with
 [AWS Shared Credentials File](https://www.terraform.io/docs/providers/aws/index.html#shared-credentials-file)
@@ -257,7 +257,7 @@ resource "databricks_group" "cluster_admin" {
 
 ## Special configurations for Azure
 
-The provider works with [Azure CLI authentication](https://docs.microsoft.com/en-us/cli/azure/authenticate-azure-cli?view=azure-cli-latest) to facilitate local development workflows, though for automated scenarios a service principal auth is necessary (and specification of `azure_use_msi`, `azure_client_id`, `azure_client_secret` and `azure_tenant_id` parameters).
+The below Azure authentication options are supported at both the account and workspace levels. The provider works with [Azure CLI authentication](https://docs.microsoft.com/en-us/cli/azure/authenticate-azure-cli?view=azure-cli-latest) to facilitate local development workflows, though for automated scenarios, managed identity or service principal auth is recommended (and specification of `azure_use_msi`, `azure_client_id`, `azure_client_secret` and `azure_tenant_id` parameters).
 
 ### Authenticating with Azure MSI
 
@@ -349,7 +349,7 @@ The provider works with [Google Cloud CLI authentication](https://cloud.google.c
 
 Except for metastore, metastore assignment and storage credential objects, Unity Catalog APIs are accessible via **workspace-level APIs**. This design may change in the future.
 
-If you are configuring a new Databricks account for the first time, please create at least one workspace and with an identity (user or service principal) that you intend to use for Unity Catalog rollout. You can then configure the provider using that identity and workspace to provision the required Unity Catalog resources.
+If you are configuring a new Databricks account for the first time, please create at least one workspace with an identity (user or service principal) that you intend to use for Unity Catalog rollout. You can then configure the provider using that identity and workspace to provision the required Unity Catalog resources.
 
 ## Miscellaneous configuration parameters
```
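The corrected sentence about CLI credentials describes an empty provider block falling back to the config file. A minimal sketch; the named-profile variant is an assumption for non-default profiles:

```hcl
# With no options set, the provider falls back to credentials in
# ~/.databrickscfg, created by `databricks configure --token`.
provider "databricks" {}

# A named profile from the same file can be selected explicitly:
provider "databricks" {
  alias   = "other"
  profile = "my-profile" # hypothetical profile name
}
```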

docs/resources/catalog.md

Lines changed: 4 additions & 3 deletions

```diff
@@ -41,6 +41,7 @@ The following arguments are required:
 In addition to all arguments above, the following attributes are exported:
 
 * `id` - ID of this catalog - same as the `name`.
+* `metastore_id` - ID of the metastore.
 
 ## Import
 
@@ -54,6 +55,6 @@ terraform import databricks_catalog.this <name>
 
 The following resources are used in the same context:
 
-* [databricks_table](../data-sources/tables.md) data to list tables within Unity Catalog.
-* [databricks_schema](../data-sources/schemas.md) data to list schemas within Unity Catalog.
-* [databricks_catalog](../data-sources/catalogs.md) data to list catalogs within Unity Catalog.
+* [databricks_tables](../data-sources/tables.md) data to list tables within Unity Catalog.
+* [databricks_schemas](../data-sources/schemas.md) data to list schemas within Unity Catalog.
+* [databricks_catalogs](../data-sources/catalogs.md) data to list catalogs within Unity Catalog.
```
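The corrected (plural) data source names as they might be used together, a sketch; the catalog and schema names are hypothetical:

```hcl
# List all catalogs in the metastore, then drill into one of them.
data "databricks_catalogs" "all" {}

data "databricks_schemas" "sandbox" {
  catalog_name = "sandbox" # hypothetical catalog name
}

data "databricks_tables" "sandbox_things" {
  catalog_name = "sandbox"
  schema_name  = "things" # hypothetical schema name
}
```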

docs/resources/job.md

Lines changed: 0 additions & 2 deletions

````diff
@@ -91,8 +91,6 @@ The resource supports the following arguments:
 ```
 
 * `library` - (Optional) (Set) An optional list of libraries to be installed on the cluster that will execute the job. Please consult [libraries section](cluster.md#libraries) for [databricks_cluster](cluster.md) resource.
-* `retry_on_timeout` - (Optional) (Bool) An optional policy to specify whether to retry a job when it times out. The default behavior is to not retry on timeout.
-* `max_retries` - (Optional) (Integer) An optional maximum number of times to retry an unsuccessful run. A run is considered to be unsuccessful if it completes with a `FAILED` or `INTERNAL_ERROR` lifecycle state. The value -1 means to retry indefinitely and the value 0 means to never retry. The default behavior is to never retry.
 * `timeout_seconds` - (Optional) (Integer) An optional timeout applied to each run of this job. The default behavior is to have no timeout.
 * `min_retry_interval_millis` - (Optional) (Integer) An optional minimal interval in milliseconds between the start of the failed run and the subsequent retry run. The default behavior is that unsuccessful runs are immediately retried.
 * `max_concurrent_runs` - (Optional) (Integer) An optional maximum allowed number of concurrent runs of the job. Defaults to *1*.
````
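A sketch of the run-control arguments that remain documented at the job level after this change; the job name is illustrative and the task/cluster configuration is elided:

```hcl
resource "databricks_job" "this" {
  name                = "nightly-etl" # illustrative job name
  timeout_seconds     = 3600          # each run is cancelled after an hour
  max_concurrent_runs = 1             # the documented default

  # ... task and cluster configuration elided ...
}
```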

docs/resources/provider.md

Lines changed: 3 additions & 3 deletions

```diff
@@ -46,6 +46,6 @@ In addition to all arguments above, the following attributes are exported:
 
 The following resources are used in the same context:
 
-* [databricks_table](../data-sources/tables.md) data to list tables within Unity Catalog.
-* [databricks_schema](../data-sources/schemas.md) data to list schemas within Unity Catalog.
-* [databricks_catalog](../data-sources/catalogs.md) data to list catalogs within Unity Catalog.
+* [databricks_tables](../data-sources/tables.md) data to list tables within Unity Catalog.
+* [databricks_schemas](../data-sources/schemas.md) data to list schemas within Unity Catalog.
+* [databricks_catalogs](../data-sources/catalogs.md) data to list catalogs within Unity Catalog.
```

docs/resources/schema.md

Lines changed: 3 additions & 3 deletions

```diff
@@ -59,6 +59,6 @@ terraform import databricks_schema.this <catalog_name>.<name>
 
 The following resources are used in the same context:
 
-* [databricks_table](../data-sources/tables.md) data to list tables within Unity Catalog.
-* [databricks_schema](../data-sources/schemas.md) data to list schemas within Unity Catalog.
-* [databricks_catalog](../data-sources/catalogs.md) data to list catalogs within Unity Catalog.
+* [databricks_tables](../data-sources/tables.md) data to list tables within Unity Catalog.
+* [databricks_schemas](../data-sources/schemas.md) data to list schemas within Unity Catalog.
+* [databricks_catalogs](../data-sources/catalogs.md) data to list catalogs within Unity Catalog.
```
