Commit deffe04 (authored)

misc doc fixes (#2166)

* misc doc fixes
* one more fix

1 parent 9451a90 commit deffe04


11 files changed, +78 −59 lines changed

docs/data-sources/aws_crossaccount_policy.md
Lines changed: 1 addition & 1 deletion

````diff
@@ -9,7 +9,7 @@ This data source constructs necessary AWS cross-account policy for you, which is
 
 ## Example Usage
 
-For more detailed usage please see [databricks_aws_assume_role_policy](aws_assume_role_policy.md) or [databricks_aws_s3_mount](../resources/aws_s3_mount.md) pages.
+For more detailed usage please see [databricks_aws_assume_role_policy](aws_assume_role_policy.md) or [databricks_aws_s3_mount](../resources/mount.md) pages.
 
 ```hcl
 data "databricks_aws_crossaccount_policy" "this" {}
````

docs/data-sources/service_principal.md
Lines changed: 8 additions & 8 deletions

````diff
@@ -48,11 +48,11 @@ Data source exposes the following attributes:
 
 The following resources are used in the same context:
 
-* [End to end workspace management](../guides/passthrough-cluster-per-user.md) guide.
-* [databricks_current_user](current_user.md) data to retrieve information about [databricks_user](../resources/user.md) or [databricks_service_principal](../resources/service_principal.md), that is calling Databricks REST API.
-* [databricks_group](../resources/group.md) to manage [groups in Databricks Workspace](https://docs.databricks.com/administration-guide/users-groups/groups.html) or [Account Console](https://accounts.cloud.databricks.com/) (for AWS deployments).
-* [databricks_group](group.md) data to retrieve information about [databricks_group](../resources/group.md) members, entitlements and instance profiles.
-* [databricks_group_instance_profile](../resources/group_instance_profile.md) to attach [databricks_instance_profile](../resources/instance_profile.md) (AWS) to [databricks_group](../resources/group.md).
-* [databricks_group_member](../resources/group_member.md) to attach [users](../resources/user.md) and [groups](../resources/group.md) as group members.
-* [databricks_permissions](../resources/permissions.md) to manage [access control](https://docs.databricks.com/security/access-control/index.html) in Databricks workspace.
-* [databricks_service principal](../resources/service_principal.md) to manage [service principals](../resources/service_principal.md)
+- [End to end workspace management](../guides/passthrough-cluster-per-user.md) guide.
+- [databricks_current_user](current_user.md) data to retrieve information about [databricks_user](../resources/user.md) or [databricks_service_principal](../resources/service_principal.md), that is calling Databricks REST API.
+- [databricks_group](../resources/group.md) to manage [groups in Databricks Workspace](https://docs.databricks.com/administration-guide/users-groups/groups.html) or [Account Console](https://accounts.cloud.databricks.com/) (for AWS deployments).
+- [databricks_group](group.md) data to retrieve information about [databricks_group](../resources/group.md) members, entitlements and instance profiles.
+- [databricks_group_instance_profile](../resources/group_instance_profile.md) to attach [databricks_instance_profile](../resources/instance_profile.md) (AWS) to [databricks_group](../resources/group.md).
+- [databricks_group_member](../resources/group_member.md) to attach [users](../resources/user.md) and [groups](../resources/group.md) as group members.
+- [databricks_permissions](../resources/permissions.md) to manage [access control](https://docs.databricks.com/security/access-control/index.html) in Databricks workspace.
+- [databricks_service_principal](../resources/service_principal.md) to manage [service principals](../resources/service_principal.md)
````

docs/guides/aws-private-link-workspace.md
Lines changed: 30 additions & 28 deletions

````diff
@@ -127,6 +127,8 @@ The first step is to create the required AWS objects:
 - A subnet dedicated to your VPC endpoints.
 - A security group dedicated to your VPC endpoints and satisfying required inbound/outbound TCP/HTTPS traffic rules on ports 443 and 6666, respectively.
 
+For workspaces with the [compliance security profile](https://docs.databricks.com/security/privacy/security-profile.html#prepare-a-workspace-for-the-compliance-security-profile), you additionally need to allow bidirectional access to port 2443 for FIPS connections. The total set of ports to allow bidirectional access to is 443, 2443, and 6666.
+
 ```hcl
 data "aws_vpc" "prod" {
   id = var.vpc_id
@@ -176,36 +178,36 @@ resource "aws_security_group" "dataplane_vpce" {
   description = "Security group shared with relay and workspace endpoints"
   vpc_id      = var.vpc_id
 
-  ingress {
-    description = "Inbound rules"
-    from_port   = 443
-    to_port     = 443
-    protocol    = "tcp"
-    cidr_blocks = concat([var.vpce_subnet_cidr], local.vpc_cidr_blocks)
-  }
-
-  ingress {
-    description = "Inbound rules"
-    from_port   = 6666
-    to_port     = 6666
-    protocol    = "tcp"
-    cidr_blocks = concat([var.vpce_subnet_cidr], local.vpc_cidr_blocks)
-  }
-
-  egress {
-    description = "Outbound rules"
-    from_port   = 443
-    to_port     = 443
-    protocol    = "tcp"
-    cidr_blocks = concat([var.vpce_subnet_cidr], local.vpc_cidr_blocks)
+  dynamic "ingress" {
+    for_each = toset([
+      443,
+      2443, # FIPS port for CSP
+      6666,
+    ])
+
+    content {
+      description = "Inbound rules"
+      from_port   = ingress.value
+      to_port     = ingress.value
+      protocol    = "tcp"
+      cidr_blocks = concat([var.vpce_subnet_cidr], local.vpc_cidr_blocks)
+    }
   }
 
-  egress {
-    description = "Outbound rules"
-    from_port   = 6666
-    to_port     = 6666
-    protocol    = "tcp"
-    cidr_blocks = concat([var.vpce_subnet_cidr], local.vpc_cidr_blocks)
+  dynamic "egress" {
+    for_each = toset([
+      443,
+      2443, # FIPS port for CSP
+      6666,
+    ])
+
+    content {
+      description = "Outbound rules"
+      from_port   = egress.value
+      to_port     = egress.value
+      protocol    = "tcp"
+      cidr_blocks = concat([var.vpce_subnet_cidr], local.vpc_cidr_blocks)
+    }
   }
 
   tags = merge(data.aws_vpc.prod.tags, {
````
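As a note on the design choice in this hunk: the port list is repeated in the ingress and egress blocks, so it could also be factored into a single shared value. A minimal sketch under that assumption (the `pl_ports` local is hypothetical, not part of the guide):

```hcl
locals {
  # Ports needing bidirectional access: HTTPS (443), FIPS for the
  # compliance security profile (2443), and the SCC relay (6666).
  pl_ports = toset([443, 2443, 6666])
}

# Both dynamic blocks could then use:
#   for_each = local.pl_ports
```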

docs/guides/unity-catalog-azure.md
Lines changed: 4 additions & 4 deletions

````diff
@@ -168,11 +168,11 @@ resource "databricks_grants" "sandbox" {
   catalog = databricks_catalog.sandbox.name
   grant {
     principal  = "Data Scientists"
-    privileges = ["USAGE", "CREATE"]
+    privileges = ["USE_CATALOG", "CREATE"]
   }
   grant {
     principal  = "Data Engineers"
-    privileges = ["USAGE"]
+    privileges = ["USE_CATALOG"]
   }
 }
@@ -189,7 +189,7 @@ resource "databricks_grants" "things" {
   schema = databricks_schema.things.id
   grant {
     principal  = "Data Engineers"
-    privileges = ["USAGE"]
+    privileges = ["USE_SCHEMA"]
   }
 }
 ```
@@ -260,7 +260,7 @@ resource "databricks_grants" "external_creds" {
 
 resource "databricks_external_location" "some" {
   name = "external"
-  url = format("abfss://%s@%s.dfs.core.windows.net/",
+  url = format("abfss://%s@%s.dfs.core.windows.net",
     azurerm_storage_container.ext_storage.name,
     azurerm_storage_account.ext_storage.name)
````

docs/guides/unity-catalog-gcp.md
Lines changed: 3 additions & 3 deletions

````diff
@@ -146,11 +146,11 @@ resource "databricks_grants" "sandbox" {
   catalog = databricks_catalog.sandbox.name
   grant {
     principal  = "Data Scientists"
-    privileges = ["USAGE", "CREATE"]
+    privileges = ["USE_CATALOG", "CREATE"]
   }
   grant {
     principal  = "Data Engineers"
-    privileges = ["USAGE"]
+    privileges = ["USE_CATALOG"]
   }
 }
@@ -167,7 +167,7 @@ resource "databricks_grants" "things" {
   schema = databricks_schema.things.id
   grant {
     principal  = "Data Engineers"
-    privileges = ["USAGE"]
+    privileges = ["USE_SCHEMA"]
   }
 }
 ```
````

docs/guides/unity-catalog.md
Lines changed: 3 additions & 3 deletions

````diff
@@ -338,11 +338,11 @@ resource "databricks_grants" "sandbox" {
   catalog = databricks_catalog.sandbox.name
   grant {
     principal  = "Data Scientists"
-    privileges = ["USAGE", "CREATE"]
+    privileges = ["USE_CATALOG", "CREATE"]
   }
   grant {
     principal  = "Data Engineers"
-    privileges = ["USAGE"]
+    privileges = ["USE_CATALOG"]
   }
 }
@@ -361,7 +361,7 @@ resource "databricks_grants" "things" {
   schema = databricks_schema.things.id
   grant {
     principal  = "Data Engineers"
-    privileges = ["USAGE"]
+    privileges = ["USE_SCHEMA"]
   }
 }
 ```
````
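The three Unity Catalog guide diffs above all replace the old generic `USAGE` privilege: a principal now needs `USE_CATALOG` on the catalog and `USE_SCHEMA` on the schema to reach objects inside them. A minimal standalone sketch of the renamed privileges (the `sandbox` catalog and `sandbox.things` schema names are hypothetical, assumed to exist already):

```hcl
resource "databricks_grants" "catalog_example" {
  catalog = "sandbox" # hypothetical existing catalog
  grant {
    principal  = "Data Engineers"
    privileges = ["USE_CATALOG"] # formerly "USAGE"
  }
}

resource "databricks_grants" "schema_example" {
  schema = "sandbox.things" # hypothetical existing schema, full name form
  grant {
    principal  = "Data Engineers"
    privileges = ["USE_SCHEMA"] # formerly "USAGE"
  }
}
```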

docs/index.md
Lines changed: 8 additions & 2 deletions

````diff
@@ -7,7 +7,7 @@ description: Terraform provider for the Databricks Lakehouse platform
 
 # Databricks Provider
 
-Use the Databricks Terraform provider to interact with almost all of [Databricks](http://databricks.com/) resources. If you're new to Databricks, please follow guide to create a workspace on [Azure](guides/azure-workspace.md) or [AWS](guides/aws-workspace.md) and then this [workspace management](guides/workspace-management.md) tutorial. Changelog is available [on GitHub](https://github.com/databricks/terraform-provider-databricks/blob/master/CHANGELOG.md).
+Use the Databricks Terraform provider to interact with almost all [Databricks](http://databricks.com/) resources. If you're new to Databricks, please follow the guide to create a workspace on [Azure](guides/azure-workspace.md), [AWS](guides/aws-workspace.md) or [GCP](guides/gcp-workspace.md), and then this [workspace management](guides/workspace-management.md) tutorial. The changelog is available [on GitHub](https://github.com/databricks/terraform-provider-databricks/blob/master/CHANGELOG.md).
 
 ![Resources](https://github.com/databricks/terraform-provider-databricks/raw/master/docs/resources.png)
@@ -291,7 +291,13 @@ When a workspace is created using a service principal account, that service prin
 
 ## Special configurations for GCP
 
-The provider works with [Google Cloud CLI authentication](https://cloud.google.com/sdk/docs/authorizing) to facilitate local development workflows. For automated scenarios, a service principal auth is necessary using `google_service_account` parameter with [impersonation](https://cloud.google.com/docs/authentication#service-accounts) and Application Default Credentials. and specification of and `google_credentials` parameters). Alternatively, you could provide the service account key directly by passing it to `google_credentials` parameter (or `GOOGLE_CREDENTIALS` environment variable)
+The provider works with [Google Cloud CLI authentication](https://cloud.google.com/sdk/docs/authorizing) to facilitate local development workflows. For automated scenarios, service account auth is necessary, using the `google_service_account` parameter with [impersonation](https://cloud.google.com/docs/authentication#service-accounts) and Application Default Credentials. Alternatively, you can provide the service account key directly by passing it to the `google_credentials` parameter (or the `GOOGLE_CREDENTIALS` environment variable).
+
+## Special configuration for Unity Catalog
+
+Unity Catalog APIs are accessible via **workspace-level APIs**. This design may change in the future.
+
+If you are configuring a new Databricks account for the first time, please create at least one workspace with an identity (user or service principal) that you intend to use for the Unity Catalog rollout. You can then configure the provider using that identity and workspace to provision the required Unity Catalog resources.
 
 ## Miscellaneous configuration parameters
````
docs/resources/external_location.md
Lines changed: 1 addition & 1 deletion

````diff
@@ -55,7 +55,7 @@ resource "databricks_storage_credential" "external" {
 
 resource "databricks_external_location" "some" {
   name = "external"
-  url = format("abfss://%s@%s.dfs.core.windows.net/",
+  url = format("abfss://%s@%s.dfs.core.windows.net",
     azurerm_storage_container.ext_storage.name,
     azurerm_storage_account.ext_storage.name)
   credential_name = databricks_storage_credential.external.id
````
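For reference, with the trailing slash dropped, the `format()` call above renders a bare `abfss://` URL. A self-contained sketch with hypothetical container and storage-account names in place of the `azurerm` references:

```hcl
locals {
  container = "ext"       # hypothetical container name
  account   = "mystorage" # hypothetical storage account name

  # Renders "abfss://ext@mystorage.dfs.core.windows.net" (no trailing slash)
  ext_url = format("abfss://%s@%s.dfs.core.windows.net", local.container, local.account)
}
```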

docs/resources/instance_profile.md
Lines changed: 4 additions & 2 deletions

````diff
@@ -108,8 +108,10 @@ resource "databricks_group_instance_profile" "all" {
   instance_profile_id = databricks_instance_profile.this.id
 }
 ```
+
 ## Usage with Databricks SQL serverless
-When the instance profile ARN and its associated IAM role ARN don't match and the instance profile is intended for use with Databricks SQL serverless, the `iam_role_arn` parameter can be specified
+
+When the instance profile ARN and its associated IAM role ARN don't match and the instance profile is intended for use with Databricks SQL serverless, the `iam_role_arn` parameter can be specified.
 
 ```hcl
 data "aws_iam_policy_document" "sql_serverless_assume_role" {
@@ -166,5 +168,5 @@ In addition to all arguments above, the following attributes are exported:
 The resource instance profile can be imported using the ARN of it
 
 ```bash
-$ terraform import databricks_instance_profile.this <instance-profile-arn>
+terraform import databricks_instance_profile.this <instance-profile-arn>
 ```
````

docs/resources/mws_workspaces.md
Lines changed: 2 additions & 1 deletion

````diff
@@ -344,7 +344,8 @@ On AWS, the following arguments could be modified after the workspace is running
 
 In addition to all arguments above, the following attributes are exported:
 
-* `id` - Canonical unique identifier for the workspace.
+* `id` - (String) Canonical unique identifier for the workspace, in the format `<account-id>/<workspace-id>`
+* `workspace_id` - (String) workspace ID
 * `workspace_status_message` - (String) updates on workspace status
 * `workspace_status` - (String) workspace status
 * `creation_time` - (Integer) time when workspace was created
````
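The hunk above documents that `id` is a composite `<account-id>/<workspace-id>` string while the new `workspace_id` attribute exposes the workspace ID directly. A minimal sketch of reading both (the resource name `this` is hypothetical):

```hcl
output "workspace_canonical_id" {
  # Composite identifier, "<account-id>/<workspace-id>"
  value = databricks_mws_workspaces.this.id
}

output "workspace_numeric_id" {
  # Just the workspace ID, without the account prefix
  value = databricks_mws_workspaces.this.workspace_id
}
```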
