* `bucket` - (Required) AWS S3 bucket name for which to generate the policy document.
* `full_access_role` - (Optional) Data access role that can have full access for this bucket.
* `databricks_e2_account_id` - (Optional) Your Databricks E2 account ID. Used to generate restrictive IAM policies that will increase the security of your root bucket.
- [Host and Token outputs](#provider-configuration)

> Initialize the provider with `alias = "mws"` and use `provider = databricks.mws` for all `databricks_mws_*` resources. We require all `databricks_mws_*` resources to be created within their own dedicated Terraform module of your environment. Usually this module also creates the VPC and IAM roles.
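A minimal sketch of such an aliased, account-level provider block; the variable names for the account-owner credentials are hypothetical and only illustrate the shape:

```hcl
provider "databricks" {
  alias    = "mws"
  host     = "https://accounts.cloud.databricks.com"
  username = var.databricks_account_username # hypothetical variable name
  password = var.databricks_account_password # hypothetical variable name
}
```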
We assume that your project has a Terraform module that creates a workspace (using the [Databricks E2 Workspace](#databricks-e2-workspace) section), that you named it `e2` when calling it in the **main.tf** file of your Terraform project, and that `workspace_url` and `token_value` are output attributes of that module. This provider configuration allows you to use the token generated during workspace creation to authenticate to the new workspace.
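Under these assumptions, the provider configuration is a short sketch (the alias name is illustrative):

```hcl
provider "databricks" {
  alias = "created_workspace"
  host  = module.e2.workspace_url # output attribute of the workspace module
  token = module.e2.token_value   # token generated during workspace creation
}
```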
### Credentials validation checks errors
Due to a bug in the Terraform AWS provider (spotted in v3.28), creating the Databricks AWS cross-account policy and attaching it to the IAM role takes longer than it takes AWS to confirm the request to Terraform. Terraform then continues creating the workspace, and the credentials validation checks fail because the policy has not been applied quickly enough, showing the error:
docs/guides/unity-catalog.md
The first step is to create the required AWS objects:
- An S3 bucket, which is the default storage location for managed tables in Unity Catalog. Please use a dedicated bucket for each metastore.
- An IAM policy that provides Unity Catalog permissions to access and manage data in the bucket. Note that `<KMS_KEY>` is *optional*. If encryption is enabled, provide the name of the KMS key that encrypts the S3 bucket contents. *If encryption is disabled, remove the entire KMS section of the IAM policy.*
- An IAM role that is associated with the IAM policy and will be assumed by Unity Catalog.
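The three objects above might be sketched in Terraform as follows. All names are hypothetical, and the two `data.aws_iam_policy_document` references stand in for the permission and trust policies described above:

```hcl
resource "aws_s3_bucket" "metastore" {
  bucket = "mycompany-uc-metastore" # hypothetical name; use a dedicated bucket per metastore
}

resource "aws_iam_policy" "unity_catalog" {
  name   = "unity-catalog-access" # hypothetical name
  policy = data.aws_iam_policy_document.unity_catalog.json # the IAM policy described above
}

resource "aws_iam_role" "unity_catalog" {
  name               = "unity-catalog-role" # hypothetical name
  assume_role_policy = data.aws_iam_policy_document.passrole_for_uc.json # trust policy letting Unity Catalog assume the role
}

resource "aws_iam_role_policy_attachment" "unity_catalog" {
  role       = aws_iam_role.unity_catalog.name
  policy_arn = aws_iam_policy.unity_catalog.arn
}
```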
-> **Public Preview** This feature is in [Public Preview](https://docs.databricks.com/release-notes/release-types.html) on GCP.

```hcl
variable "databricks_account_id" {
  description = "Account Id that could be found in the bottom left corner of https://accounts.cloud.databricks.com/"
}
```
The following resources are used in the same context:
* [Provisioning Databricks on GCP](../guides/gcp-workspace.md) guide.
* [Provisioning Databricks workspaces on GCP with Private Service Connect](../guides/gcp-private-service-connect-workspace.md) guide.
* [databricks_mws_vpc_endpoint](mws_vpc_endpoint.md) to register [aws_vpc_endpoint](https://registry.terraform.io/providers/hashicorp/aws/latest/docs/resources/vpc_endpoint) resources with Databricks such that they can be used as part of a [databricks_mws_networks](mws_networks.md) configuration.
* [databricks_mws_private_access_settings](mws_private_access_settings.md) to create a Private Access Setting that can be used as part of a [databricks_mws_workspaces](mws_workspaces.md) resource to create a [Databricks Workspace that leverages AWS PrivateLink](https://docs.databricks.com/administration-guide/cloud-configurations/aws/privatelink.html) or [GCP Private Service Connect](https://docs.gcp.databricks.com/administration-guide/cloud-configurations/gcp/private-service-connect.html).
* [databricks_mws_workspaces](mws_workspaces.md) to set up [workspaces in E2 architecture on AWS](https://docs.databricks.com/getting-started/overview.html#e2-architecture-1).
docs/resources/volume.md
---
subcategory: "Unity Catalog"
---
# databricks_volume (Resource)
-> **Public Preview** This feature is in [Public Preview](https://docs.databricks.com/release-notes/release-types.html).
Volumes are Unity Catalog objects representing a logical volume of storage in a cloud object storage location. Volumes provide capabilities for accessing, storing, governing, and organizing files. While tables provide governance over tabular datasets, volumes add governance over non-tabular datasets. You can use volumes to store and access files in any format, including structured, semi-structured, and unstructured data.
A volume resides in the third layer of Unity Catalog’s three-level namespace. Volumes are siblings to tables, views, and other objects organized under a schema in Unity Catalog.
A **managed volume** is a Unity Catalog-governed storage volume created within the default storage location of the containing schema.
An **external volume** is a Unity Catalog-governed storage volume registered against a directory within an external location.
A volume can be referenced using its identifier: ```<catalogName>.<schemaName>.<volumeName>```, where:
* ```<catalogName>```: The name of the catalog containing the Volume.
* ```<schemaName>```: The name of the schema containing the Volume.
* ```<volumeName>```: The name of the Volume. It identifies the volume object.
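As a sketch, a managed volume created with this resource (all names here are hypothetical) would then be referenced as `main.default.my_volume`:

```hcl
resource "databricks_volume" "this" {
  name         = "my_volume" # hypothetical volume name
  catalog_name = "main"      # hypothetical catalog
  schema_name  = "default"   # hypothetical schema
  volume_type  = "MANAGED"   # managed volume; external volumes also set storage_location
  comment      = "Example managed volume"
}
```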