4 changes: 3 additions & 1 deletion NEXT_CHANGELOG.md
@@ -8,6 +8,7 @@

* Add `databricks_users` data source ([#4028](https://github.com/databricks/terraform-provider-databricks/pull/4028))
* Improve `databricks_service_principals` data source ([#5164](https://github.com/databricks/terraform-provider-databricks/pull/5164))
* Deprecate `databricks_metastore_data_access` resource in favor of using `databricks_storage_credential` with `storage_root_credential_id` on `databricks_metastore` ([#5239](https://github.com/databricks/terraform-provider-databricks/issues/5239)).

### Bug Fixes

@@ -18,7 +19,8 @@

* Document tag policies in `databricks_access_control_rule_set` ([#5209](https://github.com/databricks/terraform-provider-databricks/pull/5209)).
* Document missing `aws_attributes.ebs_*` properties in `databricks_cluster` ([#5196](https://github.com/databricks/terraform-provider-databricks/pull/5196)).
* Document support for serverless workspaces on GCP ([#5124](https://github.com/databricks/terraform-provider-databricks/pull/5124))
* Document support for serverless workspaces on GCP ([#5124](https://github.com/databricks/terraform-provider-databricks/pull/5124)).
* Fix Unity Catalog GCP guide by removing references to non-existent resources ([#2156](https://github.com/databricks/terraform-provider-databricks/issues/2156))

### Exporter

3 changes: 3 additions & 0 deletions catalog/resource_metastore_data_access.go
@@ -109,6 +109,9 @@ func ResourceMetastoreDataAccess() common.Resource {
return common.Resource{
Schema: dacSchema,
SchemaVersion: 1,
DeprecationMessage: "This resource is deprecated. Please use `databricks_storage_credential` " +
"and set it as `storage_root_credential_id` on the `databricks_metastore` resource instead. " +
"See https://docs.databricks.com/api-explorer/workspace/metastores/create for more details.",
StateUpgraders: []schema.StateUpgrader{
{
Version: 0,
16 changes: 2 additions & 14 deletions docs/guides/unity-catalog-gcp.md
@@ -101,18 +101,6 @@ resource "databricks_metastore" "this" {
force_destroy = true
}

resource "google_storage_bucket_iam_member" "unity_sa_admin" {
bucket = google_storage_bucket.unity_metastore.name
role = "roles/storage.objectAdmin"
member = "serviceAccount:${databricks_metastore_data_access.first.databricks_gcp_service_account[0].email}"
}

resource "google_storage_bucket_iam_member" "unity_sa_reader" {
bucket = google_storage_bucket.unity_metastore.name
role = "roles/storage.legacyBucketReader"
member = "serviceAccount:${databricks_metastore_data_access.first.databricks_gcp_service_account[0].email}"
}

resource "databricks_metastore_assignment" "this" {
provider = databricks.accounts
workspace_id = var.databricks_workspace_id
@@ -127,7 +115,7 @@ Unity Catalog introduces two new objects to access and work with external cloud
- [databricks_storage_credential](../resources/storage_credential.md) represent authentication methods to access cloud storage. Storage credentials are access-controlled to determine which users can use the credential.
- [databricks_external_location](../resources/external_location.md) are objects that combine a cloud storage path with a Storage Credential that can be used to access the location.

First, create the required objects in GCP, including granting permissions on the bucket to the Databricks-managed Service Account.
First, create the required objects in GCP, including granting permissions on the bucket to the Databricks-managed Service Account created by the [databricks_storage_credential](../resources/storage_credential.md).

```hcl
resource "google_storage_bucket" "ext_bucket" {
@@ -189,7 +177,7 @@ resource "google_storage_bucket_iam_member" "unity_cred_reader" {
}
```

Then create the [databricks_storage_credential](../resources/storage_credential.md) and [databricks_external_location](../resources/external_location.md) in Unity Catalog.
Then create the [databricks_external_location](../resources/external_location.md) in Unity Catalog.

```hcl
resource "databricks_grants" "external_creds" {
48 changes: 48 additions & 0 deletions docs/resources/metastore_data_access.md
@@ -3,10 +3,58 @@ subcategory: "Unity Catalog"
---
# databricks_metastore_data_access (Resource)

!> **DEPRECATED** This resource is deprecated. Please use [databricks_storage_credential](storage_credential.md) and set it as `storage_root_credential_id` on the [databricks_metastore](metastore.md) resource instead. See the [Unity Catalog API documentation](https://docs.databricks.com/api-explorer/workspace/metastores/create) for more details.

-> This resource can be used with an account or workspace-level provider.

Optionally, each [databricks_metastore](metastore.md) can have a default [databricks_storage_credential](storage_credential.md) defined as `databricks_metastore_data_access`. If defined, Unity Catalog uses it to access data in the metastore's root storage location.

## Migration to databricks_storage_credential

Instead of using `databricks_metastore_data_access`, you should create a [databricks_storage_credential](storage_credential.md) and reference it in your metastore configuration using the `storage_root_credential_id` attribute.

**Old approach (deprecated):**

```hcl
resource "databricks_metastore" "this" {
name = "primary"
storage_root = "s3://${aws_s3_bucket.metastore.id}/metastore"
owner = "uc admins"
region = "us-east-1"
force_destroy = true
}

resource "databricks_metastore_data_access" "this" {
metastore_id = databricks_metastore.this.id
name = aws_iam_role.metastore_data_access.name
aws_iam_role {
role_arn = aws_iam_role.metastore_data_access.arn
}
is_default = true
}
```

**New approach** (note that specifying `storage_root` on the metastore is no longer recommended):

```hcl
resource "databricks_storage_credential" "this" {
name = aws_iam_role.metastore_data_access.name
aws_iam_role {
role_arn = aws_iam_role.metastore_data_access.arn
}
comment = "Managed by TF"
}

resource "databricks_metastore" "this" {
name = "primary"
storage_root = "s3://${aws_s3_bucket.metastore.id}/metastore"
owner = "uc admins"
region = "us-east-1"
force_destroy = true
storage_root_credential_id = databricks_storage_credential.this.id
}
```
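The same migration applies to GCP metastores, where the deprecated resource exposed a Databricks-generated service account. A minimal sketch of the equivalent setup, assuming an illustrative `google_storage_bucket.unity_metastore` bucket (the resource and credential names here are hypothetical, not part of this PR):

```hcl
resource "databricks_storage_credential" "root" {
  name = "metastore-root-credential"
  # Databricks generates a GCP service account for this credential
  databricks_gcp_service_account {}
}

# Grant the generated service account access to the root bucket
resource "google_storage_bucket_iam_member" "unity_sa_admin" {
  bucket = google_storage_bucket.unity_metastore.name
  role   = "roles/storage.objectAdmin"
  member = "serviceAccount:${databricks_storage_credential.root.databricks_gcp_service_account[0].email}"
}

resource "databricks_metastore" "this" {
  name          = "primary"
  storage_root  = "gs://${google_storage_bucket.unity_metastore.name}/metastore"
  region        = "us-east1"
  force_destroy = true

  storage_root_credential_id = databricks_storage_credential.root.id
}
```

As on AWS, the credential is created first and then referenced via `storage_root_credential_id`, replacing the IAM bindings that previously pointed at the `databricks_metastore_data_access` service account.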

## Example Usage

For AWS