CHANGELOG.md: 10 additions & 0 deletions

@@ -5,6 +5,16 @@
* Added `databricks_sql_global_config` resource to provide global configuration for SQL Endpoints ([#855](https://github.com/databrickslabs/terraform-provider-databricks/issues/855))
* Added `databricks_mount` resource to mount arbitrary cloud storage ([#497](https://github.com/databrickslabs/terraform-provider-databricks/issues/497))
* Improved implementation of `databricks_repo` by creating the parent folder structure ([#895](https://github.com/databrickslabs/terraform-provider-databricks/pull/895))
+* Fixed `databricks_job` error related [to randomized job IDs](https://docs.databricks.com/release-notes/product/2021/august.html#jobs-service-stability-and-scalability-improvements) ([#901](https://github.com/databrickslabs/terraform-provider-databricks/issues/901))
+* Replace `databricks_group` on name change ([#890](https://github.com/databrickslabs/terraform-provider-databricks/pull/890))
+* Names of scopes in `databricks_secret_scope` can have `/` characters in them ([#892](https://github.com/databrickslabs/terraform-provider-databricks/pull/892))
+
+**Deprecations**
+
+* `databricks_aws_s3_mount`, `databricks_azure_adls_gen1_mount`, `databricks_azure_adls_gen2_mount`, and `databricks_azure_blob_mount` are deprecated in favor of `databricks_mount`.
+
+Updated dependency versions:
+
+* Bump google.golang.org/api from 0.59.0 to 0.60.0
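A hedged sketch of what the mount deprecation means for existing configurations, using the generic URI form of `databricks_mount`; the bucket name and argument values are hypothetical placeholders, not values from this changelog:

```hcl
# Deprecated: cloud-specific mount resource
resource "databricks_aws_s3_mount" "old" {
  mount_name     = "demo"
  s3_bucket_name = "my-demo-bucket" # hypothetical bucket
}

# Preferred: generic databricks_mount with a storage URI
resource "databricks_mount" "new" {
  name = "demo"
  uri  = "s3a://my-demo-bucket"
}
```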
docs/resources/mount.md: 6 additions & 9 deletions

@@ -3,8 +3,6 @@ subcategory: "Storage"
---
# databricks_mount Resource

--> **Note** This resource has an evolving API, which may change in future versions of the provider.
-
This resource will mount your cloud storage account on `dbfs:/mnt/yourname`. Right now it supports mounting AWS S3, Azure (Blob Storage, ADLS Gen1 & Gen2), and Google Cloud Storage. It is important to understand that this will start up the [cluster](cluster.md) if the cluster is terminated. The terraform read and refresh commands require a cluster and may take some time to validate the mount. If `cluster_id` is not specified, it will create the smallest possible cluster, with a name equal to or starting with `terraform-mount`, for the shortest possible amount of time.
This resource provides two ways of mounting a storage account:
@@ -21,9 +19,9 @@ This resource provides two ways of mounting a storage account:
* `cluster_id` - (Optional, String) Cluster to use for mounting. If no cluster is specified, a new cluster will be created and will mount the bucket for all of the clusters in this workspace. If the cluster is not running, it will be started, so be sure to set auto-termination rules on it.
* `name` - (Optional, String) Name, under which the mount will be accessible in `dbfs:/mnt/<MOUNT_NAME>`. If not specified, the provider will try to infer it from the resource type:
-  * bucket name for AWS S3 and Google Cloud Storage
-  * container name for ADLS Gen2 and Azure Blob Storage
-  * storage resource name for ADLS Gen1
+  * `bucket_name` for AWS S3 and Google Cloud Storage
+  * `container_name` for ADLS Gen2 and Azure Blob Storage
+  * `storage_resource_name` for ADLS Gen1
* `uri` - (Optional, String) the URI for accessing specific storage (`s3a://....`, `abfss://....`, `gs://....`, etc.)
* `extra_configs` - (Optional, String map) configuration parameters that are necessary for mounting the specific storage
* `resource_id` - (Optional, String) resource ID for a given storage account. Could be used to fill defaults, such as storage account and container names, on Azure.
@@ -33,8 +31,8 @@ This resource provides two ways of mounting a storage account:
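The arguments above can be sketched as a minimal configuration; the bucket name and the `fs.s3a.canned.acl` setting are illustrative assumptions, not values taken from this diff:

```hcl
# Generic mount via storage URI; if `name` were omitted,
# it would be inferred from the bucket name per the list above
resource "databricks_mount" "this" {
  name = "demo"
  uri  = "s3a://my-demo-bucket" # hypothetical bucket
  extra_configs = {
    "fs.s3a.canned.acl" = "BucketOwnerFullControl"
  }
}
```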
docs/resources/sql_global_config.md: 3 additions & 3 deletions

@@ -5,7 +5,7 @@ subcategory: "Databricks SQL"
-> **Public Preview** This feature is in [Public Preview](https://docs.databricks.com/release-notes/release-types.html).
-This resource configures the security policy, instance profile (AWS only), and data access properties for all SQL endpoints of workspace. *Please note that changing parameters of this resources will restart all running SQL endpoints.* To use this resource you need to be an administrator.
+This resource configures the security policy, [databricks_instance_profile](instance_profile.md), and data access properties for all [databricks_sql_endpoint](sql_endpoint.md) of the workspace. *Please note that changing parameters of this resource will restart all running [databricks_sql_endpoint](sql_endpoint.md).* To use this resource you need to be an administrator.
The following arguments are supported (see [documentation](https://docs.databricks.com/sql/api/sql-endpoints.html#global-edit) for more details):
* `security_policy` (Optional, String) - The policy for controlling access to datasets. Default value: `DATA_ACCESS_CONTROL`; consult the documentation for the list of possible values.
-* `data_access_config` (Optional, Map) - data access configuration for SQL Endpoints, such as configuration for an external Hive metastore, Hadoop Filesystem configuration, etc. Please note that the list of supported configuration properties is limited, so refer to the [documentation](https://docs.databricks.com/sql/admin/data-access-configuration.html#supported-properties) for a full list. Apply will fail if you're specifying not permitted configuration.
-* `instance_profile_arn` (Optional, String) - Instance profile used to access storage from SQL endpoints. Please note that this parameter is only for AWS, and will generate an error if used on other clouds.
+* `data_access_config` (Optional, Map) - data access configuration for [databricks_sql_endpoint](sql_endpoint.md), such as configuration for an external Hive metastore, Hadoop Filesystem configuration, etc. Please note that the list of supported configuration properties is limited, so refer to the [documentation](https://docs.databricks.com/sql/admin/data-access-configuration.html#supported-properties) for a full list. Apply will fail if you specify a configuration that is not permitted.
+* `instance_profile_arn` (Optional, String) - [databricks_instance_profile](instance_profile.md) used to access storage from [databricks_sql_endpoint](sql_endpoint.md). Please note that this parameter is only for AWS, and will generate an error if used on other clouds.
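A hedged sketch of the arguments above used together; the ARN, account ID, and Hive metastore property value are hypothetical placeholders, not taken from this diff:

```hcl
resource "databricks_sql_global_config" "this" {
  security_policy = "DATA_ACCESS_CONTROL"

  # AWS only; hypothetical account ID and profile name
  instance_profile_arn = "arn:aws:iam::123456789012:instance-profile/sql-endpoints"

  data_access_config = {
    # one of the supported data-access properties; value is illustrative
    "spark.sql.hive.metastore.jars" = "maven"
  }
}
```

Because applying changes here restarts all running SQL endpoints, such edits are best rolled out during a maintenance window.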