`snippets/general-shared-text/databricks-delta-table-api-placeholders.mdx`
- `<server-hostname>` (_required_): The target Databricks cluster's or SQL warehouse's **Server Hostname** value.
- `<http-path>` (_required_): The cluster's or SQL warehouse's **HTTP Path** value.
- `<token>` (_required_ for PAT authentication): For Databricks personal access token (PAT) authentication, the target Databricks user's PAT value.
- `<client-id>` and `<client-secret>` (_required_ for OAuth authentication): For Databricks OAuth machine-to-machine (M2M) authentication, the Databricks-managed service principal's **UUID** (or **Client ID** or **Application ID**) and OAuth **Secret** (client secret) values.
- `<catalog>` (_required_): The name of the catalog in Unity Catalog for the target volume and table in the Databricks workspace.
- `<database>`: The name of the database in Unity Catalog for the target volume and table. The default is `default` if not otherwise specified.
- `<table-name>` (_required_): The name of the target table in Unity Catalog.
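As an illustration only (this helper and its names are not part of the Unstructured API), the placeholder rules above can be sketched as a small validator that enforces the required values, accepts exactly one authentication method, and applies the documented `default` database fallback:

```python
# Hypothetical helper, not part of any Databricks or Unstructured library.
# It only combines the documented placeholder rules into one dict.
def build_delta_table_params(
    server_hostname, http_path, catalog, table_name,
    token=None, client_id=None, client_secret=None, database=None,
):
    # The four unconditionally required placeholders.
    for name, value in [
        ("server-hostname", server_hostname),
        ("http-path", http_path),
        ("catalog", catalog),
        ("table-name", table_name),
    ]:
        if not value:
            raise ValueError(f"<{name}> is required")

    has_pat = token is not None
    has_m2m = client_id is not None and client_secret is not None
    if has_pat == has_m2m:  # neither supplied, or both supplied
        raise ValueError(
            "supply either <token> (PAT) or <client-id>/<client-secret> (OAuth M2M)"
        )

    return {
        "server_hostname": server_hostname,
        "http_path": http_path,
        "catalog": catalog,
        "database": database or "default",  # <database> is optional
        "table_name": table_name,
        **({"token": token} if has_pat
           else {"client_id": client_id, "client_secret": client_secret}),
    }
```

The hostname and token values used in any example are placeholders, not real workspace credentials.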
`snippets/general-shared-text/databricks-delta-table-cli-api.mdx`
The following environment variables:

- `DATABRICKS_HOST` - The Databricks cluster's or SQL warehouse's **Server Hostname** value, represented by `--server-hostname` (CLI) or `server_hostname` (Python).
- `DATABRICKS_HTTP_PATH` - The cluster's or SQL warehouse's **HTTP Path** value, represented by `--http-path` (CLI) or `http_path` (Python).
- `DATABRICKS_TOKEN` - For Databricks personal access token authentication, the token's value, represented by `--token` (CLI) or `token` (Python).
- `DATABRICKS_CLIENT_ID` - For Databricks-managed service principal authentication, the service principal's **UUID** (or **Client ID** or **Application ID**) value, represented by `--client-id` (CLI) or `client_id` (Python).
- `DATABRICKS_CLIENT_SECRET` - For Databricks-managed service principal authentication, the service principal's OAuth **Secret** value, represented by `--client-secret` (CLI) or `client_secret` (Python).
- `DATABRICKS_CATALOG` - The name of the catalog in Unity Catalog, represented by `--catalog` (CLI) or `catalog` (Python).
- `DATABRICKS_DATABASE` - The name of the schema (database) inside of the catalog, represented by `--database` (CLI) or `database` (Python). The default is `default` if not otherwise specified.
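A minimal sketch, assuming only the standard library, of how these environment variables might be collected into the Python keyword arguments named above (the function itself is hypothetical); note the `default` fallback for `DATABRICKS_DATABASE`:

```python
import os

# Hypothetical helper: map the documented environment variables onto the
# Python-side parameter names (server_hostname, http_path, and so on).
def params_from_env(env=os.environ):
    params = {
        "server_hostname": env["DATABRICKS_HOST"],
        "http_path": env["DATABRICKS_HTTP_PATH"],
        "catalog": env["DATABRICKS_CATALOG"],
        # DATABRICKS_DATABASE is optional; "default" is the documented default.
        "database": env.get("DATABRICKS_DATABASE", "default"),
    }
    if "DATABRICKS_TOKEN" in env:  # PAT authentication
        params["token"] = env["DATABRICKS_TOKEN"]
    else:  # OAuth M2M via a Databricks-managed service principal
        params["client_id"] = env["DATABRICKS_CLIENT_ID"]
        params["client_secret"] = env["DATABRICKS_CLIENT_SECRET"]
    return params
```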
`snippets/general-shared-text/databricks-delta-table-platform.mdx`
Fill in the following fields:

- **Server Hostname** (_required_): The target Databricks cluster's or SQL warehouse's **Server Hostname** value.
- **HTTP Path** (_required_): The cluster's or SQL warehouse's **HTTP Path** value.
- **Token** (_required_ for PAT authentication): For Databricks personal access token (PAT) authentication, the target Databricks user's PAT value.
- **UUID** and **OAuth Secret** (_required_ for OAuth authentication): For Databricks OAuth machine-to-machine (M2M) authentication, the Databricks-managed service principal's **UUID** (or **Client ID** or **Application ID**) and OAuth **Secret** (client secret) values.
- **Catalog** (_required_): The name of the catalog in Unity Catalog for the target volume and table in the Databricks workspace.
- **Database**: The name of the database in Unity Catalog for the target volume and table. The default is `default` if not otherwise specified.
- **Table Name** (_required_): The name of the target table in Unity Catalog.
The Databricks workspace user or Databricks-managed service principal must have the following _minimum_ set of permissions and privileges to write to an existing volume or table in Unity Catalog:

- To use an all-purpose cluster for access, `Can Restart` permission on that cluster. Learn how to check and set cluster permissions for
- `<name>` (_required_) - A unique name for this connector.
- `<host>` (_required_) - The Databricks workspace host URL.
- `<client-id>` (_required_) - The **Client ID** (or **UUID** or **Application ID**) value for the Databricks-managed service principal that has the appropriate privileges to the volume.
- `<client-secret>` (_required_) - The associated OAuth **Secret** value for the Databricks-managed service principal that has the appropriate privileges to the volume.
- `<catalog>` (_required_) - The name of the catalog to use.
- `<schema>` - The name of the associated schema. If not specified, `default` is used.
- `<volume>` (_required_) - The name of the associated volume.
- `<volume-path>` - Any optional path to access within the volume.

To learn how to create a Databricks-managed service principal, get its application ID, and generate an associated OAuth secret,
or [GCP](https://docs.gcp.databricks.com/volumes/utility-commands.html#change-permissions-on-a-volume).
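Unity Catalog volumes are addressed by paths of the form `/Volumes/<catalog>/<schema>/<volume>`. As a hypothetical helper (not part of any connector), this shows how the placeholders above combine, including the documented `default` schema fallback and the optional `<volume-path>`:

```python
# Illustrative only: build the /Volumes/... path from the placeholder values.
def volume_uri(catalog, volume, schema=None, volume_path=None):
    parts = ["/Volumes", catalog, schema or "default", volume]
    if volume_path:
        # Normalize stray slashes in the optional path within the volume.
        parts.append(volume_path.strip("/"))
    return "/".join(parts)
```

For example, catalog `main` and volume `landing` with no schema resolve to `/Volumes/main/default/landing` (the names here are made up).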
- **Client Secret** (_required_): The associated OAuth **Secret** value for the Databricks-managed service principal that has the appropriate privileges to the volume.
- **Client ID** (_required_): The **Client ID** (or **UUID** or **Application ID**) value for the Databricks-managed service principal that has the appropriate privileges to the volume.
`snippets/general-shared-text/databricks-volumes.mdx`
The preceding video shows how to use Databricks personal access tokens (PATs), which are supported only for [Unstructured Ingest](/ingestion/overview).
To learn how to use Databricks-managed service principals, which are supported by both the [Unstructured Platform](/platform/overview) and Unstructured Ingest,
see the additional video later on this page.
- The Databricks workspace URL. Get the workspace URL for
For the [Unstructured Platform](/platform/overview), only Databricks OAuth machine-to-machine (M2M) authentication is supported for AWS, Azure, and GCP.
You will need the **Client ID** (or **UUID** or **Application ID**) and OAuth **Secret** (client secret) values for the corresponding service principal.
Note that for Azure, only Databricks-managed service principals are supported. Microsoft Entra ID-managed service principals are not supported.
For [Unstructured Ingest](/ingestion/overview), the following Databricks authentication types are supported:
- For Google Cloud Platform credentials authentication (GCP only): The local path to the corresponding Google Cloud service account's credentials file.
- For Google Cloud Platform ID authentication (GCP only): The Google Cloud service account's email address.

[Azure](https://learn.microsoft.com/azure/databricks/schemas/create-schema), or
[GCP](https://docs.gcp.databricks.com/schemas/create-schema.html) for the volume.

- The name of the volume in Unity Catalog for [AWS](https://docs.databricks.com/tables/managed.html), [Azure](https://learn.microsoft.com/azure/databricks/tables/managed), or [GCP](https://docs.gcp.databricks.com/tables/managed.html), and optionally any path in that volume that you want to access directly, beginning with the volume's root.
- The Databricks workspace user or service principal must have the following _minimum_ set of privileges to read from or write to the existing volume in Unity Catalog:

  - `USE CATALOG` on the volume's parent catalog in Unity Catalog.
  - `USE SCHEMA` on the volume's parent schema in Unity Catalog.
  - `READ VOLUME` and `WRITE VOLUME` on the volume.
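As an illustration (all catalog, schema, volume, and principal names below are placeholders), the minimum privileges above correspond to Unity Catalog `GRANT` statements, which can be generated like this; the helper only formats the SQL and does not execute anything:

```python
# Illustrative only: format the Unity Catalog GRANT statements matching the
# minimum privileges listed above. Run the generated SQL in Databricks yourself.
def minimum_volume_grants(catalog, schema, volume, principal):
    target = f"`{principal}`"  # backtick-quote the user or service principal
    return [
        f"GRANT USE CATALOG ON CATALOG {catalog} TO {target};",
        f"GRANT USE SCHEMA ON SCHEMA {catalog}.{schema} TO {target};",
        f"GRANT READ VOLUME, WRITE VOLUME ON VOLUME {catalog}.{schema}.{volume} TO {target};",
    ]
```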
Learn how to check and set Unity Catalog privileges for