Commit bbe170c

docs: Update databricks-jdbc.mdx (#8757) — thanks, @morgan-at-cube!
Update export bucket instructions to use Unity Catalog. The previous recommendation (DBFS) is no longer supported by Databricks.
1 parent e39b3e5 commit bbe170c

File tree: 1 file changed (+7, -7 lines)

docs/pages/product/configuration/data-sources/databricks-jdbc.mdx

Lines changed: 7 additions & 7 deletions
@@ -59,7 +59,7 @@ docker run -it -p 4000:4000 --env-file=.env cube-jdk
 | `CUBEJS_DB_DATABRICKS_ACCEPT_POLICY` | Whether or not to accept the license terms for the Databricks JDBC driver | `true`, `false` ||
 | `CUBEJS_DB_DATABRICKS_TOKEN` | The [personal access token][databricks-docs-pat] used to authenticate the Databricks connection | A valid token ||
 | `CUBEJS_DB_DATABRICKS_CATALOG` | The name of the [Databricks catalog][databricks-catalog] to connect to | A valid catalog name ||
-| `CUBEJS_DB_EXPORT_BUCKET_MOUNT_DIR` | The path for the [Databricks DBFS mount][databricks-docs-dbfs] | A valid mount path ||
+| `CUBEJS_DB_EXPORT_BUCKET_MOUNT_DIR` | The path for the [Databricks DBFS mount][databricks-docs-dbfs] (Not needed if using Unity Catalog connection) | A valid mount path ||
 | `CUBEJS_CONCURRENCY` | The number of concurrent connections each queue has to the database. Default is `2` | A valid number ||
 | `CUBEJS_DB_MAX_POOL` | The maximum number of concurrent database connections to pool. Default is `8` | A valid number ||

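For context, the variables in the table rows above would typically sit together in a Cube `.env` file. A minimal sketch, with placeholder values that are not part of this commit:

```
# Databricks JDBC driver settings -- illustrative placeholders only
CUBEJS_DB_DATABRICKS_ACCEPT_POLICY=true
CUBEJS_DB_DATABRICKS_TOKEN=<PERSONAL_ACCESS_TOKEN>
CUBEJS_DB_DATABRICKS_CATALOG=<CATALOG_NAME>

# Only needed for legacy DBFS mounts; omit when using a Unity Catalog connection
# CUBEJS_DB_EXPORT_BUCKET_MOUNT_DIR=/mnt/export-bucket

# Optional tuning (defaults shown in the table)
CUBEJS_CONCURRENCY=2
CUBEJS_DB_MAX_POOL=8
```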
@@ -103,7 +103,7 @@ Storage][azure-bs] for export bucket functionality.
 #### AWS S3
 
 To use AWS S3 as an export bucket, first complete [the Databricks guide on
-mounting S3 buckets to Databricks DBFS][databricks-docs-dbfs-s3].
+connecting to cloud object storage using Unity Catalog][databricks-docs-uc-s3].
 
 <InfoBox>
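The S3 export bucket configuration this section leads into (one of its variables, `CUBEJS_DB_EXPORT_BUCKET_AWS_REGION`, appears in a later hunk header) might look roughly like this in `.env`; the values are placeholders and the exact variable set should be confirmed against Cube's documentation:

```
CUBEJS_DB_EXPORT_BUCKET_TYPE=s3
CUBEJS_DB_EXPORT_BUCKET=my-export-bucket
CUBEJS_DB_EXPORT_BUCKET_AWS_KEY=<AWS_ACCESS_KEY_ID>
CUBEJS_DB_EXPORT_BUCKET_AWS_SECRET=<AWS_SECRET_ACCESS_KEY>
CUBEJS_DB_EXPORT_BUCKET_AWS_REGION=<AWS_REGION>
```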

@@ -123,7 +123,7 @@ CUBEJS_DB_EXPORT_BUCKET_AWS_REGION=<AWS_REGION>
 #### Azure Blob Storage
 
 To use Azure Blob Storage as an export bucket, follow [the Databricks guide on
-mounting Azure Blob Storage to Databricks DBFS][databricks-docs-dbfs-azure].
+connecting to Azure Data Lake Storage Gen2 and Blob Storage][databricks-docs-azure].
 
 [Retrieve the storage account access key][azure-bs-docs-get-key] from your Azure
 account and use as follows:
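The context above ends with "use as follows:"; in the rendered docs this is followed by the Azure export bucket settings, which look roughly like this (placeholder values, not part of this diff):

```
CUBEJS_DB_EXPORT_BUCKET_TYPE=azure
CUBEJS_DB_EXPORT_BUCKET=wasbs://my-container@my-account.blob.core.windows.net
CUBEJS_DB_EXPORT_BUCKET_AZURE_KEY=<AZURE_STORAGE_ACCOUNT_ACCESS_KEY>
```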
@@ -152,10 +152,10 @@ bucket][self-preaggs-export-bucket] **must be** configured.
   https://docs.microsoft.com/en-us/azure/storage/common/storage-account-keys-manage?toc=%2Fazure%2Fstorage%2Fblobs%2Ftoc.json&tabs=azure-portal#view-account-access-keys
 [databricks]: https://databricks.com/
 [databricks-docs-dbfs]: https://docs.databricks.com/en/dbfs/mounts.html
-[databricks-docs-dbfs-azure]:
-  https://docs.databricks.com/data/data-sources/azure/azure-storage.html#mount-azure-blob-storage-containers-to-dbfs
-[databricks-docs-dbfs-s3]:
-  https://docs.databricks.com/data/data-sources/aws/amazon-s3.html#access-s3-buckets-through-dbfs
+[databricks-docs-azure]:
+  https://docs.databricks.com/data/data-sources/azure/azure-storage.html
+[databricks-docs-uc-s3]:
+  https://docs.databricks.com/en/connect/unity-catalog/index.html
 [databricks-docs-jdbc-url]:
   https://docs.databricks.com/integrations/bi/jdbc-odbc-bi.html#get-server-hostname-port-http-path-and-jdbc-url
 [databricks-docs-pat]:

0 commit comments

Comments
 (0)