
Commit f5b36a2

Merge branch 'main' into 1.10.latest
2 parents: 2de7616 + 0f50254

File tree

13 files changed: +237 −671 lines

CHANGELOG.md

Lines changed: 7 additions & 1 deletion
@@ -1,4 +1,10 @@
-## dbt-databricks 1.10.6 (TBD)
+## dbt-databricks 1.10.7 (TBD)
+
+## dbt-databricks 1.10.6 (July 30, 2025)
+
+### Fixes
+
+- Fix bug introduced by the fix for https://github.com/databricks/dbt-databricks/issues/1083. `DESCRIBE TABLE EXTENDED .. AS JSON` is now only used for DBR versions 16.2 and above
 
 ## dbt-databricks 1.10.5 (July 25, 2025)
 

CONTRIBUTING.MD

Lines changed: 1 addition & 1 deletion
@@ -42,7 +42,7 @@ Once all tests pass a maintainer will rebase and merge your change to `main` so
 
 ## Developing this repository
 
-See [docs/local-dev.md](docs/local-dev.md).
+See [docs/dbt-databricks-dev.md](docs/dbt-databricks-dev.md).
 
 ## Code Style
 

README.md

Lines changed: 17 additions & 4 deletions
@@ -21,10 +21,11 @@ The `dbt-databricks` adapter contains all of the code enabling dbt to work with
 
 - **Easy setup**. No need to install an ODBC driver as the adapter uses pure Python APIs.
 - **Open by default**. For example, it uses the open and performant [Delta](https://delta.io/) table format by default. This has many benefits, including letting you use `MERGE` as the default incremental materialization strategy.
-- **Support for Unity Catalog**. dbt-databricks>=1.1.1 supports the 3-level namespace of Unity Catalog (catalog / schema / relations) so you can organize and secure your data the way you like.
+- **Support for Unity Catalog**. dbt-databricks supports the 3-level namespace of Unity Catalog (catalog / schema / relations) so you can organize and secure your data the way you like.
 - **Performance**. The adapter generates SQL expressions that are automatically accelerated by the native, vectorized [Photon](https://databricks.com/product/photon) execution engine.
 
 ## Choosing between dbt-databricks and dbt-spark
+
 If you are developing a dbt project on Databricks, we recommend using `dbt-databricks` for the reasons noted above.
 
 `dbt-spark` is an actively developed adapter which works with Databricks as well as Apache Spark anywhere it is hosted e.g. on AWS EMR.
@@ -34,11 +35,13 @@ If you are developing a dbt project on Databricks, we recommend using `dbt-datab
 ### Installation
 
 Install using pip:
+
 ```nofmt
 pip install dbt-databricks
 ```
 
 Upgrade to the latest version
+
 ```nofmt
 pip install --upgrade dbt-databricks
 ```
@@ -51,21 +54,29 @@ your_profile_name:
   outputs:
     dev:
       type: databricks
-      catalog: [optional catalog name, if you are using Unity Catalog, only available in dbt-databricks>=1.1.1]
+      catalog: [optional catalog name, if you are using Unity Catalog]
       schema: [database/schema name]
       host: [your.databrickshost.com]
       http_path: [/sql/your/http/path]
       token: [dapiXXXXXXXXXXXXXXXXXXXXXXX]
 ```
 
+### Documentation
+
+For comprehensive documentation on Databricks-specific features, configurations, and capabilities:
+
+- **[Databricks configurations](https://docs.getdbt.com/reference/resource-configs/databricks-configs)** - Complete reference for all Databricks-specific model configurations, materializations, and incremental strategies
+- **[Connect to Databricks](https://docs.getdbt.com/docs/core/connect-data-platform/databricks-setup)** - Setup and authentication guide
+
 ### Quick Starts
 
 The following quick starts will get you up and running with the `dbt-databricks` adapter:
-- [Developing your first dbt project](https://github.com/databricks/dbt-databricks/blob/main/docs/local-dev.md)
+
+- [Set up your dbt project with Databricks](https://docs.getdbt.com/guides/set-up-your-databricks-dbt-project)
 - Using dbt Cloud with Databricks ([Azure](https://docs.microsoft.com/en-us/azure/databricks/integrations/prep/dbt-cloud) | [AWS](https://docs.databricks.com/integrations/prep/dbt-cloud.html))
 - [Running dbt production jobs on Databricks Workflows](https://github.com/databricks/dbt-databricks/blob/main/docs/databricks-workflows.md)
 - [Using Unity Catalog with dbt-databricks](https://github.com/databricks/dbt-databricks/blob/main/docs/uc.md)
-- [Using GitHub Actions for dbt CI/CD on Databricks](https://github.com/databricks/dbt-databricks/blob/main/docs/github-actions.md)
+- [Continuous integration in dbt](https://docs.getdbt.com/docs/deploy/continuous-integration)
 - [Loading data from S3 into Delta using the databricks_copy_into macro](https://github.com/databricks/dbt-databricks/blob/main/docs/databricks-copy-into-macro-aws.md)
 - [Contribute to this repository](CONTRIBUTING.MD)

@@ -77,7 +88,9 @@ The `dbt-databricks` adapter has been tested:
 - against `Databricks SQL` and `Databricks runtime releases 9.1 LTS` and later.
 
 ### Tips and Tricks
+
 ## Choosing compute for a Python model
+
 You can override the compute used for a specific Python model by setting the `http_path` property in model configuration. This can be useful if, for example, you want to run a Python model on an All Purpose cluster, while running SQL models on a SQL Warehouse. Note that this capability is only available for Python models.
 
 ```
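The `http_path` override described above can be set from inside the model file via `dbt.config(...)`. A minimal sketch, assuming hypothetical names throughout (the model file, the cluster path, and the upstream model are placeholders, not values from this repository):

```python
# models/my_python_model.py -- hypothetical dbt Python model file
def model(dbt, session):
    # Route just this Python model to an All Purpose cluster; SQL models
    # keep using the warehouse configured in the profile. The path below
    # is a placeholder, not a real cluster endpoint.
    dbt.config(http_path="sql/protocolv1/o/1234567890123456/0000-000000-abcdefgh")
    return dbt.ref("upstream_model")
```

The same override can equally be set in the model's YAML config block; the in-file form is shown here only because Python models already carry a `dbt.config` call.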
Lines changed: 1 addition & 1 deletion
@@ -1 +1 @@
-version = "1.10.5"
+version = "1.10.6"

dbt/adapters/databricks/behaviors/columns.py

Lines changed: 6 additions & 6 deletions
@@ -12,7 +12,7 @@ class GetColumnsBehavior(ABC):
     @classmethod
     @abstractmethod
     def get_columns_in_relation(
-        cls, adapter: SQLAdapter, relation: DatabricksRelation
+        cls, adapter: SQLAdapter, relation: DatabricksRelation, use_legacy_logic: bool = False
     ) -> list[DatabricksColumn]:
         pass
 
@@ -31,9 +31,9 @@ def _get_columns_with_comments(
 class GetColumnsByDescribe(GetColumnsBehavior):
     @classmethod
     def get_columns_in_relation(
-        cls, adapter: SQLAdapter, relation: DatabricksRelation
+        cls, adapter: SQLAdapter, relation: DatabricksRelation, use_legacy_logic: bool = False
     ) -> list[DatabricksColumn]:
-        if relation.is_hive_metastore():
+        if use_legacy_logic:
             rows = cls._get_columns_with_comments(adapter, relation, "get_columns_comments")
             return cls._parse_columns(rows)
         else:
 
@@ -64,10 +64,10 @@ def _parse_columns(cls, rows: list[AttrDict]) -> list[DatabricksColumn]:
 class GetColumnsByInformationSchema(GetColumnsByDescribe):
     @classmethod
     def get_columns_in_relation(
-        cls, adapter: SQLAdapter, relation: DatabricksRelation
+        cls, adapter: SQLAdapter, relation: DatabricksRelation, use_legacy_logic: bool = False
     ) -> list[DatabricksColumn]:
-        if relation.is_hive_metastore() or not relation.is_delta:
-            return super().get_columns_in_relation(adapter, relation)
+        if use_legacy_logic or not relation.is_delta:
+            return super().get_columns_in_relation(adapter, relation, use_legacy_logic)
 
         rows = cls._get_columns_with_comments(
             adapter, relation, "get_columns_comments_via_information_schema"
dbt/adapters/databricks/impl.py

Lines changed: 3 additions & 1 deletion
@@ -480,7 +480,9 @@ def parse_describe_extended(  # type: ignore[override]
     def get_columns_in_relation(  # type: ignore[override]
         self, relation: DatabricksRelation
     ) -> list[DatabricksColumn]:
-        return self.get_column_behavior.get_columns_in_relation(self, relation)
+        # Use legacy macros for hive metastore or DBR versions older than 16.2
+        use_legacy_logic = relation.is_hive_metastore() or self.compare_dbr_version(16, 2) < 0
+        return self.get_column_behavior.get_columns_in_relation(self, relation, use_legacy_logic)
 
     def _get_updated_relation(
         self, relation: DatabricksRelation
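The gate added in `impl.py` can be sketched as a pure function. This is not the adapter's code: the real check calls `self.compare_dbr_version(16, 2) < 0` against the connected cluster, while here the runtime version is modeled as a plain tuple for illustration:

```python
# Minimal sketch of the fix: `DESCRIBE TABLE EXTENDED .. AS JSON` is only
# supported on DBR 16.2+, and never applies to Hive metastore relations.
def use_legacy_logic(is_hive_metastore: bool, dbr_version: tuple[int, int]) -> bool:
    return is_hive_metastore or dbr_version < (16, 2)


print(use_legacy_logic(False, (16, 2)))  # False: JSON DESCRIBE path is usable
print(use_legacy_logic(False, (15, 4)))  # True: runtime too old
print(use_legacy_logic(True, (17, 0)))   # True: Hive metastore always uses legacy macros
```

Python's tuple comparison gives the version check for free: `(15, 4) < (16, 2)` compares major versions first, then minor.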

docs/databricks-dbt-constraints-vs-model-contracts.md

Lines changed: 0 additions & 123 deletions
This file was deleted.

docs/databricks-merge.md

Lines changed: 0 additions & 107 deletions
This file was deleted.
