List schemas not identifying existing schemas #1260

@tkosciuszek

Description

Describe the bug

When upgrading dbt-databricks from 1.10.10 to 1.10.11, dbt attempts to recreate schemas that already exist in our Databricks dev catalog. Our configuration uses separate make commands for create_schema to generate schemas in dev, and a CI/CD pipeline that creates schemas in prod when deployments are pushed to main, so we have revoked dbt's access to create schemas.

Steps To Reproduce

Because of the above, we have overridden the default spark__create_schema macro with the following, which raises a compiler error whenever dbt tries to create a schema and prompts users to follow our standard procedures for creating it.

{% macro spark__create_schema(relation) -%}
  {{ exceptions.raise_compiler_error("Trying to create a schema: " ~ relation ~
    ". Schemas creation and access should be managed in dbt_project.yml.") }}
{%- endmacro %}

When running dbt run <example_dev_model> after upgrading the dbt-databricks adapter to 1.10.11, this breaks our project: the schemas already exist, but dbt attempts to recreate them anyway.

I have reason to believe this stems from PR #1168: when I roll back only the changes from that PR, leaving the rest of 1.10.11 intact, my project detects the existing schemas, does not attempt to recreate them, and the run succeeds.
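
A simplified, hypothetical sketch of how such a check can go wrong (this is not the adapter's actual code, and the helper below is invented for illustration): if the schema names returned by the catalog listing come back quoted or differently cased, a naive exact-match diff treats an existing schema as missing and triggers create_schema.

```python
def schemas_to_create(required, existing):
    # Hypothetical, robust version of the diff: normalise backtick
    # quoting and case before deciding which schemas are missing.
    normalised = {s.strip("`").lower() for s in existing}
    return [s for s in required if s.strip("`").lower() not in normalised]


required = ["fastned__entities"]
# Suppose the listing returns the name backtick-quoted, as seen in the
# error output ("dev.`fastned__entities`").
existing = ["`fastned__entities`"]

# A naive exact-match diff flags the existing schema as missing:
naive_missing = [s for s in required if s not in existing]

# Normalising before comparing shows nothing needs creating:
robust_missing = schemas_to_create(required, existing)
```

If the regression in 1.10.11 is of this shape, any project whose catalog listing and configured schema names differ only in quoting or case would see spurious create_schema calls.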

To reproduce:

  1. Install relevant dbt-core and dbt-databricks versions
  2. Create spark__create_schema override macro in adapters.sql in the macros directory
  3. Identify a model with a schema that already exists (in our case, a development catalog/schema)
  4. Attempt to rebuild the model

Expected behavior

The model should run without dbt attempting to re-create a schema that already exists, and therefore without hitting our override exception, since the dbt CLI does not have rights to create schemas in our project.
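
The expected guard can be sketched as follows (a minimal model of the flow, not dbt's actual implementation; names are illustrative):

```python
def ensure_schema(schema, existing_schemas, create_schema):
    # Hypothetical guard: only invoke the (privileged) create path
    # when the schema is genuinely absent from the catalog listing.
    if schema not in existing_schemas:
        create_schema(schema)  # in our project this path would raise
        return True  # schema was created
    return False  # schema already present; no create attempted


created = []
# Schema already present: the create callback is never invoked,
# so a project with create rights revoked runs cleanly.
result = ensure_schema("fastned__entities", {"fastned__entities"}, created.append)
```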

Screenshots and log output

08:19:33  Running with dbt=1.10.4
WARNING:thrift.transport.sslcompat:using legacy validation callback
/Users/me/dev/fn/lakehouse/.venv/lib/python3.12/site-packages/pydantic/_internal/_config.py:373: UserWarning: Valid config keys have changed in V2:
* 'allow_population_by_field_name' has been renamed to 'validate_by_name'
  warnings.warn(message, UserWarning)
08:19:34  Registered adapter: databricks=1.10.11
08:19:35  Found 627 models, 772 data tests, 415 sources, 2091 macros, 6 unit tests
08:19:35  
08:19:35  Concurrency: 8 threads (target='development')
08:19:35  
08:19:36  
08:19:36  Finished running  in 0 hours 0 minutes and 0.49 seconds (0.49s).
08:19:36  Encountered an error:
Database Error
  Compilation Error in macro create_schema (macros/adapters/schema.sql)
    Trying to create a schema: dev.`fastned__entities`. Schemas creation and access should be managed in dbt_project.yml.
    
    > in macro spark__create_schema (Z_Foundations/dbt_macros/adapters.sql)
    > called by macro create_schema (macros/adapters/schema.sql)
    > called by macro create_schema (macros/adapters/schema.sql)

System information

The output of dbt --version:

Core:
  - installed: 1.10.4 
  - latest:    1.10.15 - Update available!

Plugins:
  - databricks: 1.10.11 - Update available!
  - spark:      1.9.3   - Up to date!

The operating system you're using:
MacOS Tahoe 26.0.1 (25A362)

The output of python --version:
Python 3.12.11

Metadata

Labels: bug (Something isn't working)