
[Bug] dbt-bigquery incremental strategy captures state prior to pre-hook execution #1564

@brent-schmidtbauer

Description


Is this a new bug?

  • I believe this is a new bug
  • I have searched the existing issues, and I could not find an existing issue for this bug

Which packages are affected?

  • dbt-bigquery

Current Behavior

When performing an incremental materialization in BigQuery, the strategy captures the state of the target relation before executing any defined pre-hooks. Because the state is captured first, the code path is pre-determined no matter what the pre-hooks do.

For example, if the target exists at the start of the model run, the incremental strategy performs a merge-type operation instead of a CTAS (assuming the merge strategy). That is the correct, expected behavior. However, if a pre-hook drops the target relation during its execution, the strategy still attempts a merge. In that case the behavior is no longer correct: the strategy should see that the target does not exist and perform a CTAS-type operation instead.
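The ordering problem can be illustrated with a minimal Python simulation (this is not dbt's actual code; the table name and function names are hypothetical). Because the existence check happens before the pre-hook runs, the pre-hook's drop cannot influence the chosen path:

```python
# Hypothetical simulation of the ordering bug (not dbt's actual macro code).
tables = {"my_model"}  # the target exists from a previous successful run

def run_pre_hooks():
    # equivalent of: pre_hook="drop table if exists {{ this }}"
    tables.discard("my_model")

# State is captured BEFORE the pre-hooks execute ...
existing_relation_found = "my_model" in tables
run_pre_hooks()

# ... so the strategy still chooses merge, even though the table is now gone.
strategy = "merge" if existing_relation_found else "create_table_as"
print(strategy)  # merge -> the generated MERGE then fails at runtime
```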

Expected Behavior

The incremental strategy should determine its course of action based on the state of the target after the pre-hooks have executed.

Steps To Reproduce

  1. Create a model configured like the one below:

```sql
{{
    config(
        materialized="incremental",
        incremental_strategy="merge",
        unique_key=["simulated_key"],
        pre_hook="drop table if exists {{ this }}"
    )
}}

select
  extract(minute from current_timestamp) as simulated_key,
  current_timestamp() as execute_dt
```

  2. Run the model. It will succeed by performing a CTAS-type operation.
  3. Run the model again. It will fail as it attempts to perform a merge on a table that no longer exists.

Relevant log output

The sql that ultimately fails on the second attempt from above will be something like:

```sql
merge into `bigquery-project`.`bigquery-dataset`.`bigquery-table` as DBT_INTERNAL_DEST
        using (

select
  extract(minute from current_timestamp) as simulated_key,
  current_timestamp() as execute_dt
        ) as DBT_INTERNAL_SOURCE
        on (
                    DBT_INTERNAL_SOURCE.simulated_key = DBT_INTERNAL_DEST.simulated_key
                )

    when matched then update set

    when not matched then insert
        ()
    values
        ()
```

Environment

- dbt-studio: 2026.1.20+e69724e
- <adapter>: bigquery=1.11.0-post25+b35814c52ef90054ab6fefffab85ed23d68f2bb5

Additional Context

The relevant source code for the BigQuery incremental strategy is linked below. Lines 78-80 (specifically line 79, which captures `existing_relation`) could be moved to after line 96, so that the state is not captured until the pre-hooks have finished running.

https://github.com/dbt-labs/dbt-adapters/blob/main/dbt-bigquery/src/dbt/include/bigquery/macros/materializations/incremental.sql#L78
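Continuing the simulation from earlier (again, hypothetical names, not dbt's actual macro code), the proposed reordering fixes the behavior: once the existence check runs after the pre-hooks, the drop is observed and the strategy falls back to a CTAS-type operation:

```python
# Hypothetical simulation of the proposed fix (not dbt's actual macro code).
tables = {"my_model"}  # the target exists from a previous successful run

def run_pre_hooks():
    # equivalent of: pre_hook="drop table if exists {{ this }}"
    tables.discard("my_model")

# Pre-hooks run FIRST, then the state is captured ...
run_pre_hooks()
existing_relation_found = "my_model" in tables

# ... so the dropped table is noticed and a CTAS path is chosen.
strategy = "merge" if existing_relation_found else "create_table_as"
print(strategy)  # create_table_as -> the run succeeds
```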

Labels

    triage:product (In Product's queue), type:bug (Something isn't working as documented)
