feat: Complete hard_deletes='new_record' implementation for snapshots
Fixes #1176
Implements full support for the hard_deletes configuration in snapshot materializations,
enabling users to track deleted source records with dedicated deletion records marked
by dbt_is_deleted=True.
**Problem**
The dbt-core snapshot_staging_table macro generates a deletion_records CTE that
relies on get_column_schema_from_query() for the source columns, which returns
column schema objects with a .name attribute. However, when building the list of
snapshotted_cols from the target table, it used get_columns_in_relation(), which
on Databricks returns agate.Row tuples like ('col_name', 'data_type', 'comment').
The deletion_records CTE then tried to read a .name attribute from these tuples
(via get_list_of_column_names()), which does not exist on tuples. This caused
the column-matching logic to fail silently, so deletion records were never
constructed with the correct columns from the snapshotted table.
This resulted in deletion records being inserted with NULL values for all source
columns (id, name, etc.) instead of the actual values from the deleted records,
causing malformed output as reported in issue #1176.
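The mismatch can be reproduced in plain Python (a minimal sketch using stand-in classes, not dbt's actual types):

```python
# Stand-in for the schema objects returned by get_column_schema_from_query():
class ColumnSchema:
    def __init__(self, name: str):
        self.name = name

source_cols = [ColumnSchema("id"), ColumnSchema("name")]

# Stand-in for the agate.Row tuples returned by get_columns_in_relation()
# on Databricks: ('col_name', 'data_type', 'comment')
target_rows = [("id", "bigint", None), ("name", "string", None)]

# Schema objects expose .name ...
print([c.name for c in source_cols])  # ['id', 'name']

# ... but plain tuples do not, so the same access pattern blows up:
try:
    print([r.name for r in target_rows])
except AttributeError as exc:
    print(f"AttributeError: {exc}")
```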
**Solution**
Created a databricks__snapshot_staging_table override that extracts column
names from agate.Row tuples by accessing index [0] instead of the .name attribute.
This ensures the deletion_records CTE receives correct column lists for both the
source and target tables, allowing proper column matching when inserting deletion
records.
Additionally, overrode databricks__build_snapshot_table to include dbt_is_deleted
column in initial snapshot table creation when hard_deletes='new_record', ensuring
the column exists from the start and doesn't need to be added later.
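As a rough illustration (hypothetical Python, not the Jinja macro itself), the initial build boils down to appending a literal flag column when the mode calls for it:

```python
def build_snapshot_select(source_sql: str, hard_deletes: str = "ignore") -> str:
    """Sketch of the initial snapshot SELECT: when hard_deletes='new_record',
    a literal dbt_is_deleted column is added up front so later merges never
    have to evolve the table schema. Illustrative only."""
    cols = ["*"]
    if hard_deletes == "new_record":
        cols.append("'False' as dbt_is_deleted")
    return f"select {', '.join(cols)} from ({source_sql}) as sbq"

print(build_snapshot_select("select * from src", "new_record"))
```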
**New file: dbt/include/databricks/macros/materializations/snapshot_helpers.sql**
- databricks__build_snapshot_table: Adds dbt_is_deleted column for new_record mode
- databricks__snapshot_staging_table: Complete override to fix column name extraction
- Properly extracts column names from agate.Row tuples using index [0]
- Filters out Databricks metadata rows (starting with '#')
- Generates correct deletion_records CTE with proper column matching
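The core of the column-name fix can be sketched in Python (hypothetical helper name; the real logic lives in the Jinja macro):

```python
def extract_column_names(rows):
    """Pull column names out of ('col_name', 'data_type', 'comment') tuples,
    using index [0] rather than a .name attribute, and skipping Databricks
    DESCRIBE metadata rows whose first field is blank or starts with '#'.
    Sketch of the macro's logic, not the macro itself."""
    names = []
    for row in rows:
        name = row[0]
        if not name or name.startswith("#"):
            continue
        names.append(name)
    return names

rows = [
    ("id", "bigint", None),
    ("name", "string", None),
    ("", "", ""),
    ("# Detailed Table Information", "", ""),
]
print(extract_column_names(rows))  # ['id', 'name']
```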
**New file: dbt/include/databricks/macros/materializations/snapshot_merge.sql**
- databricks__snapshot_merge_sql: Implements hard_deletes-aware MERGE logic
- Supports 'invalidate' mode with WHEN NOT MATCHED BY SOURCE clause
- Uses 'insert *' pattern to include all staging table columns including dbt_is_deleted
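The shape of the generated MERGE can be sketched as follows (hypothetical Python builder; the clause details are assumptions, and the real statement is produced by the Jinja macro):

```python
def snapshot_merge_sql(target: str, source: str, hard_deletes: str = "ignore") -> str:
    """Sketch of the hard_deletes-aware MERGE shape (illustrative only)."""
    lines = [
        f"merge into {target} as t",
        f"using {source} as s",
        "on t.dbt_scd_id = s.dbt_scd_id",
        "when matched and t.dbt_valid_to is null",
        "  and s.dbt_change_type in ('update', 'delete')",
        "  then update set t.dbt_valid_to = s.dbt_valid_to",
        "when not matched and s.dbt_change_type = 'insert'",
        # 'insert *' carries every staging column, including dbt_is_deleted
        "  then insert *",
    ]
    if hard_deletes == "invalidate":
        lines += [
            "when not matched by source and t.dbt_valid_to is null",
            "  then update set t.dbt_valid_to = current_timestamp()",
        ]
    return "\n".join(lines)

print(snapshot_merge_sql("snap", "staging", "invalidate"))
```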
**New file: tests/functional/adapter/simple_snapshot/test_hard_deletes.py**
- Comprehensive functional tests for all three hard_deletes modes
- TestHardDeleteIgnore: Verifies deleted records remain unchanged (default)
- TestHardDeleteInvalidate: Verifies dbt_valid_to is set for deleted records
- TestHardDeleteNewRecord: Verifies new deletion records with dbt_is_deleted=True
**Behavior**
**hard_deletes='ignore'** (default)
- Deleted records remain unchanged in snapshot
- dbt_valid_to stays NULL for records no longer in source
- Maintains backward compatibility
**hard_deletes='invalidate'**
- Deleted records are invalidated by setting dbt_valid_to timestamp
- Uses Delta Lake's WHEN NOT MATCHED BY SOURCE clause
- Original records marked as no longer valid when removed from source
**hard_deletes='new_record'**
- Original records are invalidated (dbt_valid_to set)
- New deletion records inserted with dbt_is_deleted=True and actual source column values
- Provides complete audit trail of deletions
- Resolves malformed output issue where deletion records had NULL values
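For example, after a source row with id=1 is deleted, the snapshot would contain something like the following (hypothetical values, column subset, abbreviated timestamps):

```python
# Hypothetical snapshot contents after source row id=1 is deleted
# under hard_deletes='new_record':
snapshot = [
    # original record, now closed out
    {"id": 1, "name": "Alice", "dbt_valid_to": "2024-06-02", "dbt_is_deleted": "False"},
    # new deletion record: source columns keep their last known values
    # (the NULL-values bug this change fixes), flagged as deleted
    {"id": 1, "name": "Alice", "dbt_valid_to": None, "dbt_is_deleted": "True"},
]

deletions = [r for r in snapshot if r["dbt_is_deleted"] == "True"]
print(deletions)
```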
**Testing**
- All 3 functional tests passing (ignore, invalidate, new_record)
- Code quality checks passing (ruff, ruff-format, mypy)
- No regressions in existing snapshot functionality
- Verified with Databricks Delta Lake MERGE operations
- Tested against Unity Catalog cluster
**Files changed**
- dbt/include/databricks/macros/materializations/snapshot_helpers.sql (new, 221 lines)
- dbt/include/databricks/macros/materializations/snapshot_merge.sql (new, 32 lines)
- tests/functional/adapter/simple_snapshot/test_hard_deletes.py (new, 298 lines)
- .gitignore (added docs/plans/ exclusion)
Signed-off-by: Randy Pitcher <[email protected]>
Co-Authored-By: Claude <[email protected]>