
rows_affected string value from dbt-databricks causes INSERT failure on Databricks Runtime 17.3+ #2089

@asos-dipeshbhundia

Bug Description

When using dbt-databricks with the +view_update_via_alter: true configuration, views that have no schema changes return an adapter response with rows_affected as the string "-1". Elementary's upload_run_results macro inserts this string value directly into an inline VALUES clause, which fails on Databricks Runtime 17.3+ due to stricter type checking.

Environment

  • dbt version: 1.11.2
  • dbt-databricks version: 1.11.4
  • Databricks SQL Connector version: 4.1.3
  • Databricks Runtime: 17.3+ (stricter type checking introduced)
  • Elementary version: Latest (from dbt_packages/elementary)

Steps to Reproduce

  1. Configure a dbt view model with +view_update_via_alter: true
  2. Run the model twice (first run creates it, second run attempts to alter it)
  3. When there are no schema changes, dbt-databricks returns:
    {
      "_message": "skip `database`.`schema`.`view_name`",
      "code": "skip",
      "rows_affected": "-1"
    }
  4. Elementary's on-run-end hook attempts to insert run results
  5. INSERT fails with type incompatibility error
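For reference, a minimal model that exercises this code path might look like the following (the model name and SELECT are illustrative; the config key is the one named in this report):

```sql
-- models/example_view.sql  (hypothetical model name)
{{ config(
    materialized = 'view',
    view_update_via_alter = true
) }}

select 1 as id
```

Running this model twice with no schema change in between should produce the "skip" adapter response shown above.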

Error Message

[INVALID_INLINE_TABLE.INCOMPATIBLE_TYPES_IN_INLINE_TABLE] Invalid inline table. 
Found incompatible types in the column `col15` for inline table. 
SQLSTATE: 42000; line 9 pos 9

(Note: col15 corresponds to the rows_affected column in the inline VALUES table)

Example Generated SQL (Simplified)

Elementary generates an INSERT with inline VALUES like this:

insert into dbt_run_results
  (model_execution_id, unique_id, rows_affected, ...)
values
  ('exec-1', 'model.project.model1', 100, ...),        -- Integer: OK
  ('exec-2', 'model.project.model2', NULL, ...),       -- NULL: OK
  ('exec-3', 'model.project.model3', '-1', ...)        -- String '-1': FAILS on DBR 17.3+

The string '-1' cannot be implicitly converted to BIGINT in Databricks Runtime 17.3+, causing the entire INSERT to fail.

Root Cause

In macros/edr/dbt_artifacts/upload_run_results.sql, the flatten_run_result macro extracts rows_affected without type validation or conversion:

{% set flatten_run_result_dict = {
    ...
    'rows_affected': run_result_dict.get('adapter_response', {}).get('rows_affected'),
    ...
} %}

When dbt-databricks returns the string "-1" (instead of integer -1 or NULL), Elementary inserts it as-is. Prior to Databricks Runtime 17.3, implicit string-to-integer conversion was allowed. Starting with 17.3, this conversion is stricter, causing the INSERT to fail.
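One way to normalize the value in the macro itself, sketched here as an untested variant of the snippet above (Jinja's built-in `int` filter accepts a default, so `| int(none)` yields `none` whenever the value cannot be converted, including when it is missing):

```jinja
{# Hypothetical sketch: coerce rows_affected to an integer, falling back to NULL. #}
{# "-1" becomes -1; none or a non-numeric value becomes none (rendered as NULL).  #}
{% set flatten_run_result_dict = {
    ...
    'rows_affected': run_result_dict.get('adapter_response', {}).get('rows_affected') | int(none),
    ...
} %}
```

Coercing at this point means every adapter response is normalized before the VALUES clause is rendered, regardless of whether the adapter returned a string, an integer, or nothing.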

Suggested Fix

The rows_affected value should be cast before insertion so the column always receives a consistent type. The fix could either apply try_cast in the generated SQL, or convert the value to an integer (or NULL) in the Jinja macro before the VALUES clause is rendered. Normalizing in the macro is the more robust option: it handles adapters that return strings, integers, or NULL, so the fix stays portable across platforms while remaining compatible with the stricter type checking in Databricks Runtime 17.3+.
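The SQL-side option could be sketched as follows, reusing the table and column names from the simplified example above (try_cast is a Databricks SQL function that returns NULL instead of raising an error when the conversion fails):

```sql
-- Hypothetical fix: cast the adapter-provided value defensively so the
-- inline table column has a single, consistent type (BIGINT or NULL).
insert into dbt_run_results
  (model_execution_id, unique_id, rows_affected)
values
  ('exec-1', 'model.project.model1', try_cast('100' as bigint)),  -- -> 100
  ('exec-2', 'model.project.model2', try_cast(null as bigint)),   -- -> NULL
  ('exec-3', 'model.project.model3', try_cast('-1' as bigint))    -- -> -1, no error
```

The trade-off is that try_cast is not available on every platform Elementary supports, which is why the macro-side conversion may be preferable.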

Impact

This bug affects all dbt-databricks users on Runtime 17.3+ who:

  • Use Elementary for data observability
  • Have models configured with +view_update_via_alter: true
  • Run incremental builds where views are skipped due to no changes

The error prevents Elementary from uploading run results, breaking observability workflows.

Additional Context

  • Databricks Runtime 17.3 introduced stricter type checking for inline tables
  • See Databricks error documentation
  • The dbt-databricks adapter returns string "-1" intentionally to signal "no rows affected" for skipped operations
  • Other adapters may return integer -1 or NULL, so the fix should handle all cases gracefully
