Common issues and solutions.
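Before working through individual symptoms, a quick environment sanity check can rule out the most common cause. A minimal bash sketch (the function name `check_dune_env` is illustrative, not part of the project):

```bash
# Sketch: report any required environment variables that are missing.
# Uses bash indirect expansion (${!var}); not POSIX sh.
check_dune_env() {
  local ok=0
  for var in DUNE_API_KEY DUNE_TEAM_NAME; do
    if [ -z "${!var}" ]; then
      echo "Missing required variable: $var" >&2
      ok=1
    fi
  done
  return "$ok"
}
```

Usage: `check_dune_env && uv run dbt debug` proceeds only when both variables are set.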
Symptom: Connection failures, schema errors
Solution:
```bash
# Verify environment variables are set
env | grep DUNE_API_KEY
env | grep DUNE_TEAM_NAME

# If not set, export them (bash/zsh)
export DUNE_API_KEY=your_api_key
export DUNE_TEAM_NAME=your_team_name

# Or for fish
set -x DUNE_API_KEY your_api_key
set -x DUNE_TEAM_NAME your_team_name

# If DEV_SCHEMA_SUFFIX is set and you want to disable it:
unset DEV_SCHEMA_SUFFIX
```
Symptom: Package import errors, command not found
Solution:
```bash
# Reinstall dependencies
uv sync --reinstall

# Verify installation
uv run dbt --version
```
Symptom: dbt debug fails, connection errors
Check:
```bash
# Run debug to see detailed error
uv run dbt debug

# Verify API key is set
env | grep DUNE_API_KEY

# Check profiles.yml is using correct env vars
grep -A 5 "password:" profiles.yml
```
Common causes:
- Missing or incorrect `DUNE_API_KEY`
- Environment variables not exported
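Beyond eyeballing the grep output, you can check that the profile actually pulls the key from the environment via dbt's `env_var()` function rather than hardcoding it. A sketch (the exact `env_var('DUNE_API_KEY')` spelling is an assumption about how profiles.yml is written):

```bash
# Sketch: warn if profiles.yml does not reference DUNE_API_KEY via env_var().
if grep -q "env_var('DUNE_API_KEY')" profiles.yml; then
  echo "profiles.yml reads DUNE_API_KEY from the environment"
else
  echo "Warning: DUNE_API_KEY may be hardcoded or missing in profiles.yml" >&2
fi
```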
Symptom: Certificate validation failures
Check: dbt_project.yml has:
```yaml
flags:
  require_certificate_validation: true
```
And profiles.yml has:
```yaml
cert: true
```
These settings are required and should not be changed.
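As a quick pre-commit guard, both settings can be checked from the shell. A sketch, assuming both files sit at the repo root:

```bash
# Sketch: confirm both required certificate settings are present.
if grep -q "require_certificate_validation: true" dbt_project.yml \
   && grep -q "cert: true" profiles.yml; then
  echo "Certificate validation is enforced"
else
  echo "Warning: certificate validation settings are missing" >&2
fi
```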
Symptom: dbt_utils macro errors
Solution:
```bash
uv run dbt deps
```
This installs the packages listed in packages.yml.
Symptom: ref('model_name') fails
Causes:
- Model hasn't been run yet; run it first:
  ```bash
  uv run dbt run --select model_name
  ```
- Typo in model name

Check:
```bash
# List all models
uv run dbt list

# Check specific model exists
uv run dbt list --select model_name
```
Symptom: Cannot create/drop tables
Check:
- Using the correct target? (`dev` vs `prod`)
- `DUNE_TEAM_NAME` matches your actual team name
- API key has correct permissions
Symptom: Model runs but data doesn't update
Causes:
- `is_incremental()` condition blocking all data
- `unique_key` doesn't match any rows
- NULL values in unique key columns
Debug:
```bash
# Force full refresh
uv run dbt run --select model_name --full-refresh

# Check compiled SQL
cat target/compiled/dbt_template/models/path/to/model.sql
```
Symptom: Full refresh fails when the schema changes, with errors such as:
- `DELTA_LAKE_BAD_WRITE` - "Failed to write Delta Lake transaction log entry"
- "Failed accessing transaction log for table: <table_name>"
- "TrinoException: Unsupported Trino column type"
Cause: This occurs when:
- You set `on_table_exists: replace` on a table model or as a project-wide config
- The model's schema changes (column types, new/removed columns)
- You trigger a full refresh
The replace strategy cannot handle schema changes because of data type mismatches between the existing table schema and the new schema in the Delta Lake transaction log.
Note: By default (when on_table_exists is not configured), dbt-trino uses a temp table strategy: create temp → rename existing to backup → rename temp to final → drop backup. This default strategy handles schema changes properly.
Solution:
You must manually drop the table before running the full refresh.
Option 1: Use the provided Python script
```bash
# Drop a single table
uv run python scripts/drop_tables.py --schema your_schema_name --table your_table_name

# Drop with target specification
uv run python scripts/drop_tables.py --schema your_schema_name --table your_table_name --target dev
```
Option 2: Use any Trino client. Connect to the Dune Trino API endpoint and run:
```sql
DROP TABLE IF EXISTS dune.your_schema_name.your_table_name;
```
Then run your full refresh:
```bash
uv run dbt run --select model_name --full-refresh
```
Prevention:
Use `on_table_exists: replace` only for specific use cases.
The default behavior (temp table strategy) is recommended because:
- It properly handles schema changes
- It avoids Delta Lake transaction log conflicts
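To audit where the replace strategy is already in use, a simple search over the project works. A sketch (the `models/` path assumes the standard dbt layout):

```bash
# Sketch: list every place that opts into on_table_exists, so schema
# changes on those models can be planned around.
grep -rn "on_table_exists" models/ dbt_project.yml 2>/dev/null \
  || echo "No on_table_exists overrides found"
```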
Symptom: Cannot query model created by dbt
Solution: You must use the `dune.` catalog prefix:
```sql
-- ❌ WRONG
select * from team__tmp_.my_model

-- ✅ CORRECT
select * from dune.team__tmp_.my_model
```
Symptom: Query fails with memory error
Solutions:
- Add date filters to limit data scanned
- Remove `ORDER BY` or add `LIMIT`
- Break into smaller CTEs
- Select only needed columns (no `SELECT *`)
Symptom: Duplicate rows in an incremental model
Causes:
- NULL values in `unique_key` columns
- Missing `unique_key` config
- `incremental_predicates` filtering out target rows

Solutions:
- Filter NULLs: `where key_column is not null`
- Add a `unique_key` config
- Remove or adjust `incremental_predicates`
Test:
```bash
uv run dbt test --select model_name
```
Duplicates should be caught by the `dbt_utils.unique_combination_of_columns` test.
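For reference, a duplicate check of this shape is declared in the model's schema file. A sketch with placeholder names (`model_name` and the key columns are illustrative; on dbt versions before 1.8 the `data_tests:` key is spelled `tests:`):

```yaml
models:
  - name: model_name
    data_tests:
      - dbt_utils.unique_combination_of_columns:
          combination_of_columns:
            - key_column_1
            - key_column_2
```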
Symptom: Incremental updates overwrite with wrong data
Check:
- `unique_key` correctly identifies rows
- No NULL values in key columns
- `incremental_strategy` appropriate for the use case
```bash
# See conflicted files
git status

# Resolve conflicts manually, then:
git add .
git commit

# Bring your branch up to date with main:
git fetch origin
git merge origin/main

# Resolve any conflicts, then:
git push
```
Check:
- Files changed are in trigger paths (see `.github/workflows/`)
- Branch is not in draft mode (for PR workflow)
- Click on failed workflow in Actions tab
- Expand failed step
- Read error message
- Common issues:
  - Missing secrets/variables
  - Test failures (fix tests, don't skip)
  - Connection issues (check API key)
- Check dbt logs: `logs/dbt.log`
- Run with the verbose flag: `uv run dbt run --select model_name --debug`
- Check compiled SQL: `target/compiled/dbt_template/models/path/to/model.sql`
- Query the result directly in the Dune app to verify data
- Getting Started - Initial setup
- Development Workflow - Development process
- Testing - Test requirements