Hi team, I'm Haley Won (Sr. Scale Solutions Engineer in Databricks).
Currently, the dbt_sql_job.yml in the Databricks Asset Bundle example uses a new_cluster configuration for the dbt task.
However, with the increasing adoption of serverless and the growing need among users to specify warehouse_id directly for dbt_task, it would be helpful to update the example to reflect this more modern and flexible approach.
Through internal testing, we've verified that the following configuration works correctly and aligns with dbt best practices, especially when using warehouse_id:

- Use catalog and schema as part of the dbt_task parameters.
- Avoid passing --target in dbt commands when catalog and schema are already provided at the task level.
```yaml
tasks:
  - task_key: dbt
    environment_key: Default
    dbt_task:
      catalog: haley_source_ws
      schema: default
      project_directory: ../
      warehouse_id: 61b2ae8974016fcd
      commands:
        - 'dbt deps'
        - 'dbt seed'
        - 'dbt run'

environments:
  - environment_key: Default
    spec:
      environment_version: "1"
      dependencies:
        - dbt-databricks>=1.8.0,<2.0.0
```
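For context, here is a minimal sketch of how this snippet might sit inside a full bundle resource file. The `resources`/`jobs` wrapper, the job key `dbt_sql_job` (borrowed from the example filename), and the `name` field are assumptions on my part; the task and environment definitions are the ones proposed above.

```yaml
# Hypothetical full resource file, e.g. resources/dbt_sql_job.yml.
# The job key and name are assumptions; the task and environment
# blocks are copied from the proposed configuration above.
resources:
  jobs:
    dbt_sql_job:
      name: dbt_sql_job
      tasks:
        - task_key: dbt
          environment_key: Default
          dbt_task:
            catalog: haley_source_ws
            schema: default
            project_directory: ../
            warehouse_id: 61b2ae8974016fcd
            commands:
              - 'dbt deps'
              - 'dbt seed'
              - 'dbt run'
      environments:
        - environment_key: Default
          spec:
            environment_version: "1"
            dependencies:
              - dbt-databricks>=1.8.0,<2.0.0
```

After `databricks bundle deploy`, the job should be runnable with `databricks bundle run dbt_sql_job` (assuming that resource key), with the dbt models executing against the specified SQL warehouse on serverless compute.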